Mar 17 00:22:04 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 17 00:22:04 crc restorecon[4745]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 17 00:22:04 crc restorecon[4745]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc 
restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc 
restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 
00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc 
restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 00:22:04 crc 
restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 
crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc 
restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 
crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc 
restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:04 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:04 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 00:22:05 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc 
restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 00:22:05 crc restorecon[4745]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 17 00:22:05 crc restorecon[4745]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 17 00:22:05 crc kubenswrapper[4755]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 00:22:05 crc kubenswrapper[4755]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 17 00:22:05 crc kubenswrapper[4755]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 00:22:05 crc kubenswrapper[4755]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 17 00:22:05 crc kubenswrapper[4755]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 17 00:22:05 crc kubenswrapper[4755]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.957381 4755 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.967993 4755 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968044 4755 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968057 4755 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968069 4755 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968079 4755 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968087 4755 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968097 4755 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968105 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968114 4755 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968123 4755 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968133 4755 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968142    4755 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968152    4755 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968160    4755 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968169    4755 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968177    4755 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968185    4755 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968192    4755 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968200    4755 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968208    4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968216    4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968224    4755 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968232    4755 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968240    4755 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968248    4755 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968256    4755 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968263    4755 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968271    4755 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968280    4755 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968288    4755 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968296    4755 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968304    4755 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968313    4755 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968334    4755 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968344    4755 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968353    4755 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968360    4755 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968369    4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968378    4755 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968386    4755 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968394    4755 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968402    4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968412    4755 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968422    4755 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968431    4755 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968464    4755 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968473    4755 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968481    4755 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968489    4755 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968497    4755 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968504    4755 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968512    4755 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968520    4755 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968528    4755 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968537    4755 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968545    4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968553    4755 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968561    4755 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968569    4755 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968576    4755 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968584    4755 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968592    4755 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968599    4755 feature_gate.go:330] unrecognized feature gate: Example
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968607    4755 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968614    4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968624    4755 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968632    4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968641    4755 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968649    4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968656    4755 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.968665    4755 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968810    4755 flags.go:64] FLAG: --address="0.0.0.0"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968828    4755 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968843    4755 flags.go:64] FLAG: --anonymous-auth="true"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968854    4755 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968865    4755 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968875    4755 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968887    4755 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968904    4755 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968913    4755 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968923    4755 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968932    4755 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968944    4755 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968954    4755 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968963    4755 flags.go:64] FLAG: --cgroup-root=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968972    4755 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968981    4755 flags.go:64] FLAG: --client-ca-file=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968990    4755 flags.go:64] FLAG: --cloud-config=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.968999    4755 flags.go:64] FLAG: --cloud-provider=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969008    4755 flags.go:64] FLAG: --cluster-dns="[]"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969019    4755 flags.go:64] FLAG: --cluster-domain=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969028    4755 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969037    4755 flags.go:64] FLAG: --config-dir=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969046    4755 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969056    4755 flags.go:64] FLAG: --container-log-max-files="5"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969068    4755 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969077    4755 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969086    4755 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969096    4755 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969105    4755 flags.go:64] FLAG: --contention-profiling="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969114    4755 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969123    4755 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969132    4755 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969142    4755 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969153    4755 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969162    4755 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969171    4755 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969180    4755 flags.go:64] FLAG: --enable-load-reader="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969189    4755 flags.go:64] FLAG: --enable-server="true"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969198    4755 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969210    4755 flags.go:64] FLAG: --event-burst="100"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969219    4755 flags.go:64] FLAG: --event-qps="50"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969228    4755 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969237    4755 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969247    4755 flags.go:64] FLAG: --eviction-hard=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969257    4755 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969267    4755 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969275    4755 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969288    4755 flags.go:64] FLAG: --eviction-soft=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969307    4755 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969316    4755 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969325    4755 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969333    4755 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969342    4755 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969353    4755 flags.go:64] FLAG: --fail-swap-on="true"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969362    4755 flags.go:64] FLAG: --feature-gates=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969381    4755 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969391    4755 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969400    4755 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969410    4755 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969419    4755 flags.go:64] FLAG: --healthz-port="10248"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969429    4755 flags.go:64] FLAG: --help="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969465    4755 flags.go:64] FLAG: --hostname-override=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969475    4755 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969485    4755 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969494    4755 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969503    4755 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969512    4755 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969521    4755 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969530    4755 flags.go:64] FLAG: --image-service-endpoint=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969539    4755 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969548    4755 flags.go:64] FLAG: --kube-api-burst="100"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969557    4755 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969567    4755 flags.go:64] FLAG: --kube-api-qps="50"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969576    4755 flags.go:64] FLAG: --kube-reserved=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969586    4755 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969595    4755 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969604    4755 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969613    4755 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969623    4755 flags.go:64] FLAG: --lock-file=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969632    4755 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969641    4755 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969651    4755 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969664    4755 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969674    4755 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969713    4755 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969722    4755 flags.go:64] FLAG: --logging-format="text"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969731    4755 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969741    4755 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969750    4755 flags.go:64] FLAG: --manifest-url=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969759    4755 flags.go:64] FLAG: --manifest-url-header=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969771    4755 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969780    4755 flags.go:64] FLAG: --max-open-files="1000000"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969791    4755 flags.go:64] FLAG: --max-pods="110"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969800    4755 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969809    4755 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969820    4755 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969829    4755 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969838    4755 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969847    4755 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969856    4755 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969876    4755 flags.go:64] FLAG: --node-status-max-images="50"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969885    4755 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969894    4755 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969904    4755 flags.go:64] FLAG: --pod-cidr=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969912    4755 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969925    4755 flags.go:64] FLAG: --pod-manifest-path=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969933    4755 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969943    4755 flags.go:64] FLAG: --pods-per-core="0"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969951    4755 flags.go:64] FLAG: --port="10250"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969960    4755 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969969    4755 flags.go:64] FLAG: --provider-id=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969979    4755 flags.go:64] FLAG: --qos-reserved=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969989    4755 flags.go:64] FLAG: --read-only-port="10255"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.969998    4755 flags.go:64] FLAG: --register-node="true"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970007    4755 flags.go:64] FLAG: --register-schedulable="true"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970016    4755 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970031    4755 flags.go:64] FLAG: --registry-burst="10"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970040    4755 flags.go:64] FLAG: --registry-qps="5"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970049    4755 flags.go:64] FLAG: --reserved-cpus=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970061    4755 flags.go:64] FLAG: --reserved-memory=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970072    4755 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970081    4755 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970092    4755 flags.go:64] FLAG: --rotate-certificates="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970101    4755 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970110    4755 flags.go:64] FLAG: --runonce="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970118    4755 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970128    4755 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970138    4755 flags.go:64] FLAG: --seccomp-default="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970147    4755 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970156    4755 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970165    4755 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970175    4755 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970184    4755 flags.go:64] FLAG: --storage-driver-password="root"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970193    4755 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970202    4755 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970210    4755 flags.go:64] FLAG: --storage-driver-user="root"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970219    4755 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970229    4755 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970238    4755 flags.go:64] FLAG: --system-cgroups=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970247    4755 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970262    4755 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970271    4755 flags.go:64] FLAG: --tls-cert-file=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970279    4755 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970291    4755 flags.go:64] FLAG: --tls-min-version=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970299    4755 flags.go:64] FLAG: --tls-private-key-file=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970308    4755 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970318    4755 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970333    4755 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970342    4755 flags.go:64] FLAG: --v="2"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970353    4755 flags.go:64] FLAG: --version="false"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970364    4755 flags.go:64] FLAG: --vmodule=""
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970375    4755 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.970384    4755 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970612    4755 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970624    4755 feature_gate.go:330] unrecognized feature gate: Example
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970634    4755 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970643    4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970653    4755 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970661    4755 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970669    4755 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970678    4755 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970686    4755 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970694    4755 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970702    4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970710    4755 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970718    4755 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970725    4755 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970733    4755 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970744    4755 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970753    4755 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970762    4755 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970771    4755 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970779    4755 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970788    4755 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970796    4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970805    4755 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970813    4755 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970821    4755 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970829    4755 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970840    4755 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970848    4755 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970858    4755 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970867    4755 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970876    4755 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970884    4755 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970892    4755 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970900    4755 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970907    4755 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970916    4755 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970924    4755 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970931    4755 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970940    4755 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970948    4755 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970958    4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970968    4755 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.970989    4755 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971002    4755 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971012    4755 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971022    4755 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971032    4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971040    4755 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971048    4755 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971056    4755 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971063    4755 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971072    4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971079    4755 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971087    4755 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971094    4755 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971102    4755 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971110    4755 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971118    4755 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971130    4755 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971138    4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971146    4755 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971154    4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971162    4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971172    4755 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971182    4755 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971191    4755 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971200    4755 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971208    4755 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971217    4755 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971224    4755 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.971234    4755 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.971260    4755 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.983643    4755 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.983697    4755 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.983848    4755 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.983872
4755 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.983882 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.983891 4755 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.983900 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.983909 4755 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.983917 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.983925 4755 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.983933 4755 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.983943 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.983951 4755 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.983959 4755 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.983971 4755 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.983982 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.983991 4755 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984002 4755 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984011 4755 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984020 4755 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984029 4755 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984037 4755 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984046 4755 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984054 4755 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984062 4755 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984070 4755 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984077 4755 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984085 4755 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984093 4755 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 
00:22:05.984100 4755 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984108 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984117 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984125 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984132 4755 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984143 4755 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984152 4755 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984161 4755 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984169 4755 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984177 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984184 4755 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984192 4755 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984202 4755 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984212 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984222 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984233 4755 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984243 4755 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984254 4755 feature_gate.go:330] unrecognized feature gate: Example Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984264 4755 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984273 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984284 4755 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984293 4755 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984303 4755 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984314 4755 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984324 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984334 4755 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984343 4755 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 17 00:22:05 crc 
kubenswrapper[4755]: W0317 00:22:05.984352 4755 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984360 4755 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984368 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984376 4755 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984383 4755 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984391 4755 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984399 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984407 4755 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984414 4755 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984421 4755 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984429 4755 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984481 4755 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984489 4755 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984497 4755 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984508 4755 feature_gate.go:353] Setting GA feature gate 
CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984518 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984528 4755 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.984541 4755 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984822 4755 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984835 4755 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984846 4755 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984857 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984865 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984874 4755 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984882 4755 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984890 4755 feature_gate.go:330] unrecognized feature gate: Example Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984898 4755 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984906 4755 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984917 4755 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984927 4755 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984937 4755 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984946 4755 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984954 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984962 4755 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984984 4755 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.984993 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985001 4755 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985009 4755 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985017 4755 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985024 4755 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985032 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985039 4755 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985047 4755 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985055 4755 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985062 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985070 4755 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985077 4755 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985086 4755 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985094 4755 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig 
Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985102 4755 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985109 4755 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985116 4755 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985124 4755 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985132 4755 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985139 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985147 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985154 4755 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985162 4755 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985169 4755 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985177 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985184 4755 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985194 4755 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985202 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985210 4755 feature_gate.go:330] unrecognized feature gate: 
MultiArchInstallAWS Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985217 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985226 4755 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985234 4755 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985242 4755 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985249 4755 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985257 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985265 4755 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985272 4755 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985280 4755 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985288 4755 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985295 4755 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985303 4755 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985310 4755 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985318 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 17 00:22:05 crc 
kubenswrapper[4755]: W0317 00:22:05.985326 4755 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985333 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985341 4755 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985349 4755 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985356 4755 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985371 4755 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985381 4755 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985390 4755 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985399 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985408 4755 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 17 00:22:05 crc kubenswrapper[4755]: W0317 00:22:05.985416 4755 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.985428 4755 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false 
UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.985739 4755 server.go:940] "Client rotation is on, will bootstrap in background" Mar 17 00:22:05 crc kubenswrapper[4755]: E0317 00:22:05.993101 4755 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.998202 4755 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 17 00:22:05 crc kubenswrapper[4755]: I0317 00:22:05.998332 4755 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.000754 4755 server.go:997] "Starting client certificate rotation" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.000808 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.001024 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.032315 4755 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 17 00:22:06 crc kubenswrapper[4755]: E0317 00:22:06.036026 4755 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 
00:22:06.036141 4755 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.051951 4755 log.go:25] "Validated CRI v1 runtime API" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.099839 4755 log.go:25] "Validated CRI v1 image API" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.102052 4755 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.107426 4755 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-17-00-17-15-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.107480 4755 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.136822 4755 manager.go:217] Machine: {Timestamp:2026-03-17 00:22:06.13433237 +0000 UTC m=+0.893784713 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:5f993bf0-a659-4d33-851e-45b2886560a8 BootID:1691cfa1-2188-4028-9d19-13bfee907928 Filesystems:[{Device:/dev/vda3 
DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:42:03:05 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:42:03:05 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:7f:05:07 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:c4:fc:ae Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:77:37:66 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:1c:e6:10 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:16:75:f4 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:36:87:14:72:08:27 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fe:7a:a7:53:75:68 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 
Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.137692 4755 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.138111 4755 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.138873 4755 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.139303 4755 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.139557 4755 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.140025 4755 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.140135 4755 container_manager_linux.go:303] "Creating device plugin manager" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.140761 4755 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.140939 4755 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.141245 4755 state_mem.go:36] "Initialized new in-memory state store" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.141499 4755 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.145673 4755 kubelet.go:418] "Attempting to sync node with API server" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.145804 4755 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.145942 4755 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.146033 4755 kubelet.go:324] "Adding apiserver pod source" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.146109 4755 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.152332 4755 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 17 00:22:06 crc kubenswrapper[4755]: W0317 00:22:06.153133 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 17 00:22:06 crc kubenswrapper[4755]: E0317 00:22:06.153354 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 17 00:22:06 crc kubenswrapper[4755]: W0317 00:22:06.153175 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.153903 4755 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 17 00:22:06 crc kubenswrapper[4755]: E0317 00:22:06.154149 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.155464 4755 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.157560 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.157618 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.157636 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.157653 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.157681 4755 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/nfs" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.157699 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.157717 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.157744 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.157774 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.157858 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.157904 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.157923 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.158971 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.159555 4755 server.go:1280] "Started kubelet" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.159734 4755 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 00:22:06 crc systemd[1]: Started Kubernetes Kubelet. 
Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.167900 4755 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.168564 4755 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.170425 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.172001 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.172088 4755 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.177458 4755 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.177529 4755 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.178719 4755 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 17 00:22:06 crc kubenswrapper[4755]: W0317 00:22:06.179506 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 17 00:22:06 crc kubenswrapper[4755]: E0317 00:22:06.179609 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 
38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.180277 4755 factory.go:55] Registering systemd factory Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.180478 4755 factory.go:221] Registration of the systemd container factory successfully Mar 17 00:22:06 crc kubenswrapper[4755]: E0317 00:22:06.180044 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.181289 4755 factory.go:153] Registering CRI-O factory Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.181347 4755 factory.go:221] Registration of the crio container factory successfully Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.181474 4755 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.181511 4755 factory.go:103] Registering Raw factory Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.181551 4755 manager.go:1196] Started watching for new ooms in manager Mar 17 00:22:06 crc kubenswrapper[4755]: E0317 00:22:06.181910 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="200ms" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.183319 4755 manager.go:319] Starting recovery of all containers Mar 17 00:22:06 crc kubenswrapper[4755]: E0317 00:22:06.182156 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.32:6443: connect: connection 
refused" event="&Event{ObjectMeta:{crc.189d790d456345f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.159513072 +0000 UTC m=+0.918965405,LastTimestamp:2026-03-17 00:22:06.159513072 +0000 UTC m=+0.918965405,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.187049 4755 server.go:460] "Adding debug handlers to kubelet server" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197041 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197497 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197520 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197538 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" 
seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197555 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197574 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197592 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197610 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197631 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197648 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 
00:22:06.197665 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197683 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197700 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197723 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197739 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197760 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197780 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197797 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197857 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197892 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197919 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197944 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197978 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.197998 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198016 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198035 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198061 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198081 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198100 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198120 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198188 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198215 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198234 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198251 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198270 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198288 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198307 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198325 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198345 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198381 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198409 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" 
seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198427 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198471 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198488 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198506 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198524 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198543 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198564 
4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198583 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198600 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198618 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198636 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198660 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198679 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198717 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198766 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198783 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198805 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198823 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198841 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198859 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198877 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198896 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198915 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198932 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198949 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198968 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.198986 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199003 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199024 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199041 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199060 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" 
Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199080 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199099 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199118 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199137 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199155 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199173 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199191 4755 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199210 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199229 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199247 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199267 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199286 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199304 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199322 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199339 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199357 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199408 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199429 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199532 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199553 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199574 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199592 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199611 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199631 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199649 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" 
seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199665 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199684 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199703 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199721 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199740 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199760 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc 
kubenswrapper[4755]: I0317 00:22:06.199777 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199805 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199825 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199846 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199866 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199886 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199904 4755 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199924 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199945 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199964 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.199983 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.200003 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.200031 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.200050 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.200069 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.200087 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.200105 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.200123 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.202894 4755 manager.go:324] Recovery completed Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203573 4755 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual 
state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203617 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203639 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203660 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203678 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203697 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203718 4755 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203735 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203753 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203772 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203789 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203806 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203823 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203840 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203856 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203874 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203891 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203908 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203926 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" 
seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203943 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203962 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.203992 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204011 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204029 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204046 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 17 00:22:06 crc 
kubenswrapper[4755]: I0317 00:22:06.204063 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204081 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204099 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204118 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204145 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204162 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204179 4755 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204198 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204215 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204234 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204271 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204287 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204305 4755 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204324 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204342 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204362 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204385 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204410 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204467 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204491 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204508 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204526 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204545 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204563 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204583 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204602 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204619 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204636 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204654 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204673 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204690 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" 
seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204709 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204728 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204746 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204765 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204783 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204801 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204819 4755 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204837 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204857 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204874 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204890 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204907 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204924 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204941 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204962 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204979 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.204997 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.205015 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.206538 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.206607 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.206928 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.206971 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.207023 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.207041 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.207058 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" 
seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.207083 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.207098 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.207120 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.207136 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.207151 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.207175 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 
00:22:06.207188 4755 reconstruct.go:97] "Volume reconstruction finished" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.207198 4755 reconciler.go:26] "Reconciler: start to sync state" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.214414 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.218356 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.218403 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.218415 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.219871 4755 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.219894 4755 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.219934 4755 state_mem.go:36] "Initialized new in-memory state store" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.243852 4755 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.246834 4755 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.246892 4755 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.246931 4755 kubelet.go:2335] "Starting kubelet main sync loop" Mar 17 00:22:06 crc kubenswrapper[4755]: E0317 00:22:06.246999 4755 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 00:22:06 crc kubenswrapper[4755]: W0317 00:22:06.248838 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 17 00:22:06 crc kubenswrapper[4755]: E0317 00:22:06.248920 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.251207 4755 policy_none.go:49] "None policy: Start" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.252189 4755 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.252211 4755 state_mem.go:35] "Initializing new in-memory state store" Mar 17 00:22:06 crc kubenswrapper[4755]: E0317 00:22:06.281354 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.308329 4755 manager.go:334] "Starting Device Plugin manager" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.308378 4755 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.308405 4755 server.go:79] "Starting device plugin registration server" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.309028 4755 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.309043 4755 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.309308 4755 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.309460 4755 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.309469 4755 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 00:22:06 crc kubenswrapper[4755]: E0317 00:22:06.317322 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.347892 4755 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.348039 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.349590 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.349626 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.349639 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.349788 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.350038 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.350070 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.350873 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.350896 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.350904 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.350983 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.351184 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.351256 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.351281 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.351303 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.351311 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.351524 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.351544 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.351552 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.351627 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.351759 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.351792 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.352500 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.352555 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.352580 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.352596 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.352604 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.352584 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.352622 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.352631 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.352580 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.352724 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:06 crc kubenswrapper[4755]: 
I0317 00:22:06.352815 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.352833 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.353959 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.353961 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.354030 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.354051 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.353997 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.354119 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.354315 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.354378 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.355468 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.355489 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.355498 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:06 crc kubenswrapper[4755]: E0317 00:22:06.383069 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="400ms" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.409205 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.409882 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.409977 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.410019 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.410053 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.410087 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.410150 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.410239 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 
00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.410313 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.410504 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.410541 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.410690 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.410720 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.410738 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.410770 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.411227 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.411267 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: E0317 00:22:06.411290 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.411339 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.411418 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.411498 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc 
kubenswrapper[4755]: I0317 00:22:06.512694 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.512754 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.512784 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.512805 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.512829 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.512846 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.512865 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.512885 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.512905 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.512925 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.512946 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:06 crc 
kubenswrapper[4755]: I0317 00:22:06.512934 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.512978 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.513019 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.513007 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.513076 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.512971 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.513117 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.513130 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.513124 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.513133 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.513174 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 
00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.513050 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.513156 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.513077 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.513087 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.513134 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.513219 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.513352 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.513563 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.612083 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.614151 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.614216 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.614235 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.614267 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 00:22:06 crc kubenswrapper[4755]: E0317 00:22:06.614893 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Mar 17 00:22:06 crc 
kubenswrapper[4755]: I0317 00:22:06.676258 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.692212 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.705606 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.721573 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: W0317 00:22:06.728396 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-48f3cdfcba3e7d849a28b6d0f54fffae3c4c8e852e2a3e1c3086d379a335973b WatchSource:0}: Error finding container 48f3cdfcba3e7d849a28b6d0f54fffae3c4c8e852e2a3e1c3086d379a335973b: Status 404 returned error can't find the container with id 48f3cdfcba3e7d849a28b6d0f54fffae3c4c8e852e2a3e1c3086d379a335973b Mar 17 00:22:06 crc kubenswrapper[4755]: W0317 00:22:06.730654 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-d303ee8f189dcdaf84000a021aa9abefeeed4322954166e26b3b160321d136e7 WatchSource:0}: Error finding container d303ee8f189dcdaf84000a021aa9abefeeed4322954166e26b3b160321d136e7: Status 404 returned error can't find the container with id d303ee8f189dcdaf84000a021aa9abefeeed4322954166e26b3b160321d136e7 Mar 17 00:22:06 crc kubenswrapper[4755]: I0317 00:22:06.732787 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 00:22:06 crc kubenswrapper[4755]: W0317 00:22:06.735658 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-e3f900a365d35478c3065277446bcd3cf0619b3f52db8d83ed988424cf9cf130 WatchSource:0}: Error finding container e3f900a365d35478c3065277446bcd3cf0619b3f52db8d83ed988424cf9cf130: Status 404 returned error can't find the container with id e3f900a365d35478c3065277446bcd3cf0619b3f52db8d83ed988424cf9cf130 Mar 17 00:22:06 crc kubenswrapper[4755]: W0317 00:22:06.753692 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-856e7006e49e2b29d802ff15456fe9f81ec44c770419e8a31e6c3c82e52078ec WatchSource:0}: Error finding container 856e7006e49e2b29d802ff15456fe9f81ec44c770419e8a31e6c3c82e52078ec: Status 404 returned error can't find the container with id 856e7006e49e2b29d802ff15456fe9f81ec44c770419e8a31e6c3c82e52078ec Mar 17 00:22:06 crc kubenswrapper[4755]: W0317 00:22:06.755220 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a3b298ab25d78069590af654aaea5ba3d9729a0977e8c1937e4d68d0f95fe4d4 WatchSource:0}: Error finding container a3b298ab25d78069590af654aaea5ba3d9729a0977e8c1937e4d68d0f95fe4d4: Status 404 returned error can't find the container with id a3b298ab25d78069590af654aaea5ba3d9729a0977e8c1937e4d68d0f95fe4d4 Mar 17 00:22:06 crc kubenswrapper[4755]: E0317 00:22:06.783833 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection 
refused" interval="800ms" Mar 17 00:22:07 crc kubenswrapper[4755]: I0317 00:22:07.015997 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:07 crc kubenswrapper[4755]: I0317 00:22:07.017754 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:07 crc kubenswrapper[4755]: I0317 00:22:07.017812 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:07 crc kubenswrapper[4755]: I0317 00:22:07.017831 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:07 crc kubenswrapper[4755]: I0317 00:22:07.017868 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 00:22:07 crc kubenswrapper[4755]: E0317 00:22:07.018464 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Mar 17 00:22:07 crc kubenswrapper[4755]: W0317 00:22:07.088740 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 17 00:22:07 crc kubenswrapper[4755]: E0317 00:22:07.088849 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 17 00:22:07 crc kubenswrapper[4755]: W0317 00:22:07.142542 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 17 00:22:07 crc kubenswrapper[4755]: E0317 00:22:07.142631 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 17 00:22:07 crc kubenswrapper[4755]: I0317 00:22:07.172218 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 17 00:22:07 crc kubenswrapper[4755]: I0317 00:22:07.254324 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d303ee8f189dcdaf84000a021aa9abefeeed4322954166e26b3b160321d136e7"} Mar 17 00:22:07 crc kubenswrapper[4755]: I0317 00:22:07.255661 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"48f3cdfcba3e7d849a28b6d0f54fffae3c4c8e852e2a3e1c3086d379a335973b"} Mar 17 00:22:07 crc kubenswrapper[4755]: I0317 00:22:07.257081 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"856e7006e49e2b29d802ff15456fe9f81ec44c770419e8a31e6c3c82e52078ec"} Mar 17 00:22:07 crc kubenswrapper[4755]: I0317 00:22:07.258561 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a3b298ab25d78069590af654aaea5ba3d9729a0977e8c1937e4d68d0f95fe4d4"} Mar 17 00:22:07 crc kubenswrapper[4755]: I0317 00:22:07.260036 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e3f900a365d35478c3065277446bcd3cf0619b3f52db8d83ed988424cf9cf130"} Mar 17 00:22:07 crc kubenswrapper[4755]: W0317 00:22:07.582393 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 17 00:22:07 crc kubenswrapper[4755]: E0317 00:22:07.582502 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 17 00:22:07 crc kubenswrapper[4755]: E0317 00:22:07.585038 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="1.6s" Mar 17 00:22:07 crc kubenswrapper[4755]: W0317 00:22:07.778156 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 17 00:22:07 crc kubenswrapper[4755]: E0317 00:22:07.778232 4755 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 17 00:22:07 crc kubenswrapper[4755]: I0317 00:22:07.819583 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:07 crc kubenswrapper[4755]: I0317 00:22:07.821026 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:07 crc kubenswrapper[4755]: I0317 00:22:07.821058 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:07 crc kubenswrapper[4755]: I0317 00:22:07.821067 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:07 crc kubenswrapper[4755]: I0317 00:22:07.821088 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 00:22:07 crc kubenswrapper[4755]: E0317 00:22:07.821476 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.172245 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.212750 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 17 00:22:08 crc kubenswrapper[4755]: E0317 00:22:08.213984 4755 certificate_manager.go:562] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.264051 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"976fbb4e200913ee493f78d4c7f9bfbfbf9bbe14c5d7c7db73d6189a727907c5"} Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.264102 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"49d422659d1b99f78b7721ab1b1e41b8486b2b951987139a77ff415e1249c051"} Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.266094 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765" exitCode=0 Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.266211 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765"} Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.266222 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.267273 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.267313 4755 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.267326 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.268780 4755 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9db026a8dabf0e352d57c8a7e79cb9e2b7c64691ad28f03e09d71246232ca71a" exitCode=0 Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.268856 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9db026a8dabf0e352d57c8a7e79cb9e2b7c64691ad28f03e09d71246232ca71a"} Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.269020 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.270309 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.270719 4755 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="24d20bee13b902350cdc62c56485fad8db92f3b10c09021faee70db9a6c63ff4" exitCode=0 Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.270810 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"24d20bee13b902350cdc62c56485fad8db92f3b10c09021faee70db9a6c63ff4"} Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.270929 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.271693 4755 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.271720 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.271731 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.272127 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.272153 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.272162 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.272426 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.272535 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.272556 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.274151 4755 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499" exitCode=0 Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.274183 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499"} Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.274322 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.276058 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.276342 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:08 crc kubenswrapper[4755]: I0317 00:22:08.276859 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.171605 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 17 00:22:09 crc kubenswrapper[4755]: E0317 00:22:09.186616 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="3.2s" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.280519 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8d644f6e95a9b7d33090c3b9754daea0bc31480d1342727a4a9e628064efcb55"} Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.280559 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"82466c34a485d12f48aec22116d1906d7b482ad52bd75bb7b732ae6a05be3117"} Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.280568 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a4a2bd1740ec33c208460149e12831eb8f9c548b51efbc6815ae993666a27407"} Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.280645 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.282607 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.282650 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.282663 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.285163 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f68d7ed925243ceb0af951708856ef280499b00bdbc36bb06538ea108cfcf275"} Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.285198 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f8ea0f6e604f5733af968819d4dd4cf07937d5e70641be8b435fb9c1b716fe08"} Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.285265 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 
00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.286074 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.286101 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.286111 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.288774 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3"} Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.288828 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8"} Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.288842 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938"} Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.288853 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d"} Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.290112 4755 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="c8a2f6956799d86ec9616249de2a416cedaed74be4919c090e1fa6d9046c3728" exitCode=0 Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.290157 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c8a2f6956799d86ec9616249de2a416cedaed74be4919c090e1fa6d9046c3728"} Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.290250 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.290988 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.291034 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.291047 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.292017 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ed7aa4afa68bf484cfa3735c10164f24ee92eae2f1984cdc04d89b0efcacb8da"} Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.292070 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.293192 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.293221 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.293233 4755 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:09 crc kubenswrapper[4755]: W0317 00:22:09.384864 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 17 00:22:09 crc kubenswrapper[4755]: E0317 00:22:09.384950 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.422324 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.424155 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.424197 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.424209 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:09 crc kubenswrapper[4755]: I0317 00:22:09.424232 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 00:22:09 crc kubenswrapper[4755]: E0317 00:22:09.424618 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Mar 17 00:22:09 crc kubenswrapper[4755]: W0317 00:22:09.485740 4755 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 17 00:22:09 crc kubenswrapper[4755]: E0317 00:22:09.485824 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.118742 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.299349 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8f925d82d4b84b0bbcfb6d1fbeb87e5671d7ccacc51e668e007a512bc1bfce1b"} Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.299501 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.300868 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.300924 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.300944 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.302286 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2df91ba3d2151f0887818a15ed14f68492cf94f475150c49a973ab1050f3cfa2" exitCode=0 Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.302424 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.302518 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2df91ba3d2151f0887818a15ed14f68492cf94f475150c49a973ab1050f3cfa2"} Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.302541 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.302683 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.303076 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.304322 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.304373 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.304372 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.304323 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.304497 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 
00:22:10.304529 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.304394 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.304556 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.304590 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.304608 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.304418 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.304664 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.569536 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:22:10 crc kubenswrapper[4755]: I0317 00:22:10.586934 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:22:11 crc kubenswrapper[4755]: I0317 00:22:11.312387 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8ccfecddb2955e5f667d3eba46fb658c490c41d1038f52534f22e8aca4871480"} Mar 17 00:22:11 crc kubenswrapper[4755]: I0317 00:22:11.312429 4755 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 17 00:22:11 crc kubenswrapper[4755]: I0317 00:22:11.312478 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d565c137a33584ca4127581b1edc3147b9757c032cf06086383190173ba092b4"} Mar 17 00:22:11 crc kubenswrapper[4755]: I0317 00:22:11.312503 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c12b5e8b3dc347453ebef5fdab79b279823f964dee34f81a8a5f33b514eb868d"} Mar 17 00:22:11 crc kubenswrapper[4755]: I0317 00:22:11.312505 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:11 crc kubenswrapper[4755]: I0317 00:22:11.312582 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:11 crc kubenswrapper[4755]: I0317 00:22:11.312612 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:11 crc kubenswrapper[4755]: I0317 00:22:11.314367 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:11 crc kubenswrapper[4755]: I0317 00:22:11.314395 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:11 crc kubenswrapper[4755]: I0317 00:22:11.314411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:11 crc kubenswrapper[4755]: I0317 00:22:11.314426 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:11 crc kubenswrapper[4755]: I0317 00:22:11.314430 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 17 00:22:11 crc kubenswrapper[4755]: I0317 00:22:11.314466 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:11 crc kubenswrapper[4755]: I0317 00:22:11.314572 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:11 crc kubenswrapper[4755]: I0317 00:22:11.314880 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:11 crc kubenswrapper[4755]: I0317 00:22:11.314917 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.120962 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.242514 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.321322 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"026697ad55991fcdcd8a0ee7c5849d143ee84cfe366565e245c949ef880e67ea"} Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.321397 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7dfd447c01eea9dfa18b341bf1226709b0ee496a4b5b5584f2997f76741d05cb"} Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.321494 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.321570 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 
00:22:12.321628 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.321575 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.323378 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.323457 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.323471 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.323478 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.323378 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.323534 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.323554 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.323517 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.323500 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.397563 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.625243 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.626344 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.626370 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.626379 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:12 crc kubenswrapper[4755]: I0317 00:22:12.626397 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 00:22:13 crc kubenswrapper[4755]: I0317 00:22:13.324228 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:13 crc kubenswrapper[4755]: I0317 00:22:13.324297 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:13 crc kubenswrapper[4755]: I0317 00:22:13.325330 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:13 crc kubenswrapper[4755]: I0317 00:22:13.325833 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:13 crc kubenswrapper[4755]: I0317 00:22:13.325893 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:13 crc kubenswrapper[4755]: I0317 00:22:13.325917 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:13 crc kubenswrapper[4755]: I0317 00:22:13.326167 4755 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:13 crc kubenswrapper[4755]: I0317 00:22:13.326212 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:13 crc kubenswrapper[4755]: I0317 00:22:13.326234 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:13 crc kubenswrapper[4755]: I0317 00:22:13.326862 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:13 crc kubenswrapper[4755]: I0317 00:22:13.326929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:13 crc kubenswrapper[4755]: I0317 00:22:13.326945 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:13 crc kubenswrapper[4755]: I0317 00:22:13.440340 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:22:13 crc kubenswrapper[4755]: I0317 00:22:13.737148 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:14 crc kubenswrapper[4755]: I0317 00:22:14.326827 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:14 crc kubenswrapper[4755]: I0317 00:22:14.326827 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:14 crc kubenswrapper[4755]: I0317 00:22:14.328891 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:14 crc kubenswrapper[4755]: I0317 00:22:14.328936 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 00:22:14 crc kubenswrapper[4755]: I0317 00:22:14.328955 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:14 crc kubenswrapper[4755]: I0317 00:22:14.328991 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:14 crc kubenswrapper[4755]: I0317 00:22:14.329034 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:14 crc kubenswrapper[4755]: I0317 00:22:14.329059 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:15 crc kubenswrapper[4755]: I0317 00:22:15.161286 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 17 00:22:15 crc kubenswrapper[4755]: I0317 00:22:15.161600 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:15 crc kubenswrapper[4755]: I0317 00:22:15.163175 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:15 crc kubenswrapper[4755]: I0317 00:22:15.163221 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:15 crc kubenswrapper[4755]: I0317 00:22:15.163242 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:16 crc kubenswrapper[4755]: E0317 00:22:16.317546 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 00:22:16 crc kubenswrapper[4755]: I0317 00:22:16.441307 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure 
output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 00:22:16 crc kubenswrapper[4755]: I0317 00:22:16.441407 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 00:22:18 crc kubenswrapper[4755]: I0317 00:22:18.287521 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:22:18 crc kubenswrapper[4755]: I0317 00:22:18.287865 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:18 crc kubenswrapper[4755]: I0317 00:22:18.289305 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:18 crc kubenswrapper[4755]: I0317 00:22:18.289336 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:18 crc kubenswrapper[4755]: I0317 00:22:18.289345 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:18 crc kubenswrapper[4755]: I0317 00:22:18.295315 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:22:18 crc kubenswrapper[4755]: I0317 00:22:18.338532 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:18 crc kubenswrapper[4755]: I0317 00:22:18.339843 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 
00:22:18 crc kubenswrapper[4755]: I0317 00:22:18.339925 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:18 crc kubenswrapper[4755]: I0317 00:22:18.339954 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:19 crc kubenswrapper[4755]: I0317 00:22:19.148732 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 17 00:22:19 crc kubenswrapper[4755]: I0317 00:22:19.149014 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:19 crc kubenswrapper[4755]: I0317 00:22:19.150580 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:19 crc kubenswrapper[4755]: I0317 00:22:19.150676 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:19 crc kubenswrapper[4755]: I0317 00:22:19.150712 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:19 crc kubenswrapper[4755]: I0317 00:22:19.656722 4755 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 17 00:22:19 crc kubenswrapper[4755]: I0317 00:22:19.656813 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 17 00:22:20 crc kubenswrapper[4755]: W0317 
00:22:20.025705 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 17 00:22:20 crc kubenswrapper[4755]: I0317 00:22:20.025843 4755 trace.go:236] Trace[318646210]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Mar-2026 00:22:10.023) (total time: 10002ms): Mar 17 00:22:20 crc kubenswrapper[4755]: Trace[318646210]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:22:20.025) Mar 17 00:22:20 crc kubenswrapper[4755]: Trace[318646210]: [10.002110338s] [10.002110338s] END Mar 17 00:22:20 crc kubenswrapper[4755]: E0317 00:22:20.025876 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 17 00:22:20 crc kubenswrapper[4755]: I0317 00:22:20.171780 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 17 00:22:20 crc kubenswrapper[4755]: W0317 00:22:20.397752 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:20Z is after 2026-02-23T05:33:13Z Mar 17 00:22:20 crc kubenswrapper[4755]: E0317 00:22:20.397876 4755 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 00:22:20 crc kubenswrapper[4755]: E0317 00:22:20.415543 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:20Z is after 2026-02-23T05:33:13Z" node="crc" Mar 17 00:22:20 crc kubenswrapper[4755]: W0317 00:22:20.422097 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:20Z is after 2026-02-23T05:33:13Z Mar 17 00:22:20 crc kubenswrapper[4755]: E0317 00:22:20.422244 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 00:22:20 crc kubenswrapper[4755]: E0317 00:22:20.426932 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-17T00:22:20Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189d790d456345f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.159513072 +0000 UTC m=+0.918965405,LastTimestamp:2026-03-17 00:22:06.159513072 +0000 UTC m=+0.918965405,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:22:20 crc kubenswrapper[4755]: I0317 00:22:20.432190 4755 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 17 00:22:20 crc kubenswrapper[4755]: I0317 00:22:20.432280 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 17 00:22:20 crc kubenswrapper[4755]: W0317 00:22:20.434810 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:20Z is after 2026-02-23T05:33:13Z Mar 17 00:22:20 crc kubenswrapper[4755]: E0317 00:22:20.434978 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to 
watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 00:22:20 crc kubenswrapper[4755]: E0317 00:22:20.435410 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:20Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 17 00:22:20 crc kubenswrapper[4755]: E0317 00:22:20.439278 4755 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 00:22:20 crc kubenswrapper[4755]: I0317 00:22:20.444020 4755 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 17 00:22:20 crc kubenswrapper[4755]: I0317 00:22:20.444083 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 17 00:22:21 crc kubenswrapper[4755]: I0317 00:22:21.176193 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:21Z is after 2026-02-23T05:33:13Z Mar 17 00:22:21 crc kubenswrapper[4755]: I0317 00:22:21.348432 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 17 00:22:21 crc kubenswrapper[4755]: I0317 00:22:21.351532 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8f925d82d4b84b0bbcfb6d1fbeb87e5671d7ccacc51e668e007a512bc1bfce1b" exitCode=255 Mar 17 00:22:21 crc kubenswrapper[4755]: I0317 00:22:21.351577 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8f925d82d4b84b0bbcfb6d1fbeb87e5671d7ccacc51e668e007a512bc1bfce1b"} Mar 17 00:22:21 crc kubenswrapper[4755]: I0317 00:22:21.351840 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:21 crc kubenswrapper[4755]: I0317 00:22:21.353240 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:21 crc kubenswrapper[4755]: I0317 00:22:21.353315 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:21 crc kubenswrapper[4755]: I0317 00:22:21.353340 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 
00:22:21 crc kubenswrapper[4755]: I0317 00:22:21.354380 4755 scope.go:117] "RemoveContainer" containerID="8f925d82d4b84b0bbcfb6d1fbeb87e5671d7ccacc51e668e007a512bc1bfce1b" Mar 17 00:22:22 crc kubenswrapper[4755]: I0317 00:22:22.129625 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:22 crc kubenswrapper[4755]: I0317 00:22:22.176226 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:22Z is after 2026-02-23T05:33:13Z Mar 17 00:22:22 crc kubenswrapper[4755]: I0317 00:22:22.358199 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 17 00:22:22 crc kubenswrapper[4755]: I0317 00:22:22.359368 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 17 00:22:22 crc kubenswrapper[4755]: I0317 00:22:22.362746 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="44a13151c24edfc60c4b745f1e880034bd1217fb040684df582bae6b57c0c2a3" exitCode=255 Mar 17 00:22:22 crc kubenswrapper[4755]: I0317 00:22:22.362823 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"44a13151c24edfc60c4b745f1e880034bd1217fb040684df582bae6b57c0c2a3"} Mar 17 00:22:22 crc kubenswrapper[4755]: I0317 00:22:22.362921 4755 scope.go:117] "RemoveContainer" containerID="8f925d82d4b84b0bbcfb6d1fbeb87e5671d7ccacc51e668e007a512bc1bfce1b" Mar 17 
00:22:22 crc kubenswrapper[4755]: I0317 00:22:22.362947 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:22 crc kubenswrapper[4755]: I0317 00:22:22.364606 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:22 crc kubenswrapper[4755]: I0317 00:22:22.364677 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:22 crc kubenswrapper[4755]: I0317 00:22:22.364701 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:22 crc kubenswrapper[4755]: I0317 00:22:22.365716 4755 scope.go:117] "RemoveContainer" containerID="44a13151c24edfc60c4b745f1e880034bd1217fb040684df582bae6b57c0c2a3" Mar 17 00:22:22 crc kubenswrapper[4755]: E0317 00:22:22.366083 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:22:22 crc kubenswrapper[4755]: I0317 00:22:22.370567 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:23 crc kubenswrapper[4755]: I0317 00:22:23.176662 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:23Z is after 2026-02-23T05:33:13Z Mar 17 00:22:23 crc kubenswrapper[4755]: I0317 00:22:23.320623 4755 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:23 crc kubenswrapper[4755]: I0317 00:22:23.368349 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 17 00:22:23 crc kubenswrapper[4755]: I0317 00:22:23.370992 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:23 crc kubenswrapper[4755]: I0317 00:22:23.372132 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:23 crc kubenswrapper[4755]: I0317 00:22:23.372186 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:23 crc kubenswrapper[4755]: I0317 00:22:23.372203 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:23 crc kubenswrapper[4755]: I0317 00:22:23.373189 4755 scope.go:117] "RemoveContainer" containerID="44a13151c24edfc60c4b745f1e880034bd1217fb040684df582bae6b57c0c2a3" Mar 17 00:22:23 crc kubenswrapper[4755]: E0317 00:22:23.373520 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:22:23 crc kubenswrapper[4755]: W0317 00:22:23.937233 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-17T00:22:23Z is after 2026-02-23T05:33:13Z Mar 17 00:22:23 crc kubenswrapper[4755]: E0317 00:22:23.937340 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 00:22:24 crc kubenswrapper[4755]: I0317 00:22:24.176185 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:24Z is after 2026-02-23T05:33:13Z Mar 17 00:22:24 crc kubenswrapper[4755]: I0317 00:22:24.373610 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:24 crc kubenswrapper[4755]: I0317 00:22:24.375570 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:24 crc kubenswrapper[4755]: I0317 00:22:24.375622 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:24 crc kubenswrapper[4755]: I0317 00:22:24.375639 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:24 crc kubenswrapper[4755]: I0317 00:22:24.376489 4755 scope.go:117] "RemoveContainer" containerID="44a13151c24edfc60c4b745f1e880034bd1217fb040684df582bae6b57c0c2a3" Mar 17 00:22:24 crc kubenswrapper[4755]: E0317 00:22:24.376766 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:22:25 crc kubenswrapper[4755]: I0317 00:22:25.176623 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:25Z is after 2026-02-23T05:33:13Z Mar 17 00:22:25 crc kubenswrapper[4755]: W0317 00:22:25.933796 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:25Z is after 2026-02-23T05:33:13Z Mar 17 00:22:25 crc kubenswrapper[4755]: E0317 00:22:25.933903 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:25Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 00:22:26 crc kubenswrapper[4755]: I0317 00:22:26.176052 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:26Z is after 
2026-02-23T05:33:13Z Mar 17 00:22:26 crc kubenswrapper[4755]: E0317 00:22:26.320939 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 00:22:26 crc kubenswrapper[4755]: I0317 00:22:26.441602 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 00:22:26 crc kubenswrapper[4755]: I0317 00:22:26.441760 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 17 00:22:26 crc kubenswrapper[4755]: I0317 00:22:26.815757 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:26 crc kubenswrapper[4755]: I0317 00:22:26.817421 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:26 crc kubenswrapper[4755]: I0317 00:22:26.817507 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:26 crc kubenswrapper[4755]: I0317 00:22:26.817526 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:26 crc kubenswrapper[4755]: I0317 00:22:26.817559 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 00:22:26 crc kubenswrapper[4755]: E0317 00:22:26.820345 4755 kubelet_node_status.go:99] 
"Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:26Z is after 2026-02-23T05:33:13Z" node="crc" Mar 17 00:22:26 crc kubenswrapper[4755]: E0317 00:22:26.839301 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:26Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 17 00:22:27 crc kubenswrapper[4755]: I0317 00:22:27.177122 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:27Z is after 2026-02-23T05:33:13Z Mar 17 00:22:27 crc kubenswrapper[4755]: W0317 00:22:27.848665 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:27Z is after 2026-02-23T05:33:13Z Mar 17 00:22:27 crc kubenswrapper[4755]: E0317 00:22:27.848778 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:27Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 17 00:22:28 crc kubenswrapper[4755]: I0317 00:22:28.176529 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:28Z is after 2026-02-23T05:33:13Z Mar 17 00:22:28 crc kubenswrapper[4755]: I0317 00:22:28.655071 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 17 00:22:28 crc kubenswrapper[4755]: E0317 00:22:28.660669 4755 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 00:22:29 crc kubenswrapper[4755]: I0317 00:22:29.176090 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:29Z is after 2026-02-23T05:33:13Z Mar 17 00:22:29 crc kubenswrapper[4755]: I0317 00:22:29.252181 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 17 00:22:29 crc kubenswrapper[4755]: I0317 00:22:29.252521 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:29 crc kubenswrapper[4755]: I0317 00:22:29.254504 4755 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:29 crc kubenswrapper[4755]: I0317 00:22:29.254594 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:29 crc kubenswrapper[4755]: I0317 00:22:29.254613 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:29 crc kubenswrapper[4755]: I0317 00:22:29.272464 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 17 00:22:29 crc kubenswrapper[4755]: I0317 00:22:29.387727 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:29 crc kubenswrapper[4755]: I0317 00:22:29.388903 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:29 crc kubenswrapper[4755]: I0317 00:22:29.389090 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:29 crc kubenswrapper[4755]: I0317 00:22:29.389121 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:29 crc kubenswrapper[4755]: I0317 00:22:29.656272 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:29 crc kubenswrapper[4755]: I0317 00:22:29.656588 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:29 crc kubenswrapper[4755]: I0317 00:22:29.658114 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:29 crc kubenswrapper[4755]: I0317 00:22:29.658169 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:29 crc kubenswrapper[4755]: I0317 00:22:29.658191 4755 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:29 crc kubenswrapper[4755]: I0317 00:22:29.659082 4755 scope.go:117] "RemoveContainer" containerID="44a13151c24edfc60c4b745f1e880034bd1217fb040684df582bae6b57c0c2a3" Mar 17 00:22:29 crc kubenswrapper[4755]: E0317 00:22:29.659400 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:22:30 crc kubenswrapper[4755]: I0317 00:22:30.177845 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:30Z is after 2026-02-23T05:33:13Z Mar 17 00:22:30 crc kubenswrapper[4755]: E0317 00:22:30.433225 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:30Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189d790d456345f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.159513072 +0000 UTC m=+0.918965405,LastTimestamp:2026-03-17 00:22:06.159513072 +0000 UTC m=+0.918965405,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:22:31 crc kubenswrapper[4755]: I0317 00:22:31.175663 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:31Z is after 2026-02-23T05:33:13Z Mar 17 00:22:31 crc kubenswrapper[4755]: W0317 00:22:31.342727 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:31Z is after 2026-02-23T05:33:13Z Mar 17 00:22:31 crc kubenswrapper[4755]: E0317 00:22:31.342807 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:31Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 00:22:32 crc kubenswrapper[4755]: I0317 00:22:32.175818 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:32Z is after 2026-02-23T05:33:13Z Mar 17 00:22:32 crc kubenswrapper[4755]: W0317 00:22:32.635118 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:32Z is after 2026-02-23T05:33:13Z Mar 17 00:22:32 crc kubenswrapper[4755]: E0317 00:22:32.635227 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 00:22:33 crc kubenswrapper[4755]: I0317 00:22:33.177397 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:33Z is after 2026-02-23T05:33:13Z Mar 17 00:22:33 crc kubenswrapper[4755]: I0317 00:22:33.821068 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:33 crc kubenswrapper[4755]: I0317 00:22:33.822655 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:33 crc kubenswrapper[4755]: I0317 00:22:33.822724 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:33 crc kubenswrapper[4755]: I0317 00:22:33.822749 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:33 crc kubenswrapper[4755]: I0317 00:22:33.822792 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 00:22:33 crc 
kubenswrapper[4755]: E0317 00:22:33.828548 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:33Z is after 2026-02-23T05:33:13Z" node="crc" Mar 17 00:22:33 crc kubenswrapper[4755]: E0317 00:22:33.844859 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:33Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 17 00:22:34 crc kubenswrapper[4755]: I0317 00:22:34.176245 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:34Z is after 2026-02-23T05:33:13Z Mar 17 00:22:34 crc kubenswrapper[4755]: W0317 00:22:34.556260 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:34Z is after 2026-02-23T05:33:13Z Mar 17 00:22:34 crc kubenswrapper[4755]: E0317 00:22:34.556360 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-17T00:22:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 00:22:35 crc kubenswrapper[4755]: I0317 00:22:35.175406 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:35Z is after 2026-02-23T05:33:13Z Mar 17 00:22:36 crc kubenswrapper[4755]: I0317 00:22:36.175137 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:36Z is after 2026-02-23T05:33:13Z Mar 17 00:22:36 crc kubenswrapper[4755]: E0317 00:22:36.321060 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 00:22:36 crc kubenswrapper[4755]: I0317 00:22:36.441781 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 00:22:36 crc kubenswrapper[4755]: I0317 00:22:36.441873 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 17 00:22:36 crc kubenswrapper[4755]: I0317 00:22:36.441962 
4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:22:36 crc kubenswrapper[4755]: I0317 00:22:36.442186 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:36 crc kubenswrapper[4755]: I0317 00:22:36.443782 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:36 crc kubenswrapper[4755]: I0317 00:22:36.443840 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:36 crc kubenswrapper[4755]: I0317 00:22:36.443866 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:36 crc kubenswrapper[4755]: I0317 00:22:36.444639 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"976fbb4e200913ee493f78d4c7f9bfbfbf9bbe14c5d7c7db73d6189a727907c5"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 17 00:22:36 crc kubenswrapper[4755]: I0317 00:22:36.444965 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://976fbb4e200913ee493f78d4c7f9bfbfbf9bbe14c5d7c7db73d6189a727907c5" gracePeriod=30 Mar 17 00:22:37 crc kubenswrapper[4755]: I0317 00:22:37.176258 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-17T00:22:37Z is after 2026-02-23T05:33:13Z Mar 17 00:22:37 crc kubenswrapper[4755]: I0317 00:22:37.412088 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 17 00:22:37 crc kubenswrapper[4755]: I0317 00:22:37.413161 4755 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="976fbb4e200913ee493f78d4c7f9bfbfbf9bbe14c5d7c7db73d6189a727907c5" exitCode=255 Mar 17 00:22:37 crc kubenswrapper[4755]: I0317 00:22:37.413266 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"976fbb4e200913ee493f78d4c7f9bfbfbf9bbe14c5d7c7db73d6189a727907c5"} Mar 17 00:22:37 crc kubenswrapper[4755]: I0317 00:22:37.413350 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d41a3434c5d25d0722cada8bd205ad0d258450e5bd95b0e2ba48d47fafca5f7c"} Mar 17 00:22:37 crc kubenswrapper[4755]: I0317 00:22:37.413555 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:37 crc kubenswrapper[4755]: I0317 00:22:37.414831 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:37 crc kubenswrapper[4755]: I0317 00:22:37.414900 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:37 crc kubenswrapper[4755]: I0317 00:22:37.414919 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:38 crc kubenswrapper[4755]: I0317 00:22:38.174632 4755 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:38Z is after 2026-02-23T05:33:13Z Mar 17 00:22:38 crc kubenswrapper[4755]: I0317 00:22:38.286741 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:22:38 crc kubenswrapper[4755]: I0317 00:22:38.416080 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:38 crc kubenswrapper[4755]: I0317 00:22:38.417249 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:38 crc kubenswrapper[4755]: I0317 00:22:38.417280 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:38 crc kubenswrapper[4755]: I0317 00:22:38.417288 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:39 crc kubenswrapper[4755]: I0317 00:22:39.176738 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:39Z is after 2026-02-23T05:33:13Z Mar 17 00:22:40 crc kubenswrapper[4755]: I0317 00:22:40.175757 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:40Z is after 2026-02-23T05:33:13Z Mar 17 
00:22:40 crc kubenswrapper[4755]: E0317 00:22:40.439857 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:40Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189d790d456345f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.159513072 +0000 UTC m=+0.918965405,LastTimestamp:2026-03-17 00:22:06.159513072 +0000 UTC m=+0.918965405,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:22:40 crc kubenswrapper[4755]: I0317 00:22:40.829577 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:40 crc kubenswrapper[4755]: I0317 00:22:40.831762 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:40 crc kubenswrapper[4755]: I0317 00:22:40.831843 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:40 crc kubenswrapper[4755]: I0317 00:22:40.831861 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:40 crc kubenswrapper[4755]: I0317 00:22:40.831904 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 00:22:40 crc kubenswrapper[4755]: E0317 00:22:40.837662 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:40Z is after 2026-02-23T05:33:13Z" node="crc" Mar 17 00:22:40 crc kubenswrapper[4755]: E0317 00:22:40.851173 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:40Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 17 00:22:40 crc kubenswrapper[4755]: W0317 00:22:40.927793 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:40Z is after 2026-02-23T05:33:13Z Mar 17 00:22:40 crc kubenswrapper[4755]: E0317 00:22:40.927918 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 00:22:41 crc kubenswrapper[4755]: I0317 00:22:41.176340 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:41Z is after 2026-02-23T05:33:13Z Mar 17 00:22:42 crc kubenswrapper[4755]: I0317 00:22:42.175995 4755 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:42Z is after 2026-02-23T05:33:13Z Mar 17 00:22:43 crc kubenswrapper[4755]: I0317 00:22:43.174632 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:43Z is after 2026-02-23T05:33:13Z Mar 17 00:22:43 crc kubenswrapper[4755]: I0317 00:22:43.440980 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:22:43 crc kubenswrapper[4755]: I0317 00:22:43.441168 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:43 crc kubenswrapper[4755]: I0317 00:22:43.442721 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:43 crc kubenswrapper[4755]: I0317 00:22:43.442786 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:43 crc kubenswrapper[4755]: I0317 00:22:43.442809 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:44 crc kubenswrapper[4755]: I0317 00:22:44.175802 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:44Z is after 2026-02-23T05:33:13Z Mar 17 00:22:44 
crc kubenswrapper[4755]: I0317 00:22:44.247395 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:44 crc kubenswrapper[4755]: I0317 00:22:44.249007 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:44 crc kubenswrapper[4755]: I0317 00:22:44.249079 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:44 crc kubenswrapper[4755]: I0317 00:22:44.249102 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:44 crc kubenswrapper[4755]: I0317 00:22:44.250064 4755 scope.go:117] "RemoveContainer" containerID="44a13151c24edfc60c4b745f1e880034bd1217fb040684df582bae6b57c0c2a3" Mar 17 00:22:45 crc kubenswrapper[4755]: I0317 00:22:45.177699 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:45Z is after 2026-02-23T05:33:13Z Mar 17 00:22:45 crc kubenswrapper[4755]: I0317 00:22:45.435415 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 17 00:22:45 crc kubenswrapper[4755]: I0317 00:22:45.435981 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 17 00:22:45 crc kubenswrapper[4755]: I0317 00:22:45.438492 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="44a07a1748de0208e8df1c37e392566a54d149f18e23168c3ade590aef356216" exitCode=255 Mar 17 00:22:45 crc 
kubenswrapper[4755]: I0317 00:22:45.438532 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"44a07a1748de0208e8df1c37e392566a54d149f18e23168c3ade590aef356216"} Mar 17 00:22:45 crc kubenswrapper[4755]: I0317 00:22:45.438570 4755 scope.go:117] "RemoveContainer" containerID="44a13151c24edfc60c4b745f1e880034bd1217fb040684df582bae6b57c0c2a3" Mar 17 00:22:45 crc kubenswrapper[4755]: I0317 00:22:45.438749 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:45 crc kubenswrapper[4755]: I0317 00:22:45.440146 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:45 crc kubenswrapper[4755]: I0317 00:22:45.440186 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:45 crc kubenswrapper[4755]: I0317 00:22:45.440202 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:45 crc kubenswrapper[4755]: I0317 00:22:45.440960 4755 scope.go:117] "RemoveContainer" containerID="44a07a1748de0208e8df1c37e392566a54d149f18e23168c3ade590aef356216" Mar 17 00:22:45 crc kubenswrapper[4755]: E0317 00:22:45.441245 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:22:46 crc kubenswrapper[4755]: I0317 00:22:46.158981 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 17 00:22:46 crc 
kubenswrapper[4755]: E0317 00:22:46.162032 4755 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 00:22:46 crc kubenswrapper[4755]: E0317 00:22:46.163284 4755 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 17 00:22:46 crc kubenswrapper[4755]: I0317 00:22:46.173346 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:46Z is after 2026-02-23T05:33:13Z Mar 17 00:22:46 crc kubenswrapper[4755]: E0317 00:22:46.321206 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 00:22:46 crc kubenswrapper[4755]: I0317 00:22:46.440953 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 00:22:46 crc kubenswrapper[4755]: I0317 00:22:46.441024 4755 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 17 00:22:46 crc kubenswrapper[4755]: I0317 00:22:46.444901 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 17 00:22:47 crc kubenswrapper[4755]: I0317 00:22:47.176049 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:47Z is after 2026-02-23T05:33:13Z Mar 17 00:22:47 crc kubenswrapper[4755]: I0317 00:22:47.838509 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:47 crc kubenswrapper[4755]: I0317 00:22:47.840029 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:47 crc kubenswrapper[4755]: I0317 00:22:47.840069 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:47 crc kubenswrapper[4755]: I0317 00:22:47.840081 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:47 crc kubenswrapper[4755]: I0317 00:22:47.840117 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 00:22:47 crc kubenswrapper[4755]: E0317 00:22:47.843024 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:47Z is after 2026-02-23T05:33:13Z" node="crc" Mar 17 00:22:47 crc kubenswrapper[4755]: E0317 00:22:47.854730 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:47Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 17 00:22:48 crc kubenswrapper[4755]: I0317 00:22:48.176057 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:48Z is after 2026-02-23T05:33:13Z Mar 17 00:22:48 crc kubenswrapper[4755]: W0317 00:22:48.679289 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:48Z is after 2026-02-23T05:33:13Z Mar 17 00:22:48 crc kubenswrapper[4755]: E0317 00:22:48.679412 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:48Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 17 00:22:49 crc kubenswrapper[4755]: I0317 00:22:49.175594 4755 csi_plugin.go:884] Failed to contact API server 
when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:49Z is after 2026-02-23T05:33:13Z Mar 17 00:22:49 crc kubenswrapper[4755]: I0317 00:22:49.656482 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:49 crc kubenswrapper[4755]: I0317 00:22:49.656895 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:49 crc kubenswrapper[4755]: I0317 00:22:49.658046 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:49 crc kubenswrapper[4755]: I0317 00:22:49.658120 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:49 crc kubenswrapper[4755]: I0317 00:22:49.658143 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:49 crc kubenswrapper[4755]: I0317 00:22:49.658952 4755 scope.go:117] "RemoveContainer" containerID="44a07a1748de0208e8df1c37e392566a54d149f18e23168c3ade590aef356216" Mar 17 00:22:49 crc kubenswrapper[4755]: E0317 00:22:49.659260 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:22:50 crc kubenswrapper[4755]: I0317 00:22:50.176188 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:50Z is after 2026-02-23T05:33:13Z Mar 17 00:22:50 crc kubenswrapper[4755]: E0317 00:22:50.445489 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:50Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189d790d456345f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.159513072 +0000 UTC m=+0.918965405,LastTimestamp:2026-03-17 00:22:06.159513072 +0000 UTC m=+0.918965405,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:22:51 crc kubenswrapper[4755]: I0317 00:22:51.176702 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:51Z is after 2026-02-23T05:33:13Z Mar 17 00:22:52 crc kubenswrapper[4755]: I0317 00:22:52.173944 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T00:22:52Z is after 2026-02-23T05:33:13Z Mar 17 
00:22:53 crc kubenswrapper[4755]: I0317 00:22:53.176317 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:22:53 crc kubenswrapper[4755]: I0317 00:22:53.319934 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:22:53 crc kubenswrapper[4755]: I0317 00:22:53.320227 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:53 crc kubenswrapper[4755]: I0317 00:22:53.321785 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:53 crc kubenswrapper[4755]: I0317 00:22:53.321845 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:53 crc kubenswrapper[4755]: I0317 00:22:53.321868 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:53 crc kubenswrapper[4755]: I0317 00:22:53.322759 4755 scope.go:117] "RemoveContainer" containerID="44a07a1748de0208e8df1c37e392566a54d149f18e23168c3ade590aef356216" Mar 17 00:22:53 crc kubenswrapper[4755]: E0317 00:22:53.323086 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:22:53 crc kubenswrapper[4755]: W0317 00:22:53.728840 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User 
"system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 17 00:22:53 crc kubenswrapper[4755]: E0317 00:22:53.728922 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 17 00:22:54 crc kubenswrapper[4755]: I0317 00:22:54.179999 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:22:54 crc kubenswrapper[4755]: I0317 00:22:54.843176 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:22:54 crc kubenswrapper[4755]: I0317 00:22:54.844963 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:22:54 crc kubenswrapper[4755]: I0317 00:22:54.845037 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:22:54 crc kubenswrapper[4755]: I0317 00:22:54.845061 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:22:54 crc kubenswrapper[4755]: I0317 00:22:54.845104 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 00:22:54 crc kubenswrapper[4755]: E0317 00:22:54.849617 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 17 00:22:54 crc kubenswrapper[4755]: E0317 00:22:54.862060 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" 
is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 17 00:22:55 crc kubenswrapper[4755]: I0317 00:22:55.173282 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:22:56 crc kubenswrapper[4755]: I0317 00:22:56.179018 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:22:56 crc kubenswrapper[4755]: E0317 00:22:56.321332 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 00:22:56 crc kubenswrapper[4755]: I0317 00:22:56.441997 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 00:22:56 crc kubenswrapper[4755]: I0317 00:22:56.442084 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 17 00:22:57 crc kubenswrapper[4755]: I0317 00:22:57.178405 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:22:58 crc kubenswrapper[4755]: I0317 00:22:58.174923 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:22:59 crc kubenswrapper[4755]: I0317 00:22:59.178701 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:22:59 crc kubenswrapper[4755]: W0317 00:22:59.993674 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 17 00:22:59 crc kubenswrapper[4755]: E0317 00:22:59.993736 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 17 00:23:00 crc kubenswrapper[4755]: I0317 00:23:00.124967 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 00:23:00 crc kubenswrapper[4755]: I0317 00:23:00.125189 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:00 crc kubenswrapper[4755]: I0317 00:23:00.126917 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 
00:23:00 crc kubenswrapper[4755]: I0317 00:23:00.126959 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:00 crc kubenswrapper[4755]: I0317 00:23:00.126975 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:00 crc kubenswrapper[4755]: I0317 00:23:00.178227 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.453133 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d456345f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.159513072 +0000 UTC m=+0.918965405,LastTimestamp:2026-03-17 00:22:06.159513072 +0000 UTC m=+0.918965405,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.459944 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e5b811 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218393617 +0000 UTC m=+0.977845910,LastTimestamp:2026-03-17 00:22:06.218393617 +0000 UTC m=+0.977845910,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.466563 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e5faa1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218410657 +0000 UTC m=+0.977862950,LastTimestamp:2026-03-17 00:22:06.218410657 +0000 UTC m=+0.977862950,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.473113 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e62716 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: 
NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218422038 +0000 UTC m=+0.977874331,LastTimestamp:2026-03-17 00:22:06.218422038 +0000 UTC m=+0.977874331,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.480009 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d4f35812a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.324285738 +0000 UTC m=+1.083738021,LastTimestamp:2026-03-17 00:22:06.324285738 +0000 UTC m=+1.083738021,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.486273 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e5b811\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e5b811 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218393617 +0000 UTC m=+0.977845910,LastTimestamp:2026-03-17 00:22:06.349614354 +0000 
UTC m=+1.109066637,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.497364 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e5faa1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e5faa1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218410657 +0000 UTC m=+0.977862950,LastTimestamp:2026-03-17 00:22:06.349634295 +0000 UTC m=+1.109086568,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.502347 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e62716\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e62716 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218422038 +0000 UTC m=+0.977874331,LastTimestamp:2026-03-17 00:22:06.349645235 +0000 UTC m=+1.109097518,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 
00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.506815 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e5b811\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e5b811 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218393617 +0000 UTC m=+0.977845910,LastTimestamp:2026-03-17 00:22:06.350888782 +0000 UTC m=+1.110341065,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.512981 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e5faa1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e5faa1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218410657 +0000 UTC m=+0.977862950,LastTimestamp:2026-03-17 00:22:06.350901152 +0000 UTC m=+1.110353435,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.517849 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e62716\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e62716 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218422038 +0000 UTC m=+0.977874331,LastTimestamp:2026-03-17 00:22:06.350908333 +0000 UTC m=+1.110360616,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.523907 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e5b811\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e5b811 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218393617 +0000 UTC m=+0.977845910,LastTimestamp:2026-03-17 00:22:06.351292584 +0000 UTC m=+1.110744867,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.531172 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e5faa1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e5faa1 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218410657 +0000 UTC m=+0.977862950,LastTimestamp:2026-03-17 00:22:06.351308604 +0000 UTC m=+1.110760887,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.535356 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e62716\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e62716 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218422038 +0000 UTC m=+0.977874331,LastTimestamp:2026-03-17 00:22:06.351315684 +0000 UTC m=+1.110767967,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.541271 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e5b811\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e5b811 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218393617 +0000 UTC m=+0.977845910,LastTimestamp:2026-03-17 00:22:06.351540691 +0000 UTC m=+1.110992974,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.548559 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e5faa1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e5faa1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218410657 +0000 UTC m=+0.977862950,LastTimestamp:2026-03-17 00:22:06.351548901 +0000 UTC m=+1.111001184,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.555203 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e62716\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e62716 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218422038 +0000 UTC m=+0.977874331,LastTimestamp:2026-03-17 00:22:06.351556891 +0000 UTC m=+1.111009174,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.561794 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e5b811\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e5b811 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218393617 +0000 UTC m=+0.977845910,LastTimestamp:2026-03-17 00:22:06.352538741 +0000 UTC m=+1.111991064,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.568100 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e5faa1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e5faa1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218410657 +0000 UTC 
m=+0.977862950,LastTimestamp:2026-03-17 00:22:06.352571821 +0000 UTC m=+1.112024144,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.574874 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e5b811\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e5b811 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218393617 +0000 UTC m=+0.977845910,LastTimestamp:2026-03-17 00:22:06.352591782 +0000 UTC m=+1.112044065,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.581219 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e5faa1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e5faa1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218410657 +0000 UTC m=+0.977862950,LastTimestamp:2026-03-17 00:22:06.352601212 +0000 UTC m=+1.112053495,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.585689 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e62716\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e62716 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218422038 +0000 UTC m=+0.977874331,LastTimestamp:2026-03-17 00:22:06.352609773 +0000 UTC m=+1.112062056,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.590225 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e5b811\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e5b811 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218393617 +0000 UTC m=+0.977845910,LastTimestamp:2026-03-17 00:22:06.352618283 +0000 UTC m=+1.112070566,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.594605 4755 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e5faa1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e5faa1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218410657 +0000 UTC m=+0.977862950,LastTimestamp:2026-03-17 00:22:06.352629043 +0000 UTC m=+1.112081326,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.598810 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d790d48e62716\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d790d48e62716 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.218422038 +0000 UTC m=+0.977874331,LastTimestamp:2026-03-17 00:22:06.352637403 +0000 UTC m=+1.112089686,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.606522 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790d67bcadca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.735797706 +0000 UTC m=+1.495249999,LastTimestamp:2026-03-17 00:22:06.735797706 +0000 UTC m=+1.495249999,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.612483 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d790d67bdf92a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.735882538 +0000 UTC m=+1.495334841,LastTimestamp:2026-03-17 00:22:06.735882538 +0000 UTC m=+1.495334841,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.618869 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790d684f5e93 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.745411219 +0000 UTC m=+1.504863512,LastTimestamp:2026-03-17 00:22:06.745411219 +0000 UTC m=+1.504863512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.624596 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d790d68fe7d5a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.756887898 +0000 UTC m=+1.516340181,LastTimestamp:2026-03-17 00:22:06.756887898 +0000 UTC m=+1.516340181,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.630046 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d790d69187227 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:06.758588967 +0000 UTC m=+1.518041290,LastTimestamp:2026-03-17 00:22:06.758588967 +0000 UTC m=+1.518041290,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.637694 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790da35a1440 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:07.735968832 +0000 UTC m=+2.495421115,LastTimestamp:2026-03-17 00:22:07.735968832 +0000 UTC m=+2.495421115,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.642858 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d790da360eb47 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:07.736417095 +0000 UTC m=+2.495869378,LastTimestamp:2026-03-17 00:22:07.736417095 +0000 UTC m=+2.495869378,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.650560 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d790da362d7de openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:07.736543198 +0000 UTC m=+2.495995481,LastTimestamp:2026-03-17 00:22:07.736543198 +0000 UTC m=+2.495995481,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.657231 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790da36594ad openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:07.736722605 +0000 UTC m=+2.496174908,LastTimestamp:2026-03-17 00:22:07.736722605 +0000 UTC m=+2.496174908,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.663762 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d790da36b69b0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:07.737104816 +0000 UTC m=+2.496557109,LastTimestamp:2026-03-17 00:22:07.737104816 +0000 UTC m=+2.496557109,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.670162 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d790da3eea323 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:07.745704739 +0000 UTC m=+2.505157022,LastTimestamp:2026-03-17 00:22:07.745704739 +0000 UTC m=+2.505157022,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.676565 4755 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d790da40b6a44 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:07.747590724 +0000 UTC m=+2.507043007,LastTimestamp:2026-03-17 00:22:07.747590724 +0000 UTC m=+2.507043007,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.682680 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d790da417fa93 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:07.748414099 +0000 UTC m=+2.507866382,LastTimestamp:2026-03-17 
00:22:07.748414099 +0000 UTC m=+2.507866382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.688993 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d790da468a7f5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:07.753701365 +0000 UTC m=+2.513153678,LastTimestamp:2026-03-17 00:22:07.753701365 +0000 UTC m=+2.513153678,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.695625 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790da46bc797 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:07.753906071 +0000 UTC 
m=+2.513358364,LastTimestamp:2026-03-17 00:22:07.753906071 +0000 UTC m=+2.513358364,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.703037 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790da4732dab openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:07.754390955 +0000 UTC m=+2.513843248,LastTimestamp:2026-03-17 00:22:07.754390955 +0000 UTC m=+2.513843248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.710273 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d790db5bd9602 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container 
cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.044480002 +0000 UTC m=+2.803932335,LastTimestamp:2026-03-17 00:22:08.044480002 +0000 UTC m=+2.803932335,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.716513 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d790db693a38f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.058508175 +0000 UTC m=+2.817960458,LastTimestamp:2026-03-17 00:22:08.058508175 +0000 UTC m=+2.817960458,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.723099 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d790db6a8f515 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.059905301 +0000 UTC m=+2.819357584,LastTimestamp:2026-03-17 00:22:08.059905301 +0000 UTC m=+2.819357584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.729536 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790dc32df118 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.26994716 +0000 UTC m=+3.029399443,LastTimestamp:2026-03-17 00:22:08.26994716 +0000 UTC m=+3.029399443,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 
00:23:00.736958 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790dc35b0b2e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.272902958 +0000 UTC m=+3.032355281,LastTimestamp:2026-03-17 00:22:08.272902958 +0000 UTC m=+3.032355281,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.744006 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d790dc365f492 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 
00:22:08.273618066 +0000 UTC m=+3.033070349,LastTimestamp:2026-03-17 00:22:08.273618066 +0000 UTC m=+3.033070349,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.751804 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d790dc3eaa92b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.282315051 +0000 UTC m=+3.041767344,LastTimestamp:2026-03-17 00:22:08.282315051 +0000 UTC m=+3.041767344,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.759826 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d790dc4b0e142 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.295305538 +0000 UTC m=+3.054757821,LastTimestamp:2026-03-17 00:22:08.295305538 +0000 UTC m=+3.054757821,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.766528 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d790dc64c2b49 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.322259785 +0000 UTC m=+3.081712068,LastTimestamp:2026-03-17 00:22:08.322259785 +0000 UTC m=+3.081712068,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.771842 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d790dc676f836 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.325064758 +0000 UTC m=+3.084517041,LastTimestamp:2026-03-17 00:22:08.325064758 +0000 UTC m=+3.084517041,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.778661 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d790dd1da6a4d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.516131405 +0000 UTC m=+3.275583688,LastTimestamp:2026-03-17 00:22:08.516131405 +0000 UTC m=+3.275583688,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.786179 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790dd21dc08f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.520544399 +0000 UTC m=+3.279996682,LastTimestamp:2026-03-17 00:22:08.520544399 +0000 UTC m=+3.279996682,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.795807 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790dd22a4843 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.521365571 +0000 UTC m=+3.280817854,LastTimestamp:2026-03-17 00:22:08.521365571 +0000 UTC m=+3.280817854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.804347 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d790dd22ba9f1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.521456113 +0000 UTC m=+3.280908396,LastTimestamp:2026-03-17 00:22:08.521456113 +0000 UTC m=+3.280908396,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.812009 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d790dd2421ce3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container 
kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.522927331 +0000 UTC m=+3.282379624,LastTimestamp:2026-03-17 00:22:08.522927331 +0000 UTC m=+3.282379624,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.819277 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d790dd2daf2aa openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.53294353 +0000 UTC m=+3.292395813,LastTimestamp:2026-03-17 00:22:08.53294353 +0000 UTC m=+3.292395813,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.824254 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d790dd304393b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.535648571 +0000 UTC m=+3.295100854,LastTimestamp:2026-03-17 00:22:08.535648571 +0000 UTC m=+3.295100854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.831195 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d790dd306dd07 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.535821575 +0000 UTC m=+3.295273858,LastTimestamp:2026-03-17 00:22:08.535821575 +0000 UTC m=+3.295273858,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.838677 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d790dd314af17 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.536727319 +0000 UTC m=+3.296179602,LastTimestamp:2026-03-17 00:22:08.536727319 +0000 UTC m=+3.296179602,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.845817 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790dd362db0e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.541850382 +0000 UTC m=+3.301302665,LastTimestamp:2026-03-17 00:22:08.541850382 +0000 UTC m=+3.301302665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.852864 4755 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790dd3630ebc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.541863612 +0000 UTC m=+3.301315895,LastTimestamp:2026-03-17 00:22:08.541863612 +0000 UTC m=+3.301315895,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.859342 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790dd384317e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.544035198 +0000 UTC m=+3.303487481,LastTimestamp:2026-03-17 00:22:08.544035198 +0000 UTC 
m=+3.303487481,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.865983 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d790dddd5e201 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.717160961 +0000 UTC m=+3.476613254,LastTimestamp:2026-03-17 00:22:08.717160961 +0000 UTC m=+3.476613254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.872334 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790dddd7c7ae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 
00:22:08.717285294 +0000 UTC m=+3.476737587,LastTimestamp:2026-03-17 00:22:08.717285294 +0000 UTC m=+3.476737587,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.879860 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790ddf09fa6f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.737352303 +0000 UTC m=+3.496804596,LastTimestamp:2026-03-17 00:22:08.737352303 +0000 UTC m=+3.496804596,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.885181 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790ddf24eb69 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.739117929 +0000 UTC m=+3.498570222,LastTimestamp:2026-03-17 00:22:08.739117929 +0000 UTC m=+3.498570222,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.891510 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d790ddf49723d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.741511741 +0000 UTC m=+3.500964034,LastTimestamp:2026-03-17 00:22:08.741511741 +0000 UTC m=+3.500964034,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.897849 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d790ddf5624e9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.742343913 +0000 UTC m=+3.501796196,LastTimestamp:2026-03-17 00:22:08.742343913 +0000 UTC m=+3.501796196,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.904223 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d790deaf191ba openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.937079226 +0000 UTC m=+3.696531509,LastTimestamp:2026-03-17 00:22:08.937079226 +0000 UTC m=+3.696531509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.908535 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790deafb16a9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.937703081 +0000 UTC m=+3.697155374,LastTimestamp:2026-03-17 00:22:08.937703081 +0000 UTC m=+3.697155374,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.913820 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d790debca5b55 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.951286613 +0000 UTC m=+3.710738906,LastTimestamp:2026-03-17 00:22:08.951286613 +0000 UTC m=+3.710738906,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 
00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.920839 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790dec21e86a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.957024362 +0000 UTC m=+3.716476655,LastTimestamp:2026-03-17 00:22:08.957024362 +0000 UTC m=+3.716476655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.925967 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790dec3003f4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 
00:22:08.957948916 +0000 UTC m=+3.717401209,LastTimestamp:2026-03-17 00:22:08.957948916 +0000 UTC m=+3.717401209,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.931093 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790df6ce46a7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:09.136092839 +0000 UTC m=+3.895545132,LastTimestamp:2026-03-17 00:22:09.136092839 +0000 UTC m=+3.895545132,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.936222 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790df7570402 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container 
kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:09.14505421 +0000 UTC m=+3.904506493,LastTimestamp:2026-03-17 00:22:09.14505421 +0000 UTC m=+3.904506493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.942295 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790df763e03f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:09.145897023 +0000 UTC m=+3.905349306,LastTimestamp:2026-03-17 00:22:09.145897023 +0000 UTC m=+3.905349306,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.948733 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e001f9303 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:09.292415747 +0000 UTC m=+4.051868030,LastTimestamp:2026-03-17 00:22:09.292415747 +0000 UTC m=+4.051868030,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.952751 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790e0320b866 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:09.342822502 +0000 UTC m=+4.102274785,LastTimestamp:2026-03-17 00:22:09.342822502 +0000 UTC m=+4.102274785,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.959077 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790e03b59f33 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:09.352580915 +0000 UTC m=+4.112033198,LastTimestamp:2026-03-17 00:22:09.352580915 +0000 UTC m=+4.112033198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.966393 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e0c2fc7b7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:09.494804407 +0000 UTC m=+4.254256690,LastTimestamp:2026-03-17 00:22:09.494804407 +0000 UTC m=+4.254256690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.972854 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e0d14901c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:09.509797916 +0000 UTC m=+4.269250199,LastTimestamp:2026-03-17 00:22:09.509797916 +0000 UTC m=+4.269250199,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.980373 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e3c98952e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:10.306979118 +0000 UTC m=+5.066431431,LastTimestamp:2026-03-17 00:22:10.306979118 +0000 UTC m=+5.066431431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.986947 4755 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e4abe6cce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:10.544340174 +0000 UTC m=+5.303792467,LastTimestamp:2026-03-17 00:22:10.544340174 +0000 UTC m=+5.303792467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:00 crc kubenswrapper[4755]: E0317 00:23:00.993952 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e4b636332 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:10.555151154 +0000 UTC m=+5.314603477,LastTimestamp:2026-03-17 00:22:10.555151154 +0000 UTC m=+5.314603477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.000160 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e4b780b1b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:10.556504859 +0000 UTC m=+5.315957192,LastTimestamp:2026-03-17 00:22:10.556504859 +0000 UTC m=+5.315957192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.006642 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e5ce4656a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:10.848818538 +0000 UTC m=+5.608270851,LastTimestamp:2026-03-17 00:22:10.848818538 +0000 UTC m=+5.608270851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.010895 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e5dfa9c00 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:10.86705152 +0000 UTC m=+5.626503843,LastTimestamp:2026-03-17 00:22:10.86705152 +0000 UTC m=+5.626503843,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.016896 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e5e13378a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:10.868664202 +0000 UTC m=+5.628116525,LastTimestamp:2026-03-17 00:22:10.868664202 +0000 UTC m=+5.628116525,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.023031 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e6c2ad000 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:11.105091584 +0000 UTC m=+5.864543877,LastTimestamp:2026-03-17 00:22:11.105091584 +0000 UTC m=+5.864543877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.028994 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e6cdafe6c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:11.116637804 +0000 UTC m=+5.876090087,LastTimestamp:2026-03-17 00:22:11.116637804 +0000 UTC m=+5.876090087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.035112 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e6ce9903e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:11.117592638 +0000 UTC m=+5.877044921,LastTimestamp:2026-03-17 00:22:11.117592638 +0000 UTC m=+5.877044921,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.043016 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e78fd5d71 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:11.320216945 +0000 UTC m=+6.079669248,LastTimestamp:2026-03-17 00:22:11.320216945 +0000 UTC m=+6.079669248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.050212 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e7a03a482 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:11.33740557 +0000 UTC m=+6.096857853,LastTimestamp:2026-03-17 00:22:11.33740557 +0000 UTC m=+6.096857853,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.056560 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e7a160033 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:11.338608691 +0000 UTC m=+6.098060974,LastTimestamp:2026-03-17 00:22:11.338608691 +0000 UTC m=+6.098060974,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.063350 4755 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e868a37cb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:11.547551691 +0000 UTC m=+6.307003974,LastTimestamp:2026-03-17 00:22:11.547551691 +0000 UTC m=+6.307003974,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.069937 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d790e87555f78 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:11.560865656 +0000 UTC m=+6.320317939,LastTimestamp:2026-03-17 00:22:11.560865656 +0000 UTC m=+6.320317939,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.079477 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 17 00:23:01 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-controller-manager-crc.189d790faa3bf1cc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 17 00:23:01 crc kubenswrapper[4755]: body: Mar 17 00:23:01 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:16.441369036 +0000 UTC m=+11.200821359,LastTimestamp:2026-03-17 00:22:16.441369036 +0000 UTC m=+11.200821359,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 00:23:01 crc kubenswrapper[4755]: > Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.085908 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d790faa3daa97 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:16.441481879 +0000 UTC m=+11.200934192,LastTimestamp:2026-03-17 00:22:16.441481879 +0000 UTC m=+11.200934192,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.093529 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 17 00:23:01 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-apiserver-crc.189d791069e32abe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 17 00:23:01 crc kubenswrapper[4755]: body: Mar 17 00:23:01 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:19.656776382 +0000 UTC m=+14.416228665,LastTimestamp:2026-03-17 00:22:19.656776382 +0000 UTC m=+14.416228665,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 00:23:01 crc kubenswrapper[4755]: > Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.100121 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d791069e419a7 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:19.656837543 +0000 UTC m=+14.416289826,LastTimestamp:2026-03-17 00:22:19.656837543 +0000 UTC m=+14.416289826,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.106738 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 17 00:23:01 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-apiserver-crc.189d7910981c1a56 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 17 00:23:01 crc kubenswrapper[4755]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 17 00:23:01 crc kubenswrapper[4755]: Mar 17 00:23:01 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:20.43225967 +0000 UTC 
m=+15.191711973,LastTimestamp:2026-03-17 00:22:20.43225967 +0000 UTC m=+15.191711973,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 00:23:01 crc kubenswrapper[4755]: > Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.113100 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7910981d4522 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:20.432336162 +0000 UTC m=+15.191788455,LastTimestamp:2026-03-17 00:22:20.432336162 +0000 UTC m=+15.191788455,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.118569 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d7910981c1a56\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 17 00:23:01 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-apiserver-crc.189d7910981c1a56 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 17 00:23:01 crc kubenswrapper[4755]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 17 00:23:01 crc kubenswrapper[4755]: Mar 17 00:23:01 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:20.43225967 +0000 UTC m=+15.191711973,LastTimestamp:2026-03-17 00:22:20.444066905 +0000 UTC m=+15.203519198,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 00:23:01 crc kubenswrapper[4755]: > Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.123883 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d7910981d4522\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d7910981d4522 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:20.432336162 +0000 UTC m=+15.191788455,LastTimestamp:2026-03-17 00:22:20.444108586 +0000 UTC m=+15.203560879,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.130946 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d790df763e03f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d790df763e03f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:09.145897023 +0000 UTC m=+3.905349306,LastTimestamp:2026-03-17 00:22:21.355987732 +0000 UTC m=+16.115440055,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.142015 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 17 00:23:01 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-controller-manager-crc.189d7911fe4d18fc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 17 00:23:01 crc kubenswrapper[4755]: body: Mar 17 00:23:01 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:26.441713916 +0000 UTC m=+21.201166289,LastTimestamp:2026-03-17 00:22:26.441713916 +0000 UTC m=+21.201166289,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 00:23:01 crc kubenswrapper[4755]: > Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.149695 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7911fe4e95d0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:26.441811408 +0000 UTC m=+21.201263741,LastTimestamp:2026-03-17 00:22:26.441811408 +0000 UTC 
m=+21.201263741,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.157658 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d7911fe4d18fc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 17 00:23:01 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-controller-manager-crc.189d7911fe4d18fc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 17 00:23:01 crc kubenswrapper[4755]: body: Mar 17 00:23:01 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:26.441713916 +0000 UTC m=+21.201166289,LastTimestamp:2026-03-17 00:22:36.441848229 +0000 UTC m=+31.201300552,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 00:23:01 crc kubenswrapper[4755]: > Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.161396 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d7911fe4e95d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7911fe4e95d0 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:26.441811408 +0000 UTC m=+21.201263741,LastTimestamp:2026-03-17 00:22:36.441921821 +0000 UTC m=+31.201374144,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.165593 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d791452897868 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:36.44489124 +0000 UTC m=+31.204343593,LastTimestamp:2026-03-17 00:22:36.44489124 +0000 UTC m=+31.204343593,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc 
kubenswrapper[4755]: E0317 00:23:01.171499 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d790da417fa93\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d790da417fa93 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:07.748414099 +0000 UTC m=+2.507866382,LastTimestamp:2026-03-17 00:22:36.568112641 +0000 UTC m=+31.327564954,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.177714 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d790db5bd9602\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d790db5bd9602 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created 
container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.044480002 +0000 UTC m=+2.803932335,LastTimestamp:2026-03-17 00:22:36.845258658 +0000 UTC m=+31.604710951,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: I0317 00:23:01.177962 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.181231 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d790db693a38f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d790db693a38f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:08.058508175 +0000 UTC m=+2.817960458,LastTimestamp:2026-03-17 00:22:36.858217391 +0000 UTC m=+31.617669704,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.185595 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d7911fe4d18fc\" is forbidden: 
User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 17 00:23:01 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-controller-manager-crc.189d7911fe4d18fc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 17 00:23:01 crc kubenswrapper[4755]: body: Mar 17 00:23:01 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:26.441713916 +0000 UTC m=+21.201166289,LastTimestamp:2026-03-17 00:22:46.441004084 +0000 UTC m=+41.200456407,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 00:23:01 crc kubenswrapper[4755]: > Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.190068 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d7911fe4e95d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d7911fe4e95d0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup 
probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:26.441811408 +0000 UTC m=+21.201263741,LastTimestamp:2026-03-17 00:22:46.441056405 +0000 UTC m=+41.200508728,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.193572 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d7911fe4d18fc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 17 00:23:01 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-controller-manager-crc.189d7911fe4d18fc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 17 00:23:01 crc kubenswrapper[4755]: body: Mar 17 00:23:01 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:22:26.441713916 +0000 UTC m=+21.201166289,LastTimestamp:2026-03-17 00:22:56.442054319 +0000 UTC m=+51.201506602,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 00:23:01 crc kubenswrapper[4755]: > Mar 17 00:23:01 crc kubenswrapper[4755]: I0317 00:23:01.850072 
4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:01 crc kubenswrapper[4755]: I0317 00:23:01.851761 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:01 crc kubenswrapper[4755]: I0317 00:23:01.851810 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:01 crc kubenswrapper[4755]: I0317 00:23:01.851825 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:01 crc kubenswrapper[4755]: I0317 00:23:01.851854 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.859470 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 17 00:23:01 crc kubenswrapper[4755]: E0317 00:23:01.870514 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 17 00:23:02 crc kubenswrapper[4755]: I0317 00:23:02.175967 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:03 crc kubenswrapper[4755]: I0317 00:23:03.176891 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:04 crc 
kubenswrapper[4755]: I0317 00:23:04.178529 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:04 crc kubenswrapper[4755]: I0317 00:23:04.283857 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:23:04 crc kubenswrapper[4755]: I0317 00:23:04.284117 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:04 crc kubenswrapper[4755]: I0317 00:23:04.285768 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:04 crc kubenswrapper[4755]: I0317 00:23:04.285822 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:04 crc kubenswrapper[4755]: I0317 00:23:04.285834 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:04 crc kubenswrapper[4755]: I0317 00:23:04.288361 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:23:04 crc kubenswrapper[4755]: I0317 00:23:04.501685 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:04 crc kubenswrapper[4755]: I0317 00:23:04.502785 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:04 crc kubenswrapper[4755]: I0317 00:23:04.502847 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:04 crc kubenswrapper[4755]: I0317 00:23:04.502870 4755 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:05 crc kubenswrapper[4755]: I0317 00:23:05.177145 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:06 crc kubenswrapper[4755]: I0317 00:23:06.176463 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:06 crc kubenswrapper[4755]: I0317 00:23:06.247375 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:06 crc kubenswrapper[4755]: I0317 00:23:06.248765 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:06 crc kubenswrapper[4755]: I0317 00:23:06.248807 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:06 crc kubenswrapper[4755]: I0317 00:23:06.248817 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:06 crc kubenswrapper[4755]: I0317 00:23:06.249504 4755 scope.go:117] "RemoveContainer" containerID="44a07a1748de0208e8df1c37e392566a54d149f18e23168c3ade590aef356216" Mar 17 00:23:06 crc kubenswrapper[4755]: E0317 00:23:06.321596 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 00:23:06 crc kubenswrapper[4755]: I0317 00:23:06.511193 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 17 00:23:06 crc 
kubenswrapper[4755]: I0317 00:23:06.513998 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a165024ab5ef6ccc7c4c35507372052cfaf3dfe84bac4d4cd3fbc6fbb834b87a"} Mar 17 00:23:06 crc kubenswrapper[4755]: I0317 00:23:06.514194 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:06 crc kubenswrapper[4755]: I0317 00:23:06.515245 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:06 crc kubenswrapper[4755]: I0317 00:23:06.515284 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:06 crc kubenswrapper[4755]: I0317 00:23:06.515297 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:07 crc kubenswrapper[4755]: I0317 00:23:07.177045 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:08 crc kubenswrapper[4755]: I0317 00:23:08.175514 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:08 crc kubenswrapper[4755]: I0317 00:23:08.520394 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 17 00:23:08 crc kubenswrapper[4755]: I0317 00:23:08.520954 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 17 00:23:08 crc kubenswrapper[4755]: I0317 00:23:08.522418 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a165024ab5ef6ccc7c4c35507372052cfaf3dfe84bac4d4cd3fbc6fbb834b87a" exitCode=255 Mar 17 00:23:08 crc kubenswrapper[4755]: I0317 00:23:08.522469 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a165024ab5ef6ccc7c4c35507372052cfaf3dfe84bac4d4cd3fbc6fbb834b87a"} Mar 17 00:23:08 crc kubenswrapper[4755]: I0317 00:23:08.522509 4755 scope.go:117] "RemoveContainer" containerID="44a07a1748de0208e8df1c37e392566a54d149f18e23168c3ade590aef356216" Mar 17 00:23:08 crc kubenswrapper[4755]: I0317 00:23:08.522695 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:08 crc kubenswrapper[4755]: I0317 00:23:08.524596 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:08 crc kubenswrapper[4755]: I0317 00:23:08.524640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:08 crc kubenswrapper[4755]: I0317 00:23:08.524655 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:08 crc kubenswrapper[4755]: I0317 00:23:08.526594 4755 scope.go:117] "RemoveContainer" containerID="a165024ab5ef6ccc7c4c35507372052cfaf3dfe84bac4d4cd3fbc6fbb834b87a" Mar 17 00:23:08 crc kubenswrapper[4755]: E0317 00:23:08.526993 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:23:08 crc kubenswrapper[4755]: I0317 00:23:08.859774 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:08 crc kubenswrapper[4755]: I0317 00:23:08.861074 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:08 crc kubenswrapper[4755]: I0317 00:23:08.861204 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:08 crc kubenswrapper[4755]: I0317 00:23:08.861299 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:08 crc kubenswrapper[4755]: I0317 00:23:08.861404 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 00:23:08 crc kubenswrapper[4755]: E0317 00:23:08.865547 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 17 00:23:08 crc kubenswrapper[4755]: E0317 00:23:08.875713 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 17 00:23:09 crc kubenswrapper[4755]: I0317 00:23:09.176153 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:09 crc kubenswrapper[4755]: I0317 
00:23:09.525734 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 17 00:23:09 crc kubenswrapper[4755]: I0317 00:23:09.656642 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:23:09 crc kubenswrapper[4755]: I0317 00:23:09.656876 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:09 crc kubenswrapper[4755]: I0317 00:23:09.658131 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:09 crc kubenswrapper[4755]: I0317 00:23:09.658166 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:09 crc kubenswrapper[4755]: I0317 00:23:09.658179 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:09 crc kubenswrapper[4755]: I0317 00:23:09.658834 4755 scope.go:117] "RemoveContainer" containerID="a165024ab5ef6ccc7c4c35507372052cfaf3dfe84bac4d4cd3fbc6fbb834b87a" Mar 17 00:23:09 crc kubenswrapper[4755]: E0317 00:23:09.659017 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:23:10 crc kubenswrapper[4755]: I0317 00:23:10.182551 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Mar 17 00:23:11 crc kubenswrapper[4755]: I0317 00:23:11.177934 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:12 crc kubenswrapper[4755]: I0317 00:23:12.175600 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:13 crc kubenswrapper[4755]: I0317 00:23:13.176174 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:13 crc kubenswrapper[4755]: I0317 00:23:13.320404 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:23:13 crc kubenswrapper[4755]: I0317 00:23:13.321060 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:13 crc kubenswrapper[4755]: I0317 00:23:13.322510 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:13 crc kubenswrapper[4755]: I0317 00:23:13.322563 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:13 crc kubenswrapper[4755]: I0317 00:23:13.322574 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:13 crc kubenswrapper[4755]: I0317 00:23:13.323092 4755 scope.go:117] "RemoveContainer" containerID="a165024ab5ef6ccc7c4c35507372052cfaf3dfe84bac4d4cd3fbc6fbb834b87a" Mar 17 00:23:13 crc 
kubenswrapper[4755]: E0317 00:23:13.323249 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:23:14 crc kubenswrapper[4755]: I0317 00:23:14.178942 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:15 crc kubenswrapper[4755]: I0317 00:23:15.177827 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:15 crc kubenswrapper[4755]: I0317 00:23:15.866299 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:15 crc kubenswrapper[4755]: I0317 00:23:15.867671 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:15 crc kubenswrapper[4755]: I0317 00:23:15.867763 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:15 crc kubenswrapper[4755]: I0317 00:23:15.867791 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:15 crc kubenswrapper[4755]: I0317 00:23:15.867835 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 00:23:15 crc kubenswrapper[4755]: E0317 00:23:15.872809 4755 kubelet_node_status.go:99] "Unable to register node with API 
server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 17 00:23:15 crc kubenswrapper[4755]: E0317 00:23:15.881781 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 17 00:23:16 crc kubenswrapper[4755]: I0317 00:23:16.178758 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:16 crc kubenswrapper[4755]: E0317 00:23:16.322595 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 00:23:17 crc kubenswrapper[4755]: I0317 00:23:17.175864 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:18 crc kubenswrapper[4755]: I0317 00:23:18.164552 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 17 00:23:18 crc kubenswrapper[4755]: I0317 00:23:18.178228 4755 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 17 00:23:18 crc kubenswrapper[4755]: I0317 00:23:18.178534 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:19 crc kubenswrapper[4755]: I0317 00:23:19.176371 
4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:19 crc kubenswrapper[4755]: W0317 00:23:19.932972 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 17 00:23:19 crc kubenswrapper[4755]: E0317 00:23:19.933055 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 17 00:23:20 crc kubenswrapper[4755]: I0317 00:23:20.176117 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:20 crc kubenswrapper[4755]: W0317 00:23:20.854408 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 17 00:23:20 crc kubenswrapper[4755]: E0317 00:23:20.854491 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 17 00:23:21 crc kubenswrapper[4755]: I0317 00:23:21.177943 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 17 00:23:21 crc kubenswrapper[4755]: I0317 00:23:21.715478 4755 csr.go:261] certificate signing request csr-ppcl5 is approved, waiting to be issued Mar 17 00:23:21 crc kubenswrapper[4755]: I0317 00:23:21.727242 4755 csr.go:257] certificate signing request csr-ppcl5 is issued Mar 17 00:23:21 crc kubenswrapper[4755]: I0317 00:23:21.755656 4755 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.000895 4755 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.756684 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-08 06:33:17.821479328 +0000 UTC Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.756762 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6390h9m55.064727603s for next certificate rotation Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.873493 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.875195 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.875251 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.875270 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.875388 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 00:23:22 crc 
kubenswrapper[4755]: I0317 00:23:22.888194 4755 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.888535 4755 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 17 00:23:22 crc kubenswrapper[4755]: E0317 00:23:22.888564 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.893519 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.893588 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.893605 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.893631 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.893671 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:22Z","lastTransitionTime":"2026-03-17T00:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:22 crc kubenswrapper[4755]: E0317 00:23:22.913348 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.922554 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.922579 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.922590 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.922606 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.922619 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:22Z","lastTransitionTime":"2026-03-17T00:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:22 crc kubenswrapper[4755]: E0317 00:23:22.933607 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.944156 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.944195 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.944211 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.944230 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.944244 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:22Z","lastTransitionTime":"2026-03-17T00:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:22 crc kubenswrapper[4755]: E0317 00:23:22.959579 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.970086 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.970153 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.970183 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.970217 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:22 crc kubenswrapper[4755]: I0317 00:23:22.970241 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:22Z","lastTransitionTime":"2026-03-17T00:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:22 crc kubenswrapper[4755]: E0317 00:23:22.983281 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:22 crc kubenswrapper[4755]: E0317 00:23:22.983624 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 00:23:22 crc kubenswrapper[4755]: E0317 00:23:22.983668 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:23 crc kubenswrapper[4755]: E0317 00:23:23.083811 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:23 crc kubenswrapper[4755]: E0317 00:23:23.184814 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:23 crc kubenswrapper[4755]: E0317 00:23:23.285234 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:23 crc kubenswrapper[4755]: I0317 00:23:23.336016 4755 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 17 00:23:23 crc kubenswrapper[4755]: E0317 00:23:23.385893 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:23 crc kubenswrapper[4755]: E0317 00:23:23.486642 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:23 crc kubenswrapper[4755]: E0317 
00:23:23.586749 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:23 crc kubenswrapper[4755]: E0317 00:23:23.687476 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:23 crc kubenswrapper[4755]: E0317 00:23:23.787748 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:23 crc kubenswrapper[4755]: E0317 00:23:23.888174 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:23 crc kubenswrapper[4755]: E0317 00:23:23.989258 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:24 crc kubenswrapper[4755]: E0317 00:23:24.089825 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:24 crc kubenswrapper[4755]: E0317 00:23:24.190607 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:24 crc kubenswrapper[4755]: I0317 00:23:24.247751 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:24 crc kubenswrapper[4755]: I0317 00:23:24.249340 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:24 crc kubenswrapper[4755]: I0317 00:23:24.249397 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:24 crc kubenswrapper[4755]: I0317 00:23:24.249414 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:24 crc kubenswrapper[4755]: E0317 00:23:24.290848 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
17 00:23:24 crc kubenswrapper[4755]: E0317 00:23:24.391475 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:24 crc kubenswrapper[4755]: E0317 00:23:24.491797 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:24 crc kubenswrapper[4755]: E0317 00:23:24.591947 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:24 crc kubenswrapper[4755]: E0317 00:23:24.692553 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:24 crc kubenswrapper[4755]: E0317 00:23:24.793619 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:24 crc kubenswrapper[4755]: E0317 00:23:24.894627 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:24 crc kubenswrapper[4755]: E0317 00:23:24.995540 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:25 crc kubenswrapper[4755]: E0317 00:23:25.096548 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:25 crc kubenswrapper[4755]: E0317 00:23:25.197531 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:25 crc kubenswrapper[4755]: E0317 00:23:25.297762 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:25 crc kubenswrapper[4755]: E0317 00:23:25.398629 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:25 crc kubenswrapper[4755]: E0317 00:23:25.499388 4755 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 17 00:23:25 crc kubenswrapper[4755]: E0317 00:23:25.599791 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:25 crc kubenswrapper[4755]: E0317 00:23:25.700568 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:25 crc kubenswrapper[4755]: E0317 00:23:25.801738 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:25 crc kubenswrapper[4755]: E0317 00:23:25.901949 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:26 crc kubenswrapper[4755]: E0317 00:23:26.002386 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:26 crc kubenswrapper[4755]: E0317 00:23:26.103059 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:26 crc kubenswrapper[4755]: E0317 00:23:26.203365 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:26 crc kubenswrapper[4755]: E0317 00:23:26.304586 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:26 crc kubenswrapper[4755]: E0317 00:23:26.322815 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 00:23:26 crc kubenswrapper[4755]: E0317 00:23:26.404805 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:26 crc kubenswrapper[4755]: E0317 00:23:26.505837 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:26 crc kubenswrapper[4755]: E0317 00:23:26.606805 4755 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:26 crc kubenswrapper[4755]: E0317 00:23:26.707372 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:26 crc kubenswrapper[4755]: E0317 00:23:26.807569 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:26 crc kubenswrapper[4755]: E0317 00:23:26.907722 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:27 crc kubenswrapper[4755]: E0317 00:23:27.008179 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:27 crc kubenswrapper[4755]: E0317 00:23:27.108984 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:27 crc kubenswrapper[4755]: E0317 00:23:27.210035 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:27 crc kubenswrapper[4755]: E0317 00:23:27.311046 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:27 crc kubenswrapper[4755]: E0317 00:23:27.411891 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:27 crc kubenswrapper[4755]: E0317 00:23:27.512208 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:27 crc kubenswrapper[4755]: E0317 00:23:27.613189 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:27 crc kubenswrapper[4755]: E0317 00:23:27.714080 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:27 crc 
kubenswrapper[4755]: E0317 00:23:27.814789 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:27 crc kubenswrapper[4755]: E0317 00:23:27.915177 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:28 crc kubenswrapper[4755]: E0317 00:23:28.016280 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:28 crc kubenswrapper[4755]: E0317 00:23:28.116548 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:28 crc kubenswrapper[4755]: E0317 00:23:28.217142 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:28 crc kubenswrapper[4755]: E0317 00:23:28.317598 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:28 crc kubenswrapper[4755]: E0317 00:23:28.418053 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:28 crc kubenswrapper[4755]: E0317 00:23:28.519221 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:28 crc kubenswrapper[4755]: E0317 00:23:28.620217 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:28 crc kubenswrapper[4755]: E0317 00:23:28.720758 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:28 crc kubenswrapper[4755]: E0317 00:23:28.821086 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:28 crc kubenswrapper[4755]: E0317 00:23:28.921981 4755 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 17 00:23:29 crc kubenswrapper[4755]: E0317 00:23:29.022795 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:29 crc kubenswrapper[4755]: E0317 00:23:29.123401 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:29 crc kubenswrapper[4755]: E0317 00:23:29.223812 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:29 crc kubenswrapper[4755]: I0317 00:23:29.247277 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:29 crc kubenswrapper[4755]: I0317 00:23:29.249056 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:29 crc kubenswrapper[4755]: I0317 00:23:29.249109 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:29 crc kubenswrapper[4755]: I0317 00:23:29.249122 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:29 crc kubenswrapper[4755]: I0317 00:23:29.249695 4755 scope.go:117] "RemoveContainer" containerID="a165024ab5ef6ccc7c4c35507372052cfaf3dfe84bac4d4cd3fbc6fbb834b87a" Mar 17 00:23:29 crc kubenswrapper[4755]: E0317 00:23:29.249866 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:23:29 crc kubenswrapper[4755]: E0317 00:23:29.324216 4755 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 17 00:23:29 crc kubenswrapper[4755]: E0317 00:23:29.424758 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:29 crc kubenswrapper[4755]: E0317 00:23:29.524906 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:29 crc kubenswrapper[4755]: E0317 00:23:29.625510 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:29 crc kubenswrapper[4755]: E0317 00:23:29.726502 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:29 crc kubenswrapper[4755]: E0317 00:23:29.827010 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:29 crc kubenswrapper[4755]: E0317 00:23:29.927570 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:30 crc kubenswrapper[4755]: E0317 00:23:30.028367 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:30 crc kubenswrapper[4755]: E0317 00:23:30.128934 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:30 crc kubenswrapper[4755]: E0317 00:23:30.229891 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:30 crc kubenswrapper[4755]: E0317 00:23:30.330672 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:30 crc kubenswrapper[4755]: E0317 00:23:30.431805 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:30 crc kubenswrapper[4755]: E0317 00:23:30.532789 4755 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:30 crc kubenswrapper[4755]: E0317 00:23:30.633410 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:30 crc kubenswrapper[4755]: E0317 00:23:30.734310 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:30 crc kubenswrapper[4755]: E0317 00:23:30.834794 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:30 crc kubenswrapper[4755]: E0317 00:23:30.935577 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:31 crc kubenswrapper[4755]: E0317 00:23:31.036414 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:31 crc kubenswrapper[4755]: E0317 00:23:31.137231 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:31 crc kubenswrapper[4755]: E0317 00:23:31.237923 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:31 crc kubenswrapper[4755]: E0317 00:23:31.338816 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:31 crc kubenswrapper[4755]: E0317 00:23:31.439511 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:31 crc kubenswrapper[4755]: E0317 00:23:31.540015 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:31 crc kubenswrapper[4755]: E0317 00:23:31.640867 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:31 crc 
kubenswrapper[4755]: E0317 00:23:31.741402 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:31 crc kubenswrapper[4755]: E0317 00:23:31.841562 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:31 crc kubenswrapper[4755]: E0317 00:23:31.942639 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:32 crc kubenswrapper[4755]: E0317 00:23:32.043526 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:32 crc kubenswrapper[4755]: E0317 00:23:32.144133 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:32 crc kubenswrapper[4755]: E0317 00:23:32.245034 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:32 crc kubenswrapper[4755]: E0317 00:23:32.345846 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:32 crc kubenswrapper[4755]: E0317 00:23:32.446622 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:32 crc kubenswrapper[4755]: E0317 00:23:32.547000 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:32 crc kubenswrapper[4755]: E0317 00:23:32.647173 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:32 crc kubenswrapper[4755]: E0317 00:23:32.747951 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:32 crc kubenswrapper[4755]: E0317 00:23:32.848631 4755 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 17 00:23:32 crc kubenswrapper[4755]: E0317 00:23:32.949281 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:33 crc kubenswrapper[4755]: E0317 00:23:33.050049 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:33 crc kubenswrapper[4755]: E0317 00:23:33.084371 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.088380 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.088430 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.088468 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.088489 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.088503 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:33Z","lastTransitionTime":"2026-03-17T00:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:33 crc kubenswrapper[4755]: E0317 00:23:33.097836 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.102706 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.102740 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.102753 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.102769 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.102781 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:33Z","lastTransitionTime":"2026-03-17T00:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:33 crc kubenswrapper[4755]: E0317 00:23:33.117039 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.121037 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.121097 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.121117 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.121141 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.121161 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:33Z","lastTransitionTime":"2026-03-17T00:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:33 crc kubenswrapper[4755]: E0317 00:23:33.131644 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.136302 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.136403 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.136500 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.136531 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.136549 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:33Z","lastTransitionTime":"2026-03-17T00:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:33 crc kubenswrapper[4755]: E0317 00:23:33.153099 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:33 crc kubenswrapper[4755]: E0317 00:23:33.153326 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 00:23:33 crc kubenswrapper[4755]: E0317 00:23:33.153367 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.248010 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.249514 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.249577 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:33 crc kubenswrapper[4755]: I0317 00:23:33.249602 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:33 crc kubenswrapper[4755]: E0317 00:23:33.253474 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:33 crc kubenswrapper[4755]: E0317 00:23:33.354080 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:33 crc kubenswrapper[4755]: E0317 00:23:33.454541 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:33 crc kubenswrapper[4755]: E0317 00:23:33.555537 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:33 crc kubenswrapper[4755]: E0317 00:23:33.656684 
4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:33 crc kubenswrapper[4755]: E0317 00:23:33.757572 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:33 crc kubenswrapper[4755]: E0317 00:23:33.857899 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:33 crc kubenswrapper[4755]: E0317 00:23:33.958269 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:34 crc kubenswrapper[4755]: E0317 00:23:34.059037 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:34 crc kubenswrapper[4755]: E0317 00:23:34.159346 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:34 crc kubenswrapper[4755]: E0317 00:23:34.260340 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:34 crc kubenswrapper[4755]: E0317 00:23:34.361147 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:34 crc kubenswrapper[4755]: E0317 00:23:34.461735 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:34 crc kubenswrapper[4755]: E0317 00:23:34.562216 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:34 crc kubenswrapper[4755]: E0317 00:23:34.662891 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:34 crc kubenswrapper[4755]: E0317 00:23:34.763008 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:34 crc 
kubenswrapper[4755]: E0317 00:23:34.864152 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:34 crc kubenswrapper[4755]: E0317 00:23:34.964968 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:35 crc kubenswrapper[4755]: E0317 00:23:35.065300 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:35 crc kubenswrapper[4755]: E0317 00:23:35.166494 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:35 crc kubenswrapper[4755]: E0317 00:23:35.267089 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:35 crc kubenswrapper[4755]: E0317 00:23:35.367457 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:35 crc kubenswrapper[4755]: E0317 00:23:35.467797 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:35 crc kubenswrapper[4755]: E0317 00:23:35.567938 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:35 crc kubenswrapper[4755]: E0317 00:23:35.668211 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:35 crc kubenswrapper[4755]: E0317 00:23:35.768542 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:35 crc kubenswrapper[4755]: E0317 00:23:35.868677 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:35 crc kubenswrapper[4755]: E0317 00:23:35.969396 4755 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 17 00:23:36 crc kubenswrapper[4755]: E0317 00:23:36.070215 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:36 crc kubenswrapper[4755]: E0317 00:23:36.170590 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:36 crc kubenswrapper[4755]: E0317 00:23:36.270901 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:36 crc kubenswrapper[4755]: E0317 00:23:36.323825 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 00:23:36 crc kubenswrapper[4755]: E0317 00:23:36.371404 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:36 crc kubenswrapper[4755]: E0317 00:23:36.475923 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:36 crc kubenswrapper[4755]: E0317 00:23:36.576497 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:36 crc kubenswrapper[4755]: E0317 00:23:36.677372 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:36 crc kubenswrapper[4755]: E0317 00:23:36.777805 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:36 crc kubenswrapper[4755]: E0317 00:23:36.878699 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:36 crc kubenswrapper[4755]: E0317 00:23:36.978828 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:37 crc kubenswrapper[4755]: E0317 00:23:37.080035 4755 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:37 crc kubenswrapper[4755]: E0317 00:23:37.180414 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:37 crc kubenswrapper[4755]: E0317 00:23:37.280764 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:37 crc kubenswrapper[4755]: E0317 00:23:37.381087 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:37 crc kubenswrapper[4755]: E0317 00:23:37.481564 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:37 crc kubenswrapper[4755]: E0317 00:23:37.582310 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:37 crc kubenswrapper[4755]: E0317 00:23:37.682659 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:37 crc kubenswrapper[4755]: E0317 00:23:37.783301 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:37 crc kubenswrapper[4755]: E0317 00:23:37.884555 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:37 crc kubenswrapper[4755]: E0317 00:23:37.985754 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:38 crc kubenswrapper[4755]: E0317 00:23:38.086327 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:38 crc kubenswrapper[4755]: E0317 00:23:38.187518 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:38 crc 
kubenswrapper[4755]: E0317 00:23:38.287626 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:38 crc kubenswrapper[4755]: E0317 00:23:38.387814 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:38 crc kubenswrapper[4755]: E0317 00:23:38.488530 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:38 crc kubenswrapper[4755]: E0317 00:23:38.589230 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:38 crc kubenswrapper[4755]: E0317 00:23:38.689735 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:38 crc kubenswrapper[4755]: E0317 00:23:38.789894 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:38 crc kubenswrapper[4755]: E0317 00:23:38.890610 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:38 crc kubenswrapper[4755]: E0317 00:23:38.990743 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:39 crc kubenswrapper[4755]: E0317 00:23:39.091799 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:39 crc kubenswrapper[4755]: E0317 00:23:39.192615 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:39 crc kubenswrapper[4755]: E0317 00:23:39.293638 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:39 crc kubenswrapper[4755]: E0317 00:23:39.394679 4755 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 17 00:23:39 crc kubenswrapper[4755]: E0317 00:23:39.495786 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:39 crc kubenswrapper[4755]: E0317 00:23:39.596188 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:39 crc kubenswrapper[4755]: E0317 00:23:39.697181 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:39 crc kubenswrapper[4755]: E0317 00:23:39.797325 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:39 crc kubenswrapper[4755]: E0317 00:23:39.898433 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:39 crc kubenswrapper[4755]: E0317 00:23:39.999387 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:40 crc kubenswrapper[4755]: E0317 00:23:40.100064 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:40 crc kubenswrapper[4755]: E0317 00:23:40.200258 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:40 crc kubenswrapper[4755]: I0317 00:23:40.247668 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:40 crc kubenswrapper[4755]: I0317 00:23:40.249279 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:40 crc kubenswrapper[4755]: I0317 00:23:40.249358 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:40 crc kubenswrapper[4755]: I0317 00:23:40.249383 4755 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:40 crc kubenswrapper[4755]: I0317 00:23:40.250645 4755 scope.go:117] "RemoveContainer" containerID="a165024ab5ef6ccc7c4c35507372052cfaf3dfe84bac4d4cd3fbc6fbb834b87a" Mar 17 00:23:40 crc kubenswrapper[4755]: E0317 00:23:40.251043 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:23:40 crc kubenswrapper[4755]: E0317 00:23:40.300873 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:40 crc kubenswrapper[4755]: E0317 00:23:40.401768 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:40 crc kubenswrapper[4755]: E0317 00:23:40.502107 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:40 crc kubenswrapper[4755]: E0317 00:23:40.603035 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:40 crc kubenswrapper[4755]: E0317 00:23:40.703882 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:40 crc kubenswrapper[4755]: E0317 00:23:40.804319 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:40 crc kubenswrapper[4755]: E0317 00:23:40.905071 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:41 crc kubenswrapper[4755]: E0317 00:23:41.005618 4755 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:41 crc kubenswrapper[4755]: E0317 00:23:41.106043 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:41 crc kubenswrapper[4755]: E0317 00:23:41.206467 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:41 crc kubenswrapper[4755]: E0317 00:23:41.307222 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:41 crc kubenswrapper[4755]: E0317 00:23:41.408400 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:41 crc kubenswrapper[4755]: E0317 00:23:41.509193 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:41 crc kubenswrapper[4755]: E0317 00:23:41.609900 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:41 crc kubenswrapper[4755]: E0317 00:23:41.711129 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:41 crc kubenswrapper[4755]: E0317 00:23:41.812082 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:41 crc kubenswrapper[4755]: E0317 00:23:41.912157 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:42 crc kubenswrapper[4755]: E0317 00:23:42.012294 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:42 crc kubenswrapper[4755]: E0317 00:23:42.113179 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:42 crc 
kubenswrapper[4755]: E0317 00:23:42.213921 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:42 crc kubenswrapper[4755]: E0317 00:23:42.314982 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:42 crc kubenswrapper[4755]: E0317 00:23:42.415383 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:42 crc kubenswrapper[4755]: E0317 00:23:42.515593 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:42 crc kubenswrapper[4755]: E0317 00:23:42.616647 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:42 crc kubenswrapper[4755]: E0317 00:23:42.717195 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:42 crc kubenswrapper[4755]: E0317 00:23:42.818084 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:42 crc kubenswrapper[4755]: E0317 00:23:42.919083 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:43 crc kubenswrapper[4755]: E0317 00:23:43.019190 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:43 crc kubenswrapper[4755]: E0317 00:23:43.119666 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:43 crc kubenswrapper[4755]: E0317 00:23:43.220937 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:43 crc kubenswrapper[4755]: E0317 00:23:43.274925 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="error 
getting node \"crc\": node \"crc\" not found" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.281695 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.281730 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.281742 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.281760 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.281776 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:43Z","lastTransitionTime":"2026-03-17T00:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:43 crc kubenswrapper[4755]: E0317 00:23:43.299658 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.304325 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.304389 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.304408 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.304475 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.304501 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:43Z","lastTransitionTime":"2026-03-17T00:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:43 crc kubenswrapper[4755]: E0317 00:23:43.320420 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.325171 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.325262 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.325281 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.325305 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.325322 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:43Z","lastTransitionTime":"2026-03-17T00:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:43 crc kubenswrapper[4755]: E0317 00:23:43.337297 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.341425 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.341507 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.341526 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.341550 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:43 crc kubenswrapper[4755]: I0317 00:23:43.341568 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:43Z","lastTransitionTime":"2026-03-17T00:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:43 crc kubenswrapper[4755]: E0317 00:23:43.356471 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:43 crc kubenswrapper[4755]: E0317 00:23:43.356593 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 00:23:43 crc kubenswrapper[4755]: E0317 00:23:43.356618 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:43 crc kubenswrapper[4755]: E0317 00:23:43.457672 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:43 crc kubenswrapper[4755]: E0317 00:23:43.558509 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:43 crc kubenswrapper[4755]: E0317 00:23:43.659380 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:43 crc kubenswrapper[4755]: E0317 00:23:43.759900 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:43 crc kubenswrapper[4755]: E0317 00:23:43.860421 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:43 crc kubenswrapper[4755]: E0317 00:23:43.961153 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:44 crc kubenswrapper[4755]: E0317 00:23:44.062155 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:44 crc kubenswrapper[4755]: E0317 00:23:44.162369 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:44 crc kubenswrapper[4755]: E0317 00:23:44.263053 4755 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:44 crc kubenswrapper[4755]: E0317 00:23:44.363830 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:44 crc kubenswrapper[4755]: E0317 00:23:44.465064 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:44 crc kubenswrapper[4755]: E0317 00:23:44.565615 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:44 crc kubenswrapper[4755]: E0317 00:23:44.666685 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:44 crc kubenswrapper[4755]: E0317 00:23:44.767139 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:44 crc kubenswrapper[4755]: E0317 00:23:44.868253 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:44 crc kubenswrapper[4755]: E0317 00:23:44.968643 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:45 crc kubenswrapper[4755]: E0317 00:23:45.069674 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:45 crc kubenswrapper[4755]: E0317 00:23:45.170882 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:45 crc kubenswrapper[4755]: E0317 00:23:45.271108 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:45 crc kubenswrapper[4755]: E0317 00:23:45.371982 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:45 crc 
kubenswrapper[4755]: E0317 00:23:45.472490 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:45 crc kubenswrapper[4755]: E0317 00:23:45.573263 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:45 crc kubenswrapper[4755]: E0317 00:23:45.674135 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:45 crc kubenswrapper[4755]: E0317 00:23:45.774351 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:45 crc kubenswrapper[4755]: E0317 00:23:45.875509 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:45 crc kubenswrapper[4755]: E0317 00:23:45.976342 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:46 crc kubenswrapper[4755]: I0317 00:23:46.036826 4755 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 17 00:23:46 crc kubenswrapper[4755]: E0317 00:23:46.076897 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:46 crc kubenswrapper[4755]: E0317 00:23:46.177742 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:46 crc kubenswrapper[4755]: E0317 00:23:46.278327 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:46 crc kubenswrapper[4755]: E0317 00:23:46.324865 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 00:23:46 crc kubenswrapper[4755]: E0317 00:23:46.379013 4755 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 17 00:23:46 crc kubenswrapper[4755]: E0317 00:23:46.479921 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:46 crc kubenswrapper[4755]: E0317 00:23:46.580101 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:46 crc kubenswrapper[4755]: E0317 00:23:46.680313 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:46 crc kubenswrapper[4755]: E0317 00:23:46.781365 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:46 crc kubenswrapper[4755]: E0317 00:23:46.882395 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:46 crc kubenswrapper[4755]: E0317 00:23:46.983537 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:47 crc kubenswrapper[4755]: E0317 00:23:47.084353 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:47 crc kubenswrapper[4755]: E0317 00:23:47.185589 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:47 crc kubenswrapper[4755]: E0317 00:23:47.285702 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:47 crc kubenswrapper[4755]: E0317 00:23:47.386779 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:47 crc kubenswrapper[4755]: E0317 00:23:47.486964 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:47 crc kubenswrapper[4755]: E0317 00:23:47.587703 4755 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:47 crc kubenswrapper[4755]: E0317 00:23:47.688288 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:47 crc kubenswrapper[4755]: E0317 00:23:47.789341 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:47 crc kubenswrapper[4755]: E0317 00:23:47.890296 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:47 crc kubenswrapper[4755]: E0317 00:23:47.990524 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:48 crc kubenswrapper[4755]: E0317 00:23:48.091689 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:48 crc kubenswrapper[4755]: E0317 00:23:48.192792 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:48 crc kubenswrapper[4755]: E0317 00:23:48.292872 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:48 crc kubenswrapper[4755]: E0317 00:23:48.393360 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:48 crc kubenswrapper[4755]: E0317 00:23:48.494513 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:48 crc kubenswrapper[4755]: E0317 00:23:48.595163 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:48 crc kubenswrapper[4755]: E0317 00:23:48.695935 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:48 crc 
kubenswrapper[4755]: E0317 00:23:48.796874 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:48 crc kubenswrapper[4755]: E0317 00:23:48.897043 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:48 crc kubenswrapper[4755]: E0317 00:23:48.997616 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:49 crc kubenswrapper[4755]: E0317 00:23:49.098798 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:49 crc kubenswrapper[4755]: E0317 00:23:49.199813 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:49 crc kubenswrapper[4755]: E0317 00:23:49.300813 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:49 crc kubenswrapper[4755]: E0317 00:23:49.401413 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:49 crc kubenswrapper[4755]: E0317 00:23:49.502571 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:49 crc kubenswrapper[4755]: E0317 00:23:49.603504 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:49 crc kubenswrapper[4755]: E0317 00:23:49.704092 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:49 crc kubenswrapper[4755]: E0317 00:23:49.804645 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:49 crc kubenswrapper[4755]: E0317 00:23:49.905573 4755 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 17 00:23:50 crc kubenswrapper[4755]: E0317 00:23:50.006512 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:50 crc kubenswrapper[4755]: E0317 00:23:50.107400 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:50 crc kubenswrapper[4755]: E0317 00:23:50.208239 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:50 crc kubenswrapper[4755]: E0317 00:23:50.309081 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:50 crc kubenswrapper[4755]: E0317 00:23:50.410147 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:50 crc kubenswrapper[4755]: E0317 00:23:50.510491 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:50 crc kubenswrapper[4755]: E0317 00:23:50.611292 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:50 crc kubenswrapper[4755]: E0317 00:23:50.711645 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:50 crc kubenswrapper[4755]: E0317 00:23:50.812541 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:50 crc kubenswrapper[4755]: E0317 00:23:50.912888 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:51 crc kubenswrapper[4755]: E0317 00:23:51.013310 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:51 crc kubenswrapper[4755]: E0317 00:23:51.114174 4755 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:51 crc kubenswrapper[4755]: E0317 00:23:51.214375 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:51 crc kubenswrapper[4755]: E0317 00:23:51.315099 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:51 crc kubenswrapper[4755]: E0317 00:23:51.415963 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:51 crc kubenswrapper[4755]: E0317 00:23:51.516869 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:51 crc kubenswrapper[4755]: E0317 00:23:51.617892 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:51 crc kubenswrapper[4755]: E0317 00:23:51.718549 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:51 crc kubenswrapper[4755]: E0317 00:23:51.819409 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:51 crc kubenswrapper[4755]: E0317 00:23:51.920136 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:52 crc kubenswrapper[4755]: E0317 00:23:52.021061 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:52 crc kubenswrapper[4755]: E0317 00:23:52.121749 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:52 crc kubenswrapper[4755]: E0317 00:23:52.221922 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:52 crc kubenswrapper[4755]: E0317 
00:23:52.322522 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:52 crc kubenswrapper[4755]: E0317 00:23:52.422696 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:52 crc kubenswrapper[4755]: E0317 00:23:52.523477 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:52 crc kubenswrapper[4755]: E0317 00:23:52.624647 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:52 crc kubenswrapper[4755]: E0317 00:23:52.725161 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:52 crc kubenswrapper[4755]: E0317 00:23:52.826200 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:52 crc kubenswrapper[4755]: E0317 00:23:52.927357 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:53 crc kubenswrapper[4755]: E0317 00:23:53.028531 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:53 crc kubenswrapper[4755]: E0317 00:23:53.128751 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:53 crc kubenswrapper[4755]: E0317 00:23:53.229132 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:53 crc kubenswrapper[4755]: E0317 00:23:53.330179 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:53 crc kubenswrapper[4755]: E0317 00:23:53.430375 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 
00:23:53 crc kubenswrapper[4755]: E0317 00:23:53.531091 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:53 crc kubenswrapper[4755]: E0317 00:23:53.592281 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.597391 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.597482 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.597509 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.597538 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.597559 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:53Z","lastTransitionTime":"2026-03-17T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:53 crc kubenswrapper[4755]: E0317 00:23:53.614421 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.620168 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.620227 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.620246 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.620272 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.620290 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:53Z","lastTransitionTime":"2026-03-17T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:53 crc kubenswrapper[4755]: E0317 00:23:53.636295 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.641088 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.641234 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.641268 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.641296 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.641332 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:53Z","lastTransitionTime":"2026-03-17T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:53 crc kubenswrapper[4755]: E0317 00:23:53.656867 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.661796 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.661847 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.661863 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.661885 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:53 crc kubenswrapper[4755]: I0317 00:23:53.661902 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:53Z","lastTransitionTime":"2026-03-17T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:53 crc kubenswrapper[4755]: E0317 00:23:53.673579 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:53 crc kubenswrapper[4755]: E0317 00:23:53.674010 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 00:23:53 crc kubenswrapper[4755]: E0317 00:23:53.674123 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:53 crc kubenswrapper[4755]: E0317 00:23:53.774672 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:53 crc kubenswrapper[4755]: E0317 00:23:53.875554 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:53 crc kubenswrapper[4755]: E0317 00:23:53.976638 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:54 crc kubenswrapper[4755]: E0317 00:23:54.078070 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:54 crc kubenswrapper[4755]: E0317 00:23:54.179133 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:54 crc kubenswrapper[4755]: E0317 00:23:54.280171 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:54 crc kubenswrapper[4755]: E0317 00:23:54.381119 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:54 crc kubenswrapper[4755]: E0317 00:23:54.482000 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:54 crc kubenswrapper[4755]: E0317 00:23:54.582122 4755 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:54 crc kubenswrapper[4755]: E0317 00:23:54.682459 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:54 crc kubenswrapper[4755]: E0317 00:23:54.783422 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:54 crc kubenswrapper[4755]: E0317 00:23:54.884652 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:54 crc kubenswrapper[4755]: E0317 00:23:54.985488 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:55 crc kubenswrapper[4755]: E0317 00:23:55.086515 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:55 crc kubenswrapper[4755]: E0317 00:23:55.187103 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:55 crc kubenswrapper[4755]: I0317 00:23:55.247490 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:55 crc kubenswrapper[4755]: I0317 00:23:55.248984 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:55 crc kubenswrapper[4755]: I0317 00:23:55.249052 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:55 crc kubenswrapper[4755]: I0317 00:23:55.249069 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:55 crc kubenswrapper[4755]: I0317 00:23:55.250106 4755 scope.go:117] "RemoveContainer" containerID="a165024ab5ef6ccc7c4c35507372052cfaf3dfe84bac4d4cd3fbc6fbb834b87a" Mar 17 00:23:55 
crc kubenswrapper[4755]: E0317 00:23:55.287618 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:55 crc kubenswrapper[4755]: E0317 00:23:55.389189 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:55 crc kubenswrapper[4755]: E0317 00:23:55.490335 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:55 crc kubenswrapper[4755]: E0317 00:23:55.591089 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:55 crc kubenswrapper[4755]: I0317 00:23:55.655092 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 17 00:23:55 crc kubenswrapper[4755]: I0317 00:23:55.657134 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1"} Mar 17 00:23:55 crc kubenswrapper[4755]: I0317 00:23:55.657259 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:55 crc kubenswrapper[4755]: I0317 00:23:55.658076 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:55 crc kubenswrapper[4755]: I0317 00:23:55.658102 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:55 crc kubenswrapper[4755]: I0317 00:23:55.658115 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:55 crc kubenswrapper[4755]: E0317 00:23:55.691914 4755 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:55 crc kubenswrapper[4755]: E0317 00:23:55.792701 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:55 crc kubenswrapper[4755]: E0317 00:23:55.893396 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:55 crc kubenswrapper[4755]: E0317 00:23:55.994493 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:56 crc kubenswrapper[4755]: E0317 00:23:56.095105 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:56 crc kubenswrapper[4755]: E0317 00:23:56.196250 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:56 crc kubenswrapper[4755]: E0317 00:23:56.297341 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:56 crc kubenswrapper[4755]: E0317 00:23:56.325660 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 00:23:56 crc kubenswrapper[4755]: E0317 00:23:56.398176 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:56 crc kubenswrapper[4755]: E0317 00:23:56.498753 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:56 crc kubenswrapper[4755]: E0317 00:23:56.599564 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:56 crc kubenswrapper[4755]: I0317 00:23:56.662522 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 17 00:23:56 crc kubenswrapper[4755]: I0317 00:23:56.663328 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 17 00:23:56 crc kubenswrapper[4755]: I0317 00:23:56.666049 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1" exitCode=255 Mar 17 00:23:56 crc kubenswrapper[4755]: I0317 00:23:56.666086 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1"} Mar 17 00:23:56 crc kubenswrapper[4755]: I0317 00:23:56.666121 4755 scope.go:117] "RemoveContainer" containerID="a165024ab5ef6ccc7c4c35507372052cfaf3dfe84bac4d4cd3fbc6fbb834b87a" Mar 17 00:23:56 crc kubenswrapper[4755]: I0317 00:23:56.666345 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 00:23:56 crc kubenswrapper[4755]: I0317 00:23:56.667830 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:56 crc kubenswrapper[4755]: I0317 00:23:56.667862 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:56 crc kubenswrapper[4755]: I0317 00:23:56.667873 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:56 crc kubenswrapper[4755]: I0317 00:23:56.668569 4755 scope.go:117] "RemoveContainer" containerID="f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1" Mar 17 00:23:56 
crc kubenswrapper[4755]: E0317 00:23:56.668806 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:23:56 crc kubenswrapper[4755]: E0317 00:23:56.700521 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:56 crc kubenswrapper[4755]: E0317 00:23:56.801410 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:56 crc kubenswrapper[4755]: E0317 00:23:56.902469 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:57 crc kubenswrapper[4755]: E0317 00:23:57.002895 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:57 crc kubenswrapper[4755]: E0317 00:23:57.103034 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:57 crc kubenswrapper[4755]: E0317 00:23:57.203717 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:57 crc kubenswrapper[4755]: E0317 00:23:57.304678 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:57 crc kubenswrapper[4755]: E0317 00:23:57.405060 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:57 crc kubenswrapper[4755]: E0317 00:23:57.505889 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:57 
crc kubenswrapper[4755]: E0317 00:23:57.606893 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:57 crc kubenswrapper[4755]: I0317 00:23:57.671661 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 17 00:23:57 crc kubenswrapper[4755]: E0317 00:23:57.707665 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:57 crc kubenswrapper[4755]: E0317 00:23:57.808533 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:57 crc kubenswrapper[4755]: E0317 00:23:57.909619 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.010631 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.111728 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.196151 4755 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.212301 4755 apiserver.go:52] "Watching apiserver" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.214699 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.214767 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.214785 4755 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.214810 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.214828 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:58Z","lastTransitionTime":"2026-03-17T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.223288 4755 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.224711 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4v74b","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-7cncb","openshift-image-registry/node-ca-hgfhx","openshift-machine-config-operator/machine-config-daemon-bhh2x","openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw","openshift-multus/multus-additional-cni-plugins-nfp88","openshift-multus/multus-j6qtr","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-operator/iptables-alerter-4ln5h","openshift-ovn-kubernetes/ovnkube-node-mvdzt"] Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.225250 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.225328 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.225489 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.225606 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.225617 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.226144 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.226774 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.227067 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.227173 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.226904 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7cncb" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.227573 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.227587 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hgfhx" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.230814 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.232061 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.232842 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.233239 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.233394 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.233643 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.237813 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.238173 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.238560 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.238603 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.238676 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.238621 4755 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.238768 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.239291 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.239695 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.239505 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.240778 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.241187 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.241739 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.242099 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.242227 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.242303 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.242339 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.242360 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.242402 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.242503 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.242601 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.243078 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.245010 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.245568 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.246309 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.246805 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.246830 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.246935 4755 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.247111 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.247197 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.247278 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.247376 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.247615 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.247941 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.248011 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.248272 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.248505 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.265566 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.277303 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.282379 4755 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286321 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286378 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286415 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286488 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286538 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286582 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286627 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286663 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286694 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286726 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286755 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286784 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286814 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286845 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286877 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286907 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286938 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.286971 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287002 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287023 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287035 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287090 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287177 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287211 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287226 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287241 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287307 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287315 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287547 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287588 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287597 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287681 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287708 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287746 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287783 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.287811 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.288479 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.288534 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.288632 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.288726 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.288737 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.288803 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.289373 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.289406 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.289547 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.289588 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.289588 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.289683 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.289939 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.289985 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.290043 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.290059 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.290358 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.290355 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.290414 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.290565 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.290614 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.290625 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.290775 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.290776 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.290838 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.290878 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.290905 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.290940 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.290969 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.290999 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291022 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291080 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291038 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291113 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291118 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291143 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291174 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291201 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: 
\"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291230 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291258 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291292 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291324 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291348 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291376 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291391 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291409 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291461 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291492 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291523 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291547 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291574 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291603 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291632 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291657 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 
00:23:58.291687 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291728 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291762 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291794 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291815 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291840 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291869 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291891 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291918 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292037 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292072 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292100 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292133 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292156 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292183 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292213 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292236 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292266 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292296 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292322 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292351 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292379 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: 
\"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292409 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292434 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292839 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.293133 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.293170 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.294212 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.294549 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291485 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291720 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291730 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291939 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.291932 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292133 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292311 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292398 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292532 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292761 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.292849 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.293340 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.293311 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.293687 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.293697 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.294016 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.294320 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.294689 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.295857 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.296067 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.296117 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.296507 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.296835 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.296961 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.297539 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.297685 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.297754 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.297716 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.297809 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.297872 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.297876 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.297915 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.297960 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.297997 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.298015 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.298054 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.298096 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.298138 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.298180 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.298665 4755 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.298676 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.299187 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.299259 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.299737 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.299259 4755 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.299799 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.299754 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.299967 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.300246 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.300302 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.300344 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.300387 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.300456 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.300517 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.300554 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.301209 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.299663 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.300175 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.300385 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.300674 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.300974 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.301206 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.301393 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.301769 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.301731 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.301975 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.302317 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.302230 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.302601 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.302623 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.302651 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.302682 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.302788 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.302824 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.302847 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.302923 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.302985 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.303337 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.303741 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.303807 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.304209 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.304795 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.305590 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.305647 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.305852 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.305903 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.305930 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 17 00:23:58 crc 
kubenswrapper[4755]: I0317 00:23:58.305954 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.305982 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306005 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.306039 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:23:58.806014889 +0000 UTC m=+113.565467182 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306038 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306068 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306100 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306124 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306152 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306176 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306204 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306227 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306239 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306275 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306309 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306335 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306357 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306381 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306404 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306431 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306490 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306513 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306705 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306745 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306778 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306811 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306839 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306864 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306888 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306909 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306931 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.306952 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.307120 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.307146 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.307172 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.307193 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.307136 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.307214 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.308557 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.308602 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.308637 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.308677 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.308710 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.308745 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 00:23:58 crc 
kubenswrapper[4755]: I0317 00:23:58.308779 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.308812 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.308845 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.308878 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.308913 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.308947 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.308980 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309011 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309043 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309076 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309108 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 
00:23:58.309143 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309178 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309210 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309243 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309279 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309311 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309343 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309377 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309410 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309472 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309511 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309543 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309577 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309612 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309647 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309682 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309716 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 17 00:23:58 crc 
kubenswrapper[4755]: I0317 00:23:58.309750 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309787 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309826 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309861 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309894 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309926 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309957 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.309990 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310085 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-slash\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310124 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-run-k8s-cni-cncf-io\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310159 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/225816b5-e7fe-4d29-84ad-37187e904104-cni-binary-copy\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " 
pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310192 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-etc-kubernetes\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310227 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76mdp\" (UniqueName: \"kubernetes.io/projected/225816b5-e7fe-4d29-84ad-37187e904104-kube-api-access-76mdp\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310259 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-log-socket\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310292 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-var-lib-cni-multus\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310323 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-hostroot\") pod \"multus-j6qtr\" (UID: 
\"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310370 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310402 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-etc-openvswitch\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310433 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/def35a55-2212-4d8e-8040-69fdcc95e34c-hosts-file\") pod \"node-resolver-7cncb\" (UID: \"def35a55-2212-4d8e-8040-69fdcc95e34c\") " pod="openshift-dns/node-resolver-7cncb" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310494 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-run-netns\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310526 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-var-lib-kubelet\") pod 
\"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310556 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/225816b5-e7fe-4d29-84ad-37187e904104-os-release\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310589 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2de863ac-0be1-45c8-9e03-56aa0fe9a23d-mcd-auth-proxy-config\") pod \"machine-config-daemon-bhh2x\" (UID: \"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\") " pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310623 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpscc\" (UniqueName: \"kubernetes.io/projected/becc22f3-961c-4ce7-b97f-6d40e28c9373-kube-api-access-gpscc\") pod \"node-ca-hgfhx\" (UID: \"becc22f3-961c-4ce7-b97f-6d40e28c9373\") " pod="openshift-image-registry/node-ca-hgfhx" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310657 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-ovn\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310693 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-run-netns\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310724 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-node-log\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310759 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-var-lib-openvswitch\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310790 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44d329be-573d-4143-97fb-d07ed343c898-ovn-node-metrics-cert\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310823 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-ovnkube-script-lib\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310857 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-multus-cni-dir\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310978 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.310887 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-systemd-units\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311467 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2de863ac-0be1-45c8-9e03-56aa0fe9a23d-proxy-tls\") pod \"machine-config-daemon-bhh2x\" (UID: \"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\") " pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311478 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311503 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311546 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311578 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de2167ca-ad7e-47ce-bf95-cebc396df145-cni-binary-copy\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311610 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgfzg\" (UniqueName: \"kubernetes.io/projected/de2167ca-ad7e-47ce-bf95-cebc396df145-kube-api-access-zgfzg\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311644 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/225816b5-e7fe-4d29-84ad-37187e904104-system-cni-dir\") pod \"multus-additional-cni-plugins-nfp88\" 
(UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311676 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-ovnkube-config\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311682 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311717 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c32c8c1b-db30-4059-97d0-ef753de5e7e0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mpdkw\" (UID: \"c32c8c1b-db30-4059-97d0-ef753de5e7e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311762 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311796 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311803 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311829 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w46z\" (UniqueName: \"kubernetes.io/projected/def35a55-2212-4d8e-8040-69fdcc95e34c-kube-api-access-9w46z\") pod \"node-resolver-7cncb\" (UID: \"def35a55-2212-4d8e-8040-69fdcc95e34c\") " pod="openshift-dns/node-resolver-7cncb" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311863 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2de863ac-0be1-45c8-9e03-56aa0fe9a23d-rootfs\") pod \"machine-config-daemon-bhh2x\" (UID: \"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\") " pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311896 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b2pk\" (UniqueName: \"kubernetes.io/projected/f7291e3d-2994-409e-972a-59394140b3ad-kube-api-access-8b2pk\") pod \"network-metrics-daemon-4v74b\" (UID: \"f7291e3d-2994-409e-972a-59394140b3ad\") " pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 
00:23:58.311929 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6cxf\" (UniqueName: \"kubernetes.io/projected/44d329be-573d-4143-97fb-d07ed343c898-kube-api-access-v6cxf\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311937 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.311965 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.312081 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/225816b5-e7fe-4d29-84ad-37187e904104-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.312117 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/becc22f3-961c-4ce7-b97f-6d40e28c9373-host\") pod \"node-ca-hgfhx\" (UID: 
\"becc22f3-961c-4ce7-b97f-6d40e28c9373\") " pod="openshift-image-registry/node-ca-hgfhx" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.312213 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-kubelet\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.312269 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-os-release\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.312293 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c32c8c1b-db30-4059-97d0-ef753de5e7e0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mpdkw\" (UID: \"c32c8c1b-db30-4059-97d0-ef753de5e7e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.312351 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-cnibin\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.312388 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: 
"7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.312409 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.312471 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-cni-netd\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.313034 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.313941 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/225816b5-e7fe-4d29-84ad-37187e904104-cnibin\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314092 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/becc22f3-961c-4ce7-b97f-6d40e28c9373-serviceca\") pod \"node-ca-hgfhx\" (UID: \"becc22f3-961c-4ce7-b97f-6d40e28c9373\") " pod="openshift-image-registry/node-ca-hgfhx" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314120 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-var-lib-cni-bin\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314176 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314205 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314265 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314293 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-run-ovn-kubernetes\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314300 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314347 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-system-cni-dir\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314383 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-multus-socket-dir-parent\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314500 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314529 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bzw5\" (UniqueName: \"kubernetes.io/projected/c32c8c1b-db30-4059-97d0-ef753de5e7e0-kube-api-access-8bzw5\") pod \"ovnkube-control-plane-749d76644c-mpdkw\" (UID: \"c32c8c1b-db30-4059-97d0-ef753de5e7e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314586 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314618 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-cni-bin\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314640 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314663 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-env-overrides\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314690 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/de2167ca-ad7e-47ce-bf95-cebc396df145-multus-daemon-config\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314744 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-run-multus-certs\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314772 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs\") pod \"network-metrics-daemon-4v74b\" (UID: \"f7291e3d-2994-409e-972a-59394140b3ad\") " pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314520 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314578 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314525 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314755 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314731 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314883 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-systemd\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314945 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314984 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-multus-conf-dir\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.314806 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.315043 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c32c8c1b-db30-4059-97d0-ef753de5e7e0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mpdkw\" (UID: \"c32c8c1b-db30-4059-97d0-ef753de5e7e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.315120 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/225816b5-e7fe-4d29-84ad-37187e904104-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.315165 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-48586\" (UniqueName: \"kubernetes.io/projected/2de863ac-0be1-45c8-9e03-56aa0fe9a23d-kube-api-access-48586\") pod \"machine-config-daemon-bhh2x\" (UID: \"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\") " pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.315203 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-openvswitch\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.315253 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.315290 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.315770 4755 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.316352 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.315118 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.315756 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.315758 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.315962 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.316230 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.316661 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.316671 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.317291 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.317303 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.317488 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.317691 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.318848 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.319091 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.320121 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.320809 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 00:23:58.81917837 +0000 UTC m=+113.578630683 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.320867 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.320893 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.320905 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.320922 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.320933 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:58Z","lastTransitionTime":"2026-03-17T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.321003 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.321093 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 00:23:58.821067185 +0000 UTC m=+113.580519488 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.321609 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.321695 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323166 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323256 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323293 4755 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323314 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323335 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323354 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323372 4755 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323390 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323408 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323426 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323468 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323491 4755 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323510 4755 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323530 4755 reconciler_common.go:293] "Volume detached for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323548 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323564 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323582 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323598 4755 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323615 4755 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323635 4755 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323656 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") 
on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323673 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323691 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323709 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.323999 4755 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324030 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324052 4755 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324073 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" 
Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324092 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324114 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324186 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324238 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324262 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324276 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324290 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324305 4755 
reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324320 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324334 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324350 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324364 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324379 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324393 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324408 4755 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324404 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324425 4755 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324526 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324553 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324574 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324594 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 17 
00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324616 4755 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324638 4755 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324660 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324680 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324700 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324720 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324739 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324759 4755 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324777 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324814 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324835 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324856 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324874 4755 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324894 4755 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324914 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324932 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324950 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324969 4755 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.324988 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325009 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325029 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325048 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 
crc kubenswrapper[4755]: I0317 00:23:58.325066 4755 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325085 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325104 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325124 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325144 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325164 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325184 4755 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325202 4755 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325223 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325242 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325263 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325281 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325298 4755 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325316 4755 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325334 4755 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325351 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325369 4755 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325387 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325406 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325425 4755 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325685 4755 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325956 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325975 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.325993 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.326012 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.326546 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.326892 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.327475 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.331018 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.333213 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.333625 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.334286 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.334342 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.334584 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.334610 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.335167 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.335295 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.335316 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.335389 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.335413 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.335546 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 00:23:58.835517088 +0000 UTC m=+113.594969401 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.335795 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.340419 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.340618 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.340925 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.341372 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.340864 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.342113 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.342186 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.342336 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.343558 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.343744 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.342894 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.343888 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.342422 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.342990 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.343215 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.343282 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.344278 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.345044 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.344707 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.345267 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.345281 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.345273 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 00:23:58.845200714 +0000 UTC m=+113.604653037 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.345388 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.345763 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.346569 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.346961 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.349658 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.350884 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.350911 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.351176 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.351351 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.351409 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.351582 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.351900 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.352055 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.352305 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.352326 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.353103 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.352846 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.353047 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.353318 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.353694 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.353890 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.354085 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.354071 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.354910 4755 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.355419 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.355502 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.355683 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.355766 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.356084 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.356111 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.356195 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.356253 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.356270 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.356631 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.356771 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.356841 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.356945 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.357078 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.357338 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.357371 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.358038 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.358356 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.359348 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.359369 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.360525 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.360718 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.360735 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.361218 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.361663 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.368948 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.375815 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.378244 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.384673 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.388498 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.390572 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.397585 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.405517 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.412265 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.417745 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.424094 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.424249 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.424376 4755 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.424512 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.424648 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:58Z","lastTransitionTime":"2026-03-17T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.425337 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.426761 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-os-release\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.426830 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c32c8c1b-db30-4059-97d0-ef753de5e7e0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mpdkw\" (UID: \"c32c8c1b-db30-4059-97d0-ef753de5e7e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.426857 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/225816b5-e7fe-4d29-84ad-37187e904104-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.426879 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/becc22f3-961c-4ce7-b97f-6d40e28c9373-host\") pod \"node-ca-hgfhx\" (UID: \"becc22f3-961c-4ce7-b97f-6d40e28c9373\") " pod="openshift-image-registry/node-ca-hgfhx" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.426921 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-os-release\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.426938 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-kubelet\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.426983 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-kubelet\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427003 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-cnibin\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427025 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-cni-netd\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427054 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/225816b5-e7fe-4d29-84ad-37187e904104-cnibin\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427067 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-cnibin\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427068 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/becc22f3-961c-4ce7-b97f-6d40e28c9373-serviceca\") pod \"node-ca-hgfhx\" (UID: \"becc22f3-961c-4ce7-b97f-6d40e28c9373\") " pod="openshift-image-registry/node-ca-hgfhx" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427109 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-var-lib-cni-bin\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427150 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-multus-socket-dir-parent\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427185 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-run-ovn-kubernetes\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427025 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/becc22f3-961c-4ce7-b97f-6d40e28c9373-host\") pod \"node-ca-hgfhx\" (UID: \"becc22f3-961c-4ce7-b97f-6d40e28c9373\") " pod="openshift-image-registry/node-ca-hgfhx" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427205 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-system-cni-dir\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427243 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-system-cni-dir\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427242 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/de2167ca-ad7e-47ce-bf95-cebc396df145-multus-daemon-config\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427279 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-run-multus-certs\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427298 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bzw5\" (UniqueName: \"kubernetes.io/projected/c32c8c1b-db30-4059-97d0-ef753de5e7e0-kube-api-access-8bzw5\") pod \"ovnkube-control-plane-749d76644c-mpdkw\" (UID: \"c32c8c1b-db30-4059-97d0-ef753de5e7e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427325 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-cni-bin\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427385 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427408 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-env-overrides\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427469 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c32c8c1b-db30-4059-97d0-ef753de5e7e0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mpdkw\" (UID: \"c32c8c1b-db30-4059-97d0-ef753de5e7e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427494 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/225816b5-e7fe-4d29-84ad-37187e904104-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427515 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs\") pod \"network-metrics-daemon-4v74b\" (UID: \"f7291e3d-2994-409e-972a-59394140b3ad\") " pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427566 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-systemd\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427591 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-multus-conf-dir\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427735 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-48586\" (UniqueName: \"kubernetes.io/projected/2de863ac-0be1-45c8-9e03-56aa0fe9a23d-kube-api-access-48586\") pod \"machine-config-daemon-bhh2x\" (UID: \"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\") " pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427756 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-openvswitch\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427808 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-run-k8s-cni-cncf-io\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427829 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/225816b5-e7fe-4d29-84ad-37187e904104-cni-binary-copy\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427842 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c32c8c1b-db30-4059-97d0-ef753de5e7e0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mpdkw\" (UID: \"c32c8c1b-db30-4059-97d0-ef753de5e7e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427864 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/de2167ca-ad7e-47ce-bf95-cebc396df145-multus-daemon-config\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427876 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-slash\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427851 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-slash\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427910 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-cni-netd\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427953 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-var-lib-cni-multus\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.427996 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-hostroot\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428042 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-etc-kubernetes\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428075 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76mdp\" (UniqueName: \"kubernetes.io/projected/225816b5-e7fe-4d29-84ad-37187e904104-kube-api-access-76mdp\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428129 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-log-socket\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428183 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-etc-openvswitch\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428212 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/225816b5-e7fe-4d29-84ad-37187e904104-cnibin\") pod 
\"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428215 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-run-netns\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428246 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/def35a55-2212-4d8e-8040-69fdcc95e34c-hosts-file\") pod \"node-resolver-7cncb\" (UID: \"def35a55-2212-4d8e-8040-69fdcc95e34c\") " pod="openshift-dns/node-resolver-7cncb" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428249 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/225816b5-e7fe-4d29-84ad-37187e904104-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428262 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-run-netns\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428281 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-var-lib-kubelet\") pod \"multus-j6qtr\" (UID: 
\"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428292 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-var-lib-cni-multus\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428301 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/225816b5-e7fe-4d29-84ad-37187e904104-os-release\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428294 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-hostroot\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428317 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2de863ac-0be1-45c8-9e03-56aa0fe9a23d-mcd-auth-proxy-config\") pod \"machine-config-daemon-bhh2x\" (UID: \"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\") " pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428324 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-etc-kubernetes\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 
crc kubenswrapper[4755]: I0317 00:23:58.428368 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/225816b5-e7fe-4d29-84ad-37187e904104-os-release\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428381 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-var-lib-kubelet\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428422 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/def35a55-2212-4d8e-8040-69fdcc95e34c-hosts-file\") pod \"node-resolver-7cncb\" (UID: \"def35a55-2212-4d8e-8040-69fdcc95e34c\") " pod="openshift-dns/node-resolver-7cncb" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428468 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-run-netns\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428624 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/225816b5-e7fe-4d29-84ad-37187e904104-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428048 4755 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/becc22f3-961c-4ce7-b97f-6d40e28c9373-serviceca\") pod \"node-ca-hgfhx\" (UID: \"becc22f3-961c-4ce7-b97f-6d40e28c9373\") " pod="openshift-image-registry/node-ca-hgfhx" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.428862 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.428908 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs podName:f7291e3d-2994-409e-972a-59394140b3ad nodeName:}" failed. No retries permitted until 2026-03-17 00:23:58.928892614 +0000 UTC m=+113.688345017 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs") pod "network-metrics-daemon-4v74b" (UID: "f7291e3d-2994-409e-972a-59394140b3ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428911 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-systemd\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428955 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-openvswitch\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428256 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-run-netns\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429005 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-log-socket\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428988 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-run-k8s-cni-cncf-io\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429046 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-run-multus-certs\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429059 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-etc-openvswitch\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.428332 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpscc\" (UniqueName: 
\"kubernetes.io/projected/becc22f3-961c-4ce7-b97f-6d40e28c9373-kube-api-access-gpscc\") pod \"node-ca-hgfhx\" (UID: \"becc22f3-961c-4ce7-b97f-6d40e28c9373\") " pod="openshift-image-registry/node-ca-hgfhx" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429083 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-multus-conf-dir\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429110 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-ovn\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429138 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-multus-socket-dir-parent\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429140 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-run-ovn-kubernetes\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429142 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-node-log\") pod \"ovnkube-node-mvdzt\" (UID: 
\"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429089 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-host-var-lib-cni-bin\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429188 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-multus-cni-dir\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429218 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-ovn\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429220 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-systemd-units\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429256 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-var-lib-openvswitch\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc 
kubenswrapper[4755]: I0317 00:23:58.429287 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44d329be-573d-4143-97fb-d07ed343c898-ovn-node-metrics-cert\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429316 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-ovnkube-script-lib\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429348 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgfzg\" (UniqueName: \"kubernetes.io/projected/de2167ca-ad7e-47ce-bf95-cebc396df145-kube-api-access-zgfzg\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429385 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/225816b5-e7fe-4d29-84ad-37187e904104-system-cni-dir\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429511 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-node-log\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429525 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-systemd-units\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429569 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-var-lib-openvswitch\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429579 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.429262 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-cni-bin\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.430128 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-env-overrides\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.430183 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/225816b5-e7fe-4d29-84ad-37187e904104-system-cni-dir\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.430224 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2de863ac-0be1-45c8-9e03-56aa0fe9a23d-proxy-tls\") pod \"machine-config-daemon-bhh2x\" (UID: \"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\") " pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.430265 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de2167ca-ad7e-47ce-bf95-cebc396df145-multus-cni-dir\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.430292 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.430485 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de2167ca-ad7e-47ce-bf95-cebc396df145-cni-binary-copy\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.430369 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.430567 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-ovnkube-script-lib\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.430927 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/225816b5-e7fe-4d29-84ad-37187e904104-cni-binary-copy\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.430936 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2de863ac-0be1-45c8-9e03-56aa0fe9a23d-mcd-auth-proxy-config\") pod \"machine-config-daemon-bhh2x\" (UID: \"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\") " pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431213 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431252 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-ovnkube-config\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431279 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c32c8c1b-db30-4059-97d0-ef753de5e7e0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mpdkw\" (UID: \"c32c8c1b-db30-4059-97d0-ef753de5e7e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431302 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w46z\" (UniqueName: \"kubernetes.io/projected/def35a55-2212-4d8e-8040-69fdcc95e34c-kube-api-access-9w46z\") pod \"node-resolver-7cncb\" (UID: \"def35a55-2212-4d8e-8040-69fdcc95e34c\") " pod="openshift-dns/node-resolver-7cncb" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431328 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2de863ac-0be1-45c8-9e03-56aa0fe9a23d-rootfs\") pod \"machine-config-daemon-bhh2x\" (UID: \"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\") " pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431358 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b2pk\" (UniqueName: \"kubernetes.io/projected/f7291e3d-2994-409e-972a-59394140b3ad-kube-api-access-8b2pk\") pod \"network-metrics-daemon-4v74b\" (UID: \"f7291e3d-2994-409e-972a-59394140b3ad\") " pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431383 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-v6cxf\" (UniqueName: \"kubernetes.io/projected/44d329be-573d-4143-97fb-d07ed343c898-kube-api-access-v6cxf\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431527 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431545 4755 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431559 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431572 4755 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431583 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431595 4755 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431607 4755 reconciler_common.go:293] "Volume detached 
for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431619 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431630 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431642 4755 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431655 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431673 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431685 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431697 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431707 4755 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431719 4755 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431729 4755 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431748 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431759 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431770 4755 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431780 4755 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 17 
00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431791 4755 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431803 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431817 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431831 4755 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431847 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431865 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431877 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 
00:23:58.431888 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431901 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431912 4755 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431926 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431938 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431957 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431969 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431981 4755 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.431994 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432006 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432023 4755 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432035 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432047 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432058 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432069 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 17 
00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432083 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432095 4755 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432116 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432128 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432139 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432150 4755 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432161 4755 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 
00:23:58.432172 4755 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432186 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432197 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432213 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432223 4755 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432234 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432256 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432266 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432277 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432287 4755 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432298 4755 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432308 4755 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432319 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432331 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432342 4755 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc 
kubenswrapper[4755]: I0317 00:23:58.432352 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432366 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432376 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432386 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432396 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432407 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432417 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432427 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432457 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432520 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432536 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c32c8c1b-db30-4059-97d0-ef753de5e7e0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mpdkw\" (UID: \"c32c8c1b-db30-4059-97d0-ef753de5e7e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.432579 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2de863ac-0be1-45c8-9e03-56aa0fe9a23d-rootfs\") pod \"machine-config-daemon-bhh2x\" (UID: \"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\") " pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433082 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-ovnkube-config\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 
crc kubenswrapper[4755]: I0317 00:23:58.432469 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433129 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433141 4755 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433152 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433163 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433176 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433186 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433198 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433210 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433244 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433256 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433267 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433279 4755 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433291 4755 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433303 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433322 4755 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433341 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433353 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433368 4755 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433380 4755 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433392 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433404 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 
crc kubenswrapper[4755]: I0317 00:23:58.433415 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433427 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433455 4755 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433468 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.433794 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de2167ca-ad7e-47ce-bf95-cebc396df145-cni-binary-copy\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.435193 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44d329be-573d-4143-97fb-d07ed343c898-ovn-node-metrics-cert\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.435392 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c32c8c1b-db30-4059-97d0-ef753de5e7e0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mpdkw\" (UID: \"c32c8c1b-db30-4059-97d0-ef753de5e7e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.435389 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2de863ac-0be1-45c8-9e03-56aa0fe9a23d-proxy-tls\") pod \"machine-config-daemon-bhh2x\" (UID: \"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\") " pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.455855 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48586\" (UniqueName: \"kubernetes.io/projected/2de863ac-0be1-45c8-9e03-56aa0fe9a23d-kube-api-access-48586\") pod \"machine-config-daemon-bhh2x\" (UID: \"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\") " pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.477202 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpscc\" (UniqueName: \"kubernetes.io/projected/becc22f3-961c-4ce7-b97f-6d40e28c9373-kube-api-access-gpscc\") pod \"node-ca-hgfhx\" (UID: \"becc22f3-961c-4ce7-b97f-6d40e28c9373\") " pod="openshift-image-registry/node-ca-hgfhx" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.480731 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bzw5\" (UniqueName: \"kubernetes.io/projected/c32c8c1b-db30-4059-97d0-ef753de5e7e0-kube-api-access-8bzw5\") pod \"ovnkube-control-plane-749d76644c-mpdkw\" (UID: \"c32c8c1b-db30-4059-97d0-ef753de5e7e0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.480981 4755 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.487674 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76mdp\" (UniqueName: \"kubernetes.io/projected/225816b5-e7fe-4d29-84ad-37187e904104-kube-api-access-76mdp\") pod \"multus-additional-cni-plugins-nfp88\" (UID: \"225816b5-e7fe-4d29-84ad-37187e904104\") " pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.488731 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6cxf\" (UniqueName: \"kubernetes.io/projected/44d329be-573d-4143-97fb-d07ed343c898-kube-api-access-v6cxf\") pod \"ovnkube-node-mvdzt\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.489214 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w46z\" (UniqueName: \"kubernetes.io/projected/def35a55-2212-4d8e-8040-69fdcc95e34c-kube-api-access-9w46z\") pod \"node-resolver-7cncb\" (UID: \"def35a55-2212-4d8e-8040-69fdcc95e34c\") " pod="openshift-dns/node-resolver-7cncb" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.491935 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b2pk\" (UniqueName: 
\"kubernetes.io/projected/f7291e3d-2994-409e-972a-59394140b3ad-kube-api-access-8b2pk\") pod \"network-metrics-daemon-4v74b\" (UID: \"f7291e3d-2994-409e-972a-59394140b3ad\") " pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.492199 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgfzg\" (UniqueName: \"kubernetes.io/projected/de2167ca-ad7e-47ce-bf95-cebc396df145-kube-api-access-zgfzg\") pod \"multus-j6qtr\" (UID: \"de2167ca-ad7e-47ce-bf95-cebc396df145\") " pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.498014 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.526871 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.526930 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.526948 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.526973 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.526990 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:58Z","lastTransitionTime":"2026-03-17T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.552600 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.563816 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 00:23:58 crc kubenswrapper[4755]: W0317 00:23:58.566791 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-dc0ad9e874b0a6c74a77f4a73479c951d61cfd927279023a47ddc88cc55baf84 WatchSource:0}: Error finding container dc0ad9e874b0a6c74a77f4a73479c951d61cfd927279023a47ddc88cc55baf84: Status 404 returned error can't find the container with id dc0ad9e874b0a6c74a77f4a73479c951d61cfd927279023a47ddc88cc55baf84 Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.571476 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:23:58 crc kubenswrapper[4755]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 17 00:23:58 crc kubenswrapper[4755]: set -o allexport Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: source /etc/kubernetes/apiserver-url.env Mar 17 00:23:58 crc kubenswrapper[4755]: else Mar 17 00:23:58 crc 
kubenswrapper[4755]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 17 00:23:58 crc kubenswrapper[4755]: exit 1 Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 17 00:23:58 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:ni
l,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:23:58 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.573193 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.573310 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7cncb" Mar 17 00:23:58 crc kubenswrapper[4755]: W0317 00:23:58.579710 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-5a2e10c1724f7c1fa6ddc74e290eb4f2d49afdba5802f07f1e2ad5bc5ebf14b7 WatchSource:0}: Error finding container 5a2e10c1724f7c1fa6ddc74e290eb4f2d49afdba5802f07f1e2ad5bc5ebf14b7: Status 404 returned error can't find the container with id 5a2e10c1724f7c1fa6ddc74e290eb4f2d49afdba5802f07f1e2ad5bc5ebf14b7 Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.581308 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.588364 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:23:58 crc kubenswrapper[4755]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: set -o allexport Mar 17 00:23:58 crc kubenswrapper[4755]: source "/env/_master" Mar 17 00:23:58 crc kubenswrapper[4755]: set +o allexport Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 17 00:23:58 crc kubenswrapper[4755]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 17 00:23:58 crc kubenswrapper[4755]: ho_enable="--enable-hybrid-overlay" Mar 17 00:23:58 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 17 00:23:58 crc kubenswrapper[4755]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 17 00:23:58 crc kubenswrapper[4755]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 17 00:23:58 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 17 00:23:58 crc kubenswrapper[4755]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 17 00:23:58 crc kubenswrapper[4755]: --webhook-host=127.0.0.1 \ Mar 17 00:23:58 crc kubenswrapper[4755]: --webhook-port=9743 \ Mar 17 00:23:58 crc kubenswrapper[4755]: ${ho_enable} \ Mar 17 00:23:58 crc kubenswrapper[4755]: --enable-interconnect \ Mar 17 00:23:58 crc kubenswrapper[4755]: --disable-approver \ Mar 17 00:23:58 crc kubenswrapper[4755]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 17 00:23:58 crc kubenswrapper[4755]: --wait-for-kubernetes-api=200s \ Mar 17 00:23:58 crc kubenswrapper[4755]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 17 00:23:58 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 17 00:23:58 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:23:58 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:23:58 crc kubenswrapper[4755]: W0317 00:23:58.589728 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddef35a55_2212_4d8e_8040_69fdcc95e34c.slice/crio-4e0e43a9d03fb7d0914e37af97508a409e5e0afc0d367b56784f86f6245c6c32 WatchSource:0}: Error finding container 
4e0e43a9d03fb7d0914e37af97508a409e5e0afc0d367b56784f86f6245c6c32: Status 404 returned error can't find the container with id 4e0e43a9d03fb7d0914e37af97508a409e5e0afc0d367b56784f86f6245c6c32 Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.591086 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:23:58 crc kubenswrapper[4755]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: set -o allexport Mar 17 00:23:58 crc kubenswrapper[4755]: source "/env/_master" Mar 17 00:23:58 crc kubenswrapper[4755]: set +o allexport Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 17 00:23:58 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 17 00:23:58 crc kubenswrapper[4755]: --disable-webhook \ Mar 17 00:23:58 crc kubenswrapper[4755]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 17 00:23:58 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 17 00:23:58 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:23:58 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.592247 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 17 00:23:58 crc kubenswrapper[4755]: W0317 00:23:58.594691 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc32c8c1b_db30_4059_97d0_ef753de5e7e0.slice/crio-9d9145a76ff100f786c848001b027ae53934a9a14dd2e97fe886e55faabf8d97 WatchSource:0}: Error finding container 9d9145a76ff100f786c848001b027ae53934a9a14dd2e97fe886e55faabf8d97: Status 404 returned error can't find the container with id 9d9145a76ff100f786c848001b027ae53934a9a14dd2e97fe886e55faabf8d97 Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.595519 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:23:58 crc kubenswrapper[4755]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 17 00:23:58 crc kubenswrapper[4755]: set -uo pipefail Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 17 00:23:58 crc kubenswrapper[4755]: HOSTS_FILE="/etc/hosts" Mar 17 00:23:58 crc kubenswrapper[4755]: TEMP_FILE="/etc/hosts.tmp" Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: # Make a temporary file with the old hosts file's attributes. Mar 17 00:23:58 crc kubenswrapper[4755]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 17 00:23:58 crc kubenswrapper[4755]: echo "Failed to preserve hosts file. Exiting." 
Mar 17 00:23:58 crc kubenswrapper[4755]: exit 1 Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: while true; do Mar 17 00:23:58 crc kubenswrapper[4755]: declare -A svc_ips Mar 17 00:23:58 crc kubenswrapper[4755]: for svc in "${services[@]}"; do Mar 17 00:23:58 crc kubenswrapper[4755]: # Fetch service IP from cluster dns if present. We make several tries Mar 17 00:23:58 crc kubenswrapper[4755]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 17 00:23:58 crc kubenswrapper[4755]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 17 00:23:58 crc kubenswrapper[4755]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 17 00:23:58 crc kubenswrapper[4755]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 00:23:58 crc kubenswrapper[4755]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 00:23:58 crc kubenswrapper[4755]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 00:23:58 crc kubenswrapper[4755]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 17 00:23:58 crc kubenswrapper[4755]: for i in ${!cmds[*]} Mar 17 00:23:58 crc kubenswrapper[4755]: do Mar 17 00:23:58 crc kubenswrapper[4755]: ips=($(eval "${cmds[i]}")) Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: svc_ips["${svc}"]="${ips[@]}" Mar 17 00:23:58 crc kubenswrapper[4755]: break Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: # Update /etc/hosts only if we get valid service IPs Mar 17 00:23:58 crc kubenswrapper[4755]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 17 00:23:58 crc kubenswrapper[4755]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 17 00:23:58 crc kubenswrapper[4755]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 17 00:23:58 crc kubenswrapper[4755]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 17 00:23:58 crc kubenswrapper[4755]: sleep 60 & wait Mar 17 00:23:58 crc kubenswrapper[4755]: continue Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: # Append resolver entries for services Mar 17 00:23:58 crc kubenswrapper[4755]: rc=0 Mar 17 00:23:58 crc kubenswrapper[4755]: for svc in "${!svc_ips[@]}"; do Mar 17 00:23:58 crc kubenswrapper[4755]: for ip in ${svc_ips[${svc}]}; do Mar 17 00:23:58 crc kubenswrapper[4755]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ $rc -ne 0 ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: sleep 60 & wait Mar 17 00:23:58 crc kubenswrapper[4755]: continue Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 17 00:23:58 crc kubenswrapper[4755]: # Replace /etc/hosts with our modified version if needed Mar 17 00:23:58 crc kubenswrapper[4755]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 17 00:23:58 crc kubenswrapper[4755]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: sleep 60 & wait Mar 17 00:23:58 crc kubenswrapper[4755]: unset svc_ips Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9w46z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-7cncb_openshift-dns(def35a55-2212-4d8e-8040-69fdcc95e34c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:23:58 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.596600 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-7cncb" podUID="def35a55-2212-4d8e-8040-69fdcc95e34c" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.597832 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:23:58 crc kubenswrapper[4755]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 17 00:23:58 crc 
kubenswrapper[4755]: set -euo pipefail Mar 17 00:23:58 crc kubenswrapper[4755]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 17 00:23:58 crc kubenswrapper[4755]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 17 00:23:58 crc kubenswrapper[4755]: # As the secret mount is optional we must wait for the files to be present. Mar 17 00:23:58 crc kubenswrapper[4755]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 17 00:23:58 crc kubenswrapper[4755]: TS=$(date +%s) Mar 17 00:23:58 crc kubenswrapper[4755]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 17 00:23:58 crc kubenswrapper[4755]: HAS_LOGGED_INFO=0 Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: log_missing_certs(){ Mar 17 00:23:58 crc kubenswrapper[4755]: CUR_TS=$(date +%s) Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "${CUR_TS}" -gt "${WARN_TS}" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 17 00:23:58 crc kubenswrapper[4755]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 17 00:23:58 crc kubenswrapper[4755]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 17 00:23:58 crc kubenswrapper[4755]: HAS_LOGGED_INFO=1 Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: } Mar 17 00:23:58 crc kubenswrapper[4755]: while [[ ! -f "${TLS_PK}" || !
-f "${TLS_CERT}" ]] ; do Mar 17 00:23:58 crc kubenswrapper[4755]: log_missing_certs Mar 17 00:23:58 crc kubenswrapper[4755]: sleep 5 Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 17 00:23:58 crc kubenswrapper[4755]: exec /usr/bin/kube-rbac-proxy \ Mar 17 00:23:58 crc kubenswrapper[4755]: --logtostderr \ Mar 17 00:23:58 crc kubenswrapper[4755]: --secure-listen-address=:9108 \ Mar 17 00:23:58 crc kubenswrapper[4755]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 17 00:23:58 crc kubenswrapper[4755]: --upstream=http://127.0.0.1:29108/ \ Mar 17 00:23:58 crc kubenswrapper[4755]: --tls-private-key-file=${TLS_PK} \ Mar 17 00:23:58 crc kubenswrapper[4755]: --tls-cert-file=${TLS_CERT} Mar 17 00:23:58 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8bzw5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-mpdkw_openshift-ovn-kubernetes(c32c8c1b-db30-4059-97d0-ef753de5e7e0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:23:58 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.599897 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.603040 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:23:58 crc kubenswrapper[4755]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: set -o allexport Mar 17 00:23:58 crc kubenswrapper[4755]: source "/env/_master" Mar 17 00:23:58 crc kubenswrapper[4755]: set +o allexport Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: ovn_v4_join_subnet_opt= Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "" != "" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: ovn_v6_join_subnet_opt= Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "" != "" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: ovn_v4_transit_switch_subnet_opt= Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "" != "" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: ovn_v6_transit_switch_subnet_opt= Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "" != "" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc 
kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: dns_name_resolver_enabled_flag= Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "false" == "true" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: persistent_ips_enabled_flag= Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "true" == "true" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: # This is needed so that converting clusters from GA to TP Mar 17 00:23:58 crc kubenswrapper[4755]: # will rollout control plane pods as well Mar 17 00:23:58 crc kubenswrapper[4755]: network_segmentation_enabled_flag= Mar 17 00:23:58 crc kubenswrapper[4755]: multi_network_enabled_flag= Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "true" == "true" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: multi_network_enabled_flag="--enable-multi-network" Mar 17 00:23:58 crc kubenswrapper[4755]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 17 00:23:58 crc kubenswrapper[4755]: exec /usr/bin/ovnkube \ Mar 17 00:23:58 crc kubenswrapper[4755]: --enable-interconnect \ Mar 17 00:23:58 crc kubenswrapper[4755]: --init-cluster-manager "${K8S_NODE}" \ Mar 17 00:23:58 crc kubenswrapper[4755]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 17 00:23:58 crc kubenswrapper[4755]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 17 00:23:58 crc kubenswrapper[4755]: --metrics-bind-address "127.0.0.1:29108" \ Mar 17 00:23:58 crc 
kubenswrapper[4755]: --metrics-enable-pprof \ Mar 17 00:23:58 crc kubenswrapper[4755]: --metrics-enable-config-duration \ Mar 17 00:23:58 crc kubenswrapper[4755]: ${ovn_v4_join_subnet_opt} \ Mar 17 00:23:58 crc kubenswrapper[4755]: ${ovn_v6_join_subnet_opt} \ Mar 17 00:23:58 crc kubenswrapper[4755]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 17 00:23:58 crc kubenswrapper[4755]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 17 00:23:58 crc kubenswrapper[4755]: ${dns_name_resolver_enabled_flag} \ Mar 17 00:23:58 crc kubenswrapper[4755]: ${persistent_ips_enabled_flag} \ Mar 17 00:23:58 crc kubenswrapper[4755]: ${multi_network_enabled_flag} \ Mar 17 00:23:58 crc kubenswrapper[4755]: ${network_segmentation_enabled_flag} Mar 17 00:23:58 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8bzw5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-mpdkw_openshift-ovn-kubernetes(c32c8c1b-db30-4059-97d0-ef753de5e7e0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:23:58 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.604625 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" podUID="c32c8c1b-db30-4059-97d0-ef753de5e7e0" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.613390 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-j6qtr" Mar 17 00:23:58 crc kubenswrapper[4755]: W0317 00:23:58.620142 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-ef52a6b2d30d2d3a921ec379e3541471a7a83074b9250f44b46113999ad08c05 WatchSource:0}: Error finding container ef52a6b2d30d2d3a921ec379e3541471a7a83074b9250f44b46113999ad08c05: Status 404 returned error can't find the container with id ef52a6b2d30d2d3a921ec379e3541471a7a83074b9250f44b46113999ad08c05 Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.622047 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hgfhx" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.625071 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.626324 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.629563 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.629593 
4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.629606 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.629620 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.629632 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:58Z","lastTransitionTime":"2026-03-17T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.634460 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:23:58 crc kubenswrapper[4755]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 17 00:23:58 crc kubenswrapper[4755]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 17 00:23:58 crc kubenswrapper[4755]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgfzg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-j6qtr_openshift-multus(de2167ca-ad7e-47ce-bf95-cebc396df145): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:23:58 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.635736 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-j6qtr" podUID="de2167ca-ad7e-47ce-bf95-cebc396df145" Mar 17 00:23:58 crc kubenswrapper[4755]: W0317 00:23:58.643270 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbecc22f3_961c_4ce7_b97f_6d40e28c9373.slice/crio-a883b2395a46bb27fc1a713c6f77e635e1863e72f87d555cf984eba0a6c8dd81 WatchSource:0}: Error finding container a883b2395a46bb27fc1a713c6f77e635e1863e72f87d555cf984eba0a6c8dd81: Status 404 returned error can't find the container with id a883b2395a46bb27fc1a713c6f77e635e1863e72f87d555cf984eba0a6c8dd81 Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.645614 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:23:58 crc kubenswrapper[4755]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 17 00:23:58 crc kubenswrapper[4755]: while [ true ]; Mar 17 00:23:58 crc kubenswrapper[4755]: do Mar 17 00:23:58 crc kubenswrapper[4755]: for f in $(ls /tmp/serviceca); do Mar 17 00:23:58 crc kubenswrapper[4755]: echo $f Mar 17 00:23:58 
crc kubenswrapper[4755]: ca_file_path="/tmp/serviceca/${f}" Mar 17 00:23:58 crc kubenswrapper[4755]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 17 00:23:58 crc kubenswrapper[4755]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 17 00:23:58 crc kubenswrapper[4755]: if [ -e "${reg_dir_path}" ]; then Mar 17 00:23:58 crc kubenswrapper[4755]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 17 00:23:58 crc kubenswrapper[4755]: else Mar 17 00:23:58 crc kubenswrapper[4755]: mkdir $reg_dir_path Mar 17 00:23:58 crc kubenswrapper[4755]: cp $ca_file_path $reg_dir_path/ca.crt Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: for d in $(ls /etc/docker/certs.d); do Mar 17 00:23:58 crc kubenswrapper[4755]: echo $d Mar 17 00:23:58 crc kubenswrapper[4755]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 17 00:23:58 crc kubenswrapper[4755]: reg_conf_path="/tmp/serviceca/${dp}" Mar 17 00:23:58 crc kubenswrapper[4755]: if [ ! -e "${reg_conf_path}" ]; then Mar 17 00:23:58 crc kubenswrapper[4755]: rm -rf /etc/docker/certs.d/$d Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: sleep 60 & wait ${!} Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gpscc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-hgfhx_openshift-image-registry(becc22f3-961c-4ce7-b97f-6d40e28c9373): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:23:58 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.646763 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-hgfhx" podUID="becc22f3-961c-4ce7-b97f-6d40e28c9373" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.664116 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.669154 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.673424 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nfp88" Mar 17 00:23:58 crc kubenswrapper[4755]: W0317 00:23:58.676783 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2de863ac_0be1_45c8_9e03_56aa0fe9a23d.slice/crio-0a2ec6e9f427eb1257e97a4f50e5dea5487152d70f643ee245203e3a414968ec WatchSource:0}: Error finding container 0a2ec6e9f427eb1257e97a4f50e5dea5487152d70f643ee245203e3a414968ec: Status 404 returned error can't find the container with id 0a2ec6e9f427eb1257e97a4f50e5dea5487152d70f643ee245203e3a414968ec Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.679130 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" event={"ID":"c32c8c1b-db30-4059-97d0-ef753de5e7e0","Type":"ContainerStarted","Data":"9d9145a76ff100f786c848001b027ae53934a9a14dd2e97fe886e55faabf8d97"} Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.680374 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5a2e10c1724f7c1fa6ddc74e290eb4f2d49afdba5802f07f1e2ad5bc5ebf14b7"} Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.681860 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hgfhx" 
event={"ID":"becc22f3-961c-4ce7-b97f-6d40e28c9373","Type":"ContainerStarted","Data":"a883b2395a46bb27fc1a713c6f77e635e1863e72f87d555cf984eba0a6c8dd81"} Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.683067 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:23:58 crc kubenswrapper[4755]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 17 00:23:58 crc kubenswrapper[4755]: set -euo pipefail Mar 17 00:23:58 crc kubenswrapper[4755]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 17 00:23:58 crc kubenswrapper[4755]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 17 00:23:58 crc kubenswrapper[4755]: # As the secret mount is optional we must wait for the files to be present. Mar 17 00:23:58 crc kubenswrapper[4755]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 17 00:23:58 crc kubenswrapper[4755]: TS=$(date +%s) Mar 17 00:23:58 crc kubenswrapper[4755]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 17 00:23:58 crc kubenswrapper[4755]: HAS_LOGGED_INFO=0 Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: log_missing_certs(){ Mar 17 00:23:58 crc kubenswrapper[4755]: CUR_TS=$(date +%s) Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "${CUR_TS}" -gt "${WARN_TS}" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 17 00:23:58 crc kubenswrapper[4755]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 17 00:23:58 crc kubenswrapper[4755]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 17 00:23:58 crc kubenswrapper[4755]: HAS_LOGGED_INFO=1 Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: } Mar 17 00:23:58 crc kubenswrapper[4755]: while [[ ! -f "${TLS_PK}" || !
-f "${TLS_CERT}" ]] ; do Mar 17 00:23:58 crc kubenswrapper[4755]: log_missing_certs Mar 17 00:23:58 crc kubenswrapper[4755]: sleep 5 Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 17 00:23:58 crc kubenswrapper[4755]: exec /usr/bin/kube-rbac-proxy \ Mar 17 00:23:58 crc kubenswrapper[4755]: --logtostderr \ Mar 17 00:23:58 crc kubenswrapper[4755]: --secure-listen-address=:9108 \ Mar 17 00:23:58 crc kubenswrapper[4755]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 17 00:23:58 crc kubenswrapper[4755]: --upstream=http://127.0.0.1:29108/ \ Mar 17 00:23:58 crc kubenswrapper[4755]: --tls-private-key-file=${TLS_PK} \ Mar 17 00:23:58 crc kubenswrapper[4755]: --tls-cert-file=${TLS_CERT} Mar 17 00:23:58 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8bzw5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-mpdkw_openshift-ovn-kubernetes(c32c8c1b-db30-4059-97d0-ef753de5e7e0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:23:58 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.683063 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-48586,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.683185 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j6qtr" event={"ID":"de2167ca-ad7e-47ce-bf95-cebc396df145","Type":"ContainerStarted","Data":"affc2663b83e842f0f28116bf0e5040839afea8d2f9a013dfbba42914883644e"} Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.684968 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7cncb" 
event={"ID":"def35a55-2212-4d8e-8040-69fdcc95e34c","Type":"ContainerStarted","Data":"4e0e43a9d03fb7d0914e37af97508a409e5e0afc0d367b56784f86f6245c6c32"} Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.685083 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:23:58 crc kubenswrapper[4755]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: set -o allexport Mar 17 00:23:58 crc kubenswrapper[4755]: source "/env/_master" Mar 17 00:23:58 crc kubenswrapper[4755]: set +o allexport Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 17 00:23:58 crc kubenswrapper[4755]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 17 00:23:58 crc kubenswrapper[4755]: ho_enable="--enable-hybrid-overlay" Mar 17 00:23:58 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 17 00:23:58 crc kubenswrapper[4755]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 17 00:23:58 crc kubenswrapper[4755]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 17 00:23:58 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 17 00:23:58 crc kubenswrapper[4755]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 17 00:23:58 crc kubenswrapper[4755]: --webhook-host=127.0.0.1 \ Mar 17 00:23:58 crc kubenswrapper[4755]: --webhook-port=9743 \ Mar 17 00:23:58 crc kubenswrapper[4755]: ${ho_enable} \ Mar 17 00:23:58 crc kubenswrapper[4755]: --enable-interconnect \ 
Mar 17 00:23:58 crc kubenswrapper[4755]: --disable-approver \ Mar 17 00:23:58 crc kubenswrapper[4755]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 17 00:23:58 crc kubenswrapper[4755]: --wait-for-kubernetes-api=200s \ Mar 17 00:23:58 crc kubenswrapper[4755]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 17 00:23:58 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 17 00:23:58 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMoun
t:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:23:58 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.685893 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:23:58 crc kubenswrapper[4755]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 17 00:23:58 crc kubenswrapper[4755]: set -uo pipefail Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 17 00:23:58 crc kubenswrapper[4755]: HOSTS_FILE="/etc/hosts" Mar 17 00:23:58 crc kubenswrapper[4755]: TEMP_FILE="/etc/hosts.tmp" Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: # Make a temporary file with the old hosts file's attributes. Mar 17 00:23:58 crc kubenswrapper[4755]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 17 00:23:58 crc kubenswrapper[4755]: echo "Failed to preserve hosts file. Exiting." 
Mar 17 00:23:58 crc kubenswrapper[4755]: exit 1 Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: while true; do Mar 17 00:23:58 crc kubenswrapper[4755]: declare -A svc_ips Mar 17 00:23:58 crc kubenswrapper[4755]: for svc in "${services[@]}"; do Mar 17 00:23:58 crc kubenswrapper[4755]: # Fetch service IP from cluster dns if present. We make several tries Mar 17 00:23:58 crc kubenswrapper[4755]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 17 00:23:58 crc kubenswrapper[4755]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 17 00:23:58 crc kubenswrapper[4755]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 17 00:23:58 crc kubenswrapper[4755]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 00:23:58 crc kubenswrapper[4755]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 00:23:58 crc kubenswrapper[4755]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 00:23:58 crc kubenswrapper[4755]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 17 00:23:58 crc kubenswrapper[4755]: for i in ${!cmds[*]} Mar 17 00:23:58 crc kubenswrapper[4755]: do Mar 17 00:23:58 crc kubenswrapper[4755]: ips=($(eval "${cmds[i]}")) Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: svc_ips["${svc}"]="${ips[@]}" Mar 17 00:23:58 crc kubenswrapper[4755]: break Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: # Update /etc/hosts only if we get valid service IPs Mar 17 00:23:58 crc kubenswrapper[4755]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 17 00:23:58 crc kubenswrapper[4755]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 17 00:23:58 crc kubenswrapper[4755]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 17 00:23:58 crc kubenswrapper[4755]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 17 00:23:58 crc kubenswrapper[4755]: sleep 60 & wait Mar 17 00:23:58 crc kubenswrapper[4755]: continue Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: # Append resolver entries for services Mar 17 00:23:58 crc kubenswrapper[4755]: rc=0 Mar 17 00:23:58 crc kubenswrapper[4755]: for svc in "${!svc_ips[@]}"; do Mar 17 00:23:58 crc kubenswrapper[4755]: for ip in ${svc_ips[${svc}]}; do Mar 17 00:23:58 crc kubenswrapper[4755]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ $rc -ne 0 ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: sleep 60 & wait Mar 17 00:23:58 crc kubenswrapper[4755]: continue Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 17 00:23:58 crc kubenswrapper[4755]: # Replace /etc/hosts with our modified version if needed Mar 17 00:23:58 crc kubenswrapper[4755]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 17 00:23:58 crc kubenswrapper[4755]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: sleep 60 & wait Mar 17 00:23:58 crc kubenswrapper[4755]: unset svc_ips Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9w46z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-7cncb_openshift-dns(def35a55-2212-4d8e-8040-69fdcc95e34c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:23:58 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.685948 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:23:58 crc kubenswrapper[4755]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 17 00:23:58 crc kubenswrapper[4755]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 17 00:23:58 crc kubenswrapper[4755]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgfzg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-j6qtr_openshift-multus(de2167ca-ad7e-47ce-bf95-cebc396df145): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:23:58 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.685991 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:23:58 crc kubenswrapper[4755]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 17 00:23:58 crc kubenswrapper[4755]: while [ true ]; Mar 17 00:23:58 crc kubenswrapper[4755]: do Mar 17 00:23:58 crc kubenswrapper[4755]: for f in $(ls /tmp/serviceca); do Mar 17 00:23:58 crc kubenswrapper[4755]: echo $f Mar 17 00:23:58 crc kubenswrapper[4755]: ca_file_path="/tmp/serviceca/${f}" Mar 17 00:23:58 crc kubenswrapper[4755]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 17 00:23:58 crc kubenswrapper[4755]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 17 00:23:58 crc kubenswrapper[4755]: if [ -e "${reg_dir_path}" ]; then Mar 17 00:23:58 crc kubenswrapper[4755]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 17 00:23:58 crc kubenswrapper[4755]: else Mar 17 00:23:58 crc kubenswrapper[4755]: mkdir $reg_dir_path Mar 17 00:23:58 crc kubenswrapper[4755]: cp $ca_file_path $reg_dir_path/ca.crt Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: for d in $(ls /etc/docker/certs.d); do Mar 17 00:23:58 crc kubenswrapper[4755]: echo $d Mar 17 00:23:58 crc kubenswrapper[4755]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 17 00:23:58 crc kubenswrapper[4755]: 
reg_conf_path="/tmp/serviceca/${dp}" Mar 17 00:23:58 crc kubenswrapper[4755]: if [ ! -e "${reg_conf_path}" ]; then Mar 17 00:23:58 crc kubenswrapper[4755]: rm -rf /etc/docker/certs.d/$d Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: sleep 60 & wait ${!} Mar 17 00:23:58 crc kubenswrapper[4755]: done Mar 17 00:23:58 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gpscc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-hgfhx_openshift-image-registry(becc22f3-961c-4ce7-b97f-6d40e28c9373): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:23:58 crc kubenswrapper[4755]: > 
logger="UnhandledError" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.686218 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:23:58 crc kubenswrapper[4755]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: set -o allexport Mar 17 00:23:58 crc kubenswrapper[4755]: source "/env/_master" Mar 17 00:23:58 crc kubenswrapper[4755]: set +o allexport Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: ovn_v4_join_subnet_opt= Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "" != "" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: ovn_v6_join_subnet_opt= Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "" != "" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: ovn_v4_transit_switch_subnet_opt= Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "" != "" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: ovn_v6_transit_switch_subnet_opt= Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "" != "" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: 
dns_name_resolver_enabled_flag= Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "false" == "true" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: persistent_ips_enabled_flag= Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "true" == "true" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: # This is needed so that converting clusters from GA to TP Mar 17 00:23:58 crc kubenswrapper[4755]: # will rollout control plane pods as well Mar 17 00:23:58 crc kubenswrapper[4755]: network_segmentation_enabled_flag= Mar 17 00:23:58 crc kubenswrapper[4755]: multi_network_enabled_flag= Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ "true" == "true" ]]; then Mar 17 00:23:58 crc kubenswrapper[4755]: multi_network_enabled_flag="--enable-multi-network" Mar 17 00:23:58 crc kubenswrapper[4755]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 17 00:23:58 crc kubenswrapper[4755]: fi Mar 17 00:23:58 crc kubenswrapper[4755]: Mar 17 00:23:58 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 17 00:23:58 crc kubenswrapper[4755]: exec /usr/bin/ovnkube \ Mar 17 00:23:58 crc kubenswrapper[4755]: --enable-interconnect \ Mar 17 00:23:58 crc kubenswrapper[4755]: --init-cluster-manager "${K8S_NODE}" \ Mar 17 00:23:58 crc kubenswrapper[4755]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 17 00:23:58 crc kubenswrapper[4755]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 17 00:23:58 crc kubenswrapper[4755]: --metrics-bind-address "127.0.0.1:29108" \ Mar 17 00:23:58 crc kubenswrapper[4755]: --metrics-enable-pprof \ Mar 17 00:23:58 
crc kubenswrapper[4755]: --metrics-enable-config-duration \
Mar 17 00:23:58 crc kubenswrapper[4755]: ${ovn_v4_join_subnet_opt} \
Mar 17 00:23:58 crc kubenswrapper[4755]: ${ovn_v6_join_subnet_opt} \
Mar 17 00:23:58 crc kubenswrapper[4755]: ${ovn_v4_transit_switch_subnet_opt} \
Mar 17 00:23:58 crc kubenswrapper[4755]: ${ovn_v6_transit_switch_subnet_opt} \
Mar 17 00:23:58 crc kubenswrapper[4755]: ${dns_name_resolver_enabled_flag} \
Mar 17 00:23:58 crc kubenswrapper[4755]: ${persistent_ips_enabled_flag} \
Mar 17 00:23:58 crc kubenswrapper[4755]: ${multi_network_enabled_flag} \
Mar 17 00:23:58 crc kubenswrapper[4755]: ${network_segmentation_enabled_flag}
Mar 17 00:23:58 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8bzw5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-mpdkw_openshift-ovn-kubernetes(c32c8c1b-db30-4059-97d0-ef753de5e7e0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 17 00:23:58 crc kubenswrapper[4755]: > logger="UnhandledError"
Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.686306 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-48586,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.686554 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 17 00:23:58 crc kubenswrapper[4755]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig
Mar 17 00:23:58 crc kubenswrapper[4755]: apiVersion: v1
Mar 17 00:23:58 crc kubenswrapper[4755]: clusters:
Mar 17 00:23:58 crc kubenswrapper[4755]: - cluster:
Mar 17 00:23:58 crc kubenswrapper[4755]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
Mar 17 00:23:58 crc kubenswrapper[4755]: server: https://api-int.crc.testing:6443
Mar 17 00:23:58 crc kubenswrapper[4755]: name: default-cluster
Mar 17 00:23:58 crc kubenswrapper[4755]: contexts:
Mar 17 00:23:58 crc kubenswrapper[4755]: - context:
Mar 17 00:23:58 crc kubenswrapper[4755]: cluster: default-cluster
Mar 17 00:23:58 crc kubenswrapper[4755]: namespace: default
Mar 17 00:23:58 crc kubenswrapper[4755]: user: default-auth
Mar 17 00:23:58 crc kubenswrapper[4755]: name: default-context
Mar 17 00:23:58 crc kubenswrapper[4755]: current-context: default-context
Mar 17 00:23:58 crc kubenswrapper[4755]: kind: Config
Mar 17 00:23:58 crc kubenswrapper[4755]: preferences: {}
Mar 17 00:23:58 crc kubenswrapper[4755]: users:
Mar 17 00:23:58 crc kubenswrapper[4755]: - name: default-auth
Mar 17 00:23:58 crc kubenswrapper[4755]: user:
Mar 17 00:23:58 crc kubenswrapper[4755]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem
Mar 17 00:23:58 crc kubenswrapper[4755]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem
Mar 17 00:23:58 crc kubenswrapper[4755]: EOF
Mar 17 00:23:58 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v6cxf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-mvdzt_openshift-ovn-kubernetes(44d329be-573d-4143-97fb-d07ed343c898): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 17 00:23:58 crc kubenswrapper[4755]: > logger="UnhandledError"
Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.687074 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"dc0ad9e874b0a6c74a77f4a73479c951d61cfd927279023a47ddc88cc55baf84"}
Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.687839 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" podUID="44d329be-573d-4143-97fb-d07ed343c898"
Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.688107 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d"
Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.688157 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" podUID="c32c8c1b-db30-4059-97d0-ef753de5e7e0"
Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.688177 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-j6qtr" podUID="de2167ca-ad7e-47ce-bf95-cebc396df145"
Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.688214 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-7cncb" podUID="def35a55-2212-4d8e-8040-69fdcc95e34c"
Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.688416 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-hgfhx" podUID="becc22f3-961c-4ce7-b97f-6d40e28c9373"
Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.688473 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 17 00:23:58 crc kubenswrapper[4755]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then
Mar 17 00:23:58 crc kubenswrapper[4755]: set -o allexport
Mar 17 00:23:58 crc kubenswrapper[4755]: source "/env/_master"
Mar 17 00:23:58 crc kubenswrapper[4755]: set +o allexport
Mar 17 00:23:58 crc kubenswrapper[4755]: fi
Mar 17 00:23:58 crc kubenswrapper[4755]:
Mar 17 00:23:58 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver"
Mar 17 00:23:58 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 17 00:23:58 crc kubenswrapper[4755]: --disable-webhook \
Mar 17 00:23:58 crc kubenswrapper[4755]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \
Mar 17 00:23:58 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}"
Mar 17 00:23:58 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 17 00:23:58 crc kubenswrapper[4755]: > logger="UnhandledError"
Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.689099 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 17 00:23:58 crc kubenswrapper[4755]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash
Mar 17 00:23:58 crc kubenswrapper[4755]: set -o allexport
Mar 17 00:23:58 crc kubenswrapper[4755]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then
Mar 17 00:23:58 crc kubenswrapper[4755]: source /etc/kubernetes/apiserver-url.env
Mar 17 00:23:58 crc kubenswrapper[4755]: else
Mar 17 00:23:58 crc kubenswrapper[4755]: echo "Error: /etc/kubernetes/apiserver-url.env is missing"
Mar 17 00:23:58 crc kubenswrapper[4755]: exit 1
Mar 17 00:23:58 crc kubenswrapper[4755]: fi
Mar 17 00:23:58 crc kubenswrapper[4755]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104
Mar 17 00:23:58 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 17 00:23:58 crc kubenswrapper[4755]: > logger="UnhandledError"
Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.690213 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312"
Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.690672 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ef52a6b2d30d2d3a921ec379e3541471a7a83074b9250f44b46113999ad08c05"}
Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.690848 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d"
Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.693868 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.694486 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 17 00:23:58 crc kubenswrapper[4755]: W0317 00:23:58.695624 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod225816b5_e7fe_4d29_84ad_37187e904104.slice/crio-4de3039f6f400ee9dfa21f56bcc33ea7c84200c798f2eaddddf38e73eef48069 WatchSource:0}: Error finding container 4de3039f6f400ee9dfa21f56bcc33ea7c84200c798f2eaddddf38e73eef48069: Status 404 returned error can't find the container with id 4de3039f6f400ee9dfa21f56bcc33ea7c84200c798f2eaddddf38e73eef48069
Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.695638 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49"
Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.698136 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-76mdp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-nfp88_openshift-multus(225816b5-e7fe-4d29-84ad-37187e904104): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.699399 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-nfp88" podUID="225816b5-e7fe-4d29-84ad-37187e904104"
Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.709397 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.717768 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.729654 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.731820 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.731964 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.732065 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.732152 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.732246 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:58Z","lastTransitionTime":"2026-03-17T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.742544 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.753600 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.763119 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.773291 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.780684 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.788523 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.796584 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.816145 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.825613 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.834708 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.835492 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 
00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.835518 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.835527 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.835540 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.835548 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:58Z","lastTransitionTime":"2026-03-17T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.837417 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.837561 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:23:59.837543324 +0000 UTC m=+114.596995607 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.837628 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.837667 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.837700 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.837793 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.837836 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 00:23:59.837825722 +0000 UTC m=+114.597278005 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.837986 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.838993 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 00:23:59.838964189 +0000 UTC m=+114.598416542 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.838066 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.839160 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.839192 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.839473 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 00:23:59.839273507 +0000 UTC m=+114.598725860 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.845470 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.853503 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.862561 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.871097 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.877925 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.883431 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.895982 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.903820 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.913390 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.923622 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.932589 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.938157 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.938206 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs\") pod \"network-metrics-daemon-4v74b\" (UID: \"f7291e3d-2994-409e-972a-59394140b3ad\") " pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.938344 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.938371 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.938386 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.938348 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.938465 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.938484 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 00:23:59.938463224 +0000 UTC m=+114.697915507 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.938487 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:58 crc kubenswrapper[4755]: E0317 00:23:58.938526 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs podName:f7291e3d-2994-409e-972a-59394140b3ad nodeName:}" failed. No retries permitted until 2026-03-17 00:23:59.938501515 +0000 UTC m=+114.697953828 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs") pod "network-metrics-daemon-4v74b" (UID: "f7291e3d-2994-409e-972a-59394140b3ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.938533 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.938562 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.938574 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:58Z","lastTransitionTime":"2026-03-17T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.941070 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:58 crc kubenswrapper[4755]: I0317 00:23:58.962462 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.001148 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.041676 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.041736 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.041753 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.041776 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.041793 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:59Z","lastTransitionTime":"2026-03-17T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.145050 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.145135 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.145153 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.145179 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.145197 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:59Z","lastTransitionTime":"2026-03-17T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.248331 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.248391 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.248411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.248485 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.248525 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:59Z","lastTransitionTime":"2026-03-17T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.351737 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.351801 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.351813 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.351839 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.351853 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:59Z","lastTransitionTime":"2026-03-17T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.456079 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.456133 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.456158 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.456184 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.456202 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:59Z","lastTransitionTime":"2026-03-17T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.559914 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.559962 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.559979 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.560002 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.560018 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:59Z","lastTransitionTime":"2026-03-17T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.656480 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.663296 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.663353 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.663370 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.663391 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.663413 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:59Z","lastTransitionTime":"2026-03-17T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.674758 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.675366 4755 scope.go:117] "RemoveContainer" containerID="f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1" Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.675567 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.676913 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.686982 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.695589 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"0a2ec6e9f427eb1257e97a4f50e5dea5487152d70f643ee245203e3a414968ec"} Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.697797 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" event={"ID":"225816b5-e7fe-4d29-84ad-37187e904104","Type":"ContainerStarted","Data":"4de3039f6f400ee9dfa21f56bcc33ea7c84200c798f2eaddddf38e73eef48069"} Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.697883 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-48586,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed 
in pod machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.699411 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been 
read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.699613 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerStarted","Data":"298e6b83a9fd3017c1ae942a1bb9db3386c157608c11d666d65352005c431160"} Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.701224 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-76mdp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-nfp88_openshift-multus(225816b5-e7fe-4d29-84ad-37187e904104): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.701837 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-48586,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.702718 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-nfp88" podUID="225816b5-e7fe-4d29-84ad-37187e904104" Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.702976 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.703642 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:23:59 crc kubenswrapper[4755]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 17 00:23:59 crc kubenswrapper[4755]: apiVersion: v1 Mar 17 00:23:59 crc kubenswrapper[4755]: clusters: Mar 17 00:23:59 crc kubenswrapper[4755]: - cluster: Mar 17 00:23:59 crc kubenswrapper[4755]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 17 00:23:59 crc kubenswrapper[4755]: server: https://api-int.crc.testing:6443 Mar 17 00:23:59 crc kubenswrapper[4755]: name: 
default-cluster Mar 17 00:23:59 crc kubenswrapper[4755]: contexts: Mar 17 00:23:59 crc kubenswrapper[4755]: - context: Mar 17 00:23:59 crc kubenswrapper[4755]: cluster: default-cluster Mar 17 00:23:59 crc kubenswrapper[4755]: namespace: default Mar 17 00:23:59 crc kubenswrapper[4755]: user: default-auth Mar 17 00:23:59 crc kubenswrapper[4755]: name: default-context Mar 17 00:23:59 crc kubenswrapper[4755]: current-context: default-context Mar 17 00:23:59 crc kubenswrapper[4755]: kind: Config Mar 17 00:23:59 crc kubenswrapper[4755]: preferences: {} Mar 17 00:23:59 crc kubenswrapper[4755]: users: Mar 17 00:23:59 crc kubenswrapper[4755]: - name: default-auth Mar 17 00:23:59 crc kubenswrapper[4755]: user: Mar 17 00:23:59 crc kubenswrapper[4755]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 17 00:23:59 crc kubenswrapper[4755]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 17 00:23:59 crc kubenswrapper[4755]: EOF Mar 17 00:23:59 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v6cxf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-mvdzt_openshift-ovn-kubernetes(44d329be-573d-4143-97fb-d07ed343c898): 
CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:23:59 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.705270 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" podUID="44d329be-573d-4143-97fb-d07ed343c898" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.705634 4755 scope.go:117] "RemoveContainer" containerID="f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1" Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.705883 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.719264 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.733747 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.744280 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.757156 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.766299 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.766467 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.766502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.766516 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.766532 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.766543 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:59Z","lastTransitionTime":"2026-03-17T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.773750 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.781602 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.793015 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.816779 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.829459 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.840701 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.849726 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.849854 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.849911 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.849936 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:01.849911651 +0000 UTC m=+116.609363944 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.850015 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.850044 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.850094 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.850210 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:01.850181418 +0000 UTC m=+116.609633731 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.850316 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.850341 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.850361 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.850419 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:01.850396164 +0000 UTC m=+116.609848487 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.850500 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:01.850487556 +0000 UTC m=+116.609939989 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.852141 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.861057 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.869632 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.869698 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.869720 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.869744 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.869762 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:59Z","lastTransitionTime":"2026-03-17T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.872140 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.884487 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.893750 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 
00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.904298 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.912687 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.923998 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.941275 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.950918 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.951029 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs\") pod \"network-metrics-daemon-4v74b\" (UID: \"f7291e3d-2994-409e-972a-59394140b3ad\") " pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.951142 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 00:23:59 crc 
kubenswrapper[4755]: E0317 00:23:59.951201 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.951227 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.951161 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.951307 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:01.951280113 +0000 UTC m=+116.710732436 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:23:59 crc kubenswrapper[4755]: E0317 00:23:59.951341 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs podName:f7291e3d-2994-409e-972a-59394140b3ad nodeName:}" failed. No retries permitted until 2026-03-17 00:24:01.951325254 +0000 UTC m=+116.710777567 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs") pod "network-metrics-daemon-4v74b" (UID: "f7291e3d-2994-409e-972a-59394140b3ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.971914 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.972020 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.972040 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.972065 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.972125 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:23:59Z","lastTransitionTime":"2026-03-17T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:23:59 crc kubenswrapper[4755]: I0317 00:23:59.982649 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.027533 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.072120 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.074533 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.074589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.074606 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.074655 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.074674 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:00Z","lastTransitionTime":"2026-03-17T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.108884 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.143073 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.176646 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 
00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.176679 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.176687 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.176702 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.176711 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:00Z","lastTransitionTime":"2026-03-17T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.185183 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready 
status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.247843 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.247854 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:00 crc kubenswrapper[4755]: E0317 00:24:00.247953 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:00 crc kubenswrapper[4755]: E0317 00:24:00.248017 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.247870 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:00 crc kubenswrapper[4755]: E0317 00:24:00.248112 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.248161 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:00 crc kubenswrapper[4755]: E0317 00:24:00.248208 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.252313 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.253186 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.253967 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.254751 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.255350 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.255946 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.256573 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.257188 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.258253 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.259257 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.260527 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.261847 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.262940 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.263576 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.264748 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.265289 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.266278 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.266712 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.267286 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.268311 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.268821 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.269356 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.270163 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.270806 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.271656 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.272259 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.273247 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.273713 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.274884 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.275380 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.275991 4755 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.276106 4755 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.278971 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.279574 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.279612 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.279622 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.279637 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.279650 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:00Z","lastTransitionTime":"2026-03-17T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.280347 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.281629 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.283570 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.284297 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.285283 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.285975 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.287037 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.287531 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.288526 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.289195 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.290150 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.290619 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.291544 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.292092 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.293219 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.293751 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.294645 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.295106 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.295625 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.296591 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.297044 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.384544 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.384652 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.384670 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.384696 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.384713 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:00Z","lastTransitionTime":"2026-03-17T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.488390 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.488497 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.488515 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.488550 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.488587 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:00Z","lastTransitionTime":"2026-03-17T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.592378 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.592460 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.592478 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.592506 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.592524 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:00Z","lastTransitionTime":"2026-03-17T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.696089 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.696148 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.696165 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.696188 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.696208 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:00Z","lastTransitionTime":"2026-03-17T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.799507 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.799578 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.799595 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.799620 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.799638 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:00Z","lastTransitionTime":"2026-03-17T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.902727 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.902793 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.902815 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.902846 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:00 crc kubenswrapper[4755]: I0317 00:24:00.902867 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:00Z","lastTransitionTime":"2026-03-17T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.006350 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.006417 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.006433 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.006491 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.006509 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:01Z","lastTransitionTime":"2026-03-17T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.109862 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.109924 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.109941 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.109966 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.109983 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:01Z","lastTransitionTime":"2026-03-17T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.213405 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.213489 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.213509 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.213531 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.213551 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:01Z","lastTransitionTime":"2026-03-17T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.316191 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.316247 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.316263 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.316285 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.316301 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:01Z","lastTransitionTime":"2026-03-17T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.419401 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.419472 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.419489 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.419510 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.419525 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:01Z","lastTransitionTime":"2026-03-17T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.522522 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.522631 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.522645 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.522669 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.522682 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:01Z","lastTransitionTime":"2026-03-17T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.626294 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.626355 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.626366 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.626388 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.626400 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:01Z","lastTransitionTime":"2026-03-17T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.729781 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.729833 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.729843 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.729864 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.729877 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:01Z","lastTransitionTime":"2026-03-17T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.832465 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.832538 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.832552 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.832574 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.832587 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:01Z","lastTransitionTime":"2026-03-17T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.873248 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.873478 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.873523 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.873593 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:01 crc kubenswrapper[4755]: E0317 00:24:01.873705 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not 
registered Mar 17 00:24:01 crc kubenswrapper[4755]: E0317 00:24:01.873776 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 00:24:01 crc kubenswrapper[4755]: E0317 00:24:01.873834 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 00:24:01 crc kubenswrapper[4755]: E0317 00:24:01.873882 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 00:24:01 crc kubenswrapper[4755]: E0317 00:24:01.873724 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:05.873682382 +0000 UTC m=+120.633134705 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:01 crc kubenswrapper[4755]: E0317 00:24:01.873907 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:24:01 crc kubenswrapper[4755]: E0317 00:24:01.874001 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:05.873933658 +0000 UTC m=+120.633385981 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 00:24:01 crc kubenswrapper[4755]: E0317 00:24:01.874073 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:05.874051231 +0000 UTC m=+120.633503754 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 00:24:01 crc kubenswrapper[4755]: E0317 00:24:01.874109 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:05.874090922 +0000 UTC m=+120.633543415 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.934920 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.934991 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.935008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.935035 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.935055 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:01Z","lastTransitionTime":"2026-03-17T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.975700 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:01 crc kubenswrapper[4755]: I0317 00:24:01.975759 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs\") pod \"network-metrics-daemon-4v74b\" (UID: \"f7291e3d-2994-409e-972a-59394140b3ad\") " pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:01 crc kubenswrapper[4755]: E0317 00:24:01.975903 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 00:24:01 crc kubenswrapper[4755]: E0317 00:24:01.975952 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs podName:f7291e3d-2994-409e-972a-59394140b3ad nodeName:}" failed. No retries permitted until 2026-03-17 00:24:05.975937855 +0000 UTC m=+120.735390138 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs") pod "network-metrics-daemon-4v74b" (UID: "f7291e3d-2994-409e-972a-59394140b3ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 00:24:01 crc kubenswrapper[4755]: E0317 00:24:01.976020 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 00:24:01 crc kubenswrapper[4755]: E0317 00:24:01.976029 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 00:24:01 crc kubenswrapper[4755]: E0317 00:24:01.976039 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:24:01 crc kubenswrapper[4755]: E0317 00:24:01.976062 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:05.976056058 +0000 UTC m=+120.735508341 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.038361 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.038420 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.038465 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.038483 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.038497 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:02Z","lastTransitionTime":"2026-03-17T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.142266 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.142716 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.142880 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.143041 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.143220 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:02Z","lastTransitionTime":"2026-03-17T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.250411 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.250605 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.250830 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.250841 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:02 crc kubenswrapper[4755]: E0317 00:24:02.250827 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:02 crc kubenswrapper[4755]: E0317 00:24:02.251025 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:02 crc kubenswrapper[4755]: E0317 00:24:02.251125 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:02 crc kubenswrapper[4755]: E0317 00:24:02.251214 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.251677 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.252395 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.252481 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.252510 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.252523 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:02Z","lastTransitionTime":"2026-03-17T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.356151 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.356205 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.356221 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.356245 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.356261 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:02Z","lastTransitionTime":"2026-03-17T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.459722 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.460036 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.460136 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.460233 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.460316 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:02Z","lastTransitionTime":"2026-03-17T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.563692 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.563778 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.563805 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.563836 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.563859 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:02Z","lastTransitionTime":"2026-03-17T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.667309 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.667383 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.667407 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.667471 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.667492 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:02Z","lastTransitionTime":"2026-03-17T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.771257 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.771299 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.771311 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.771327 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.771344 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:02Z","lastTransitionTime":"2026-03-17T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.874135 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.874191 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.874206 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.874227 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.874243 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:02Z","lastTransitionTime":"2026-03-17T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.977126 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.977183 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.977202 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.977230 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:02 crc kubenswrapper[4755]: I0317 00:24:02.977248 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:02Z","lastTransitionTime":"2026-03-17T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.079992 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.080078 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.080101 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.080588 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.080855 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:03Z","lastTransitionTime":"2026-03-17T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.184556 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.184609 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.184632 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.184660 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.184681 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:03Z","lastTransitionTime":"2026-03-17T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.287888 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.287944 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.287964 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.287990 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.288010 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:03Z","lastTransitionTime":"2026-03-17T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.320711 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.322147 4755 scope.go:117] "RemoveContainer" containerID="f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1" Mar 17 00:24:03 crc kubenswrapper[4755]: E0317 00:24:03.322701 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.392379 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.392459 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.392472 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.392498 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.392513 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:03Z","lastTransitionTime":"2026-03-17T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.495590 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.495650 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.495659 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.495671 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.495679 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:03Z","lastTransitionTime":"2026-03-17T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.598227 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.598308 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.598333 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.598360 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.598377 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:03Z","lastTransitionTime":"2026-03-17T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.701383 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.701506 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.701525 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.701548 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.701566 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:03Z","lastTransitionTime":"2026-03-17T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.805110 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.805151 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.805161 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.805176 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.805188 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:03Z","lastTransitionTime":"2026-03-17T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.837852 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.837897 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.837908 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.837925 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.837937 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:03Z","lastTransitionTime":"2026-03-17T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:03 crc kubenswrapper[4755]: E0317 00:24:03.849884 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.855394 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.855556 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.855584 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.855609 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.855628 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:03Z","lastTransitionTime":"2026-03-17T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:03 crc kubenswrapper[4755]: E0317 00:24:03.873652 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.879367 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.879426 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.879467 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.879491 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.879509 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:03Z","lastTransitionTime":"2026-03-17T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:03 crc kubenswrapper[4755]: E0317 00:24:03.894501 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.899523 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.899586 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.899605 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.899630 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.899647 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:03Z","lastTransitionTime":"2026-03-17T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:03 crc kubenswrapper[4755]: E0317 00:24:03.919579 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.924694 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.924767 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.924785 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.924808 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.924826 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:03Z","lastTransitionTime":"2026-03-17T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:03 crc kubenswrapper[4755]: E0317 00:24:03.937508 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:03 crc kubenswrapper[4755]: E0317 00:24:03.937731 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.939872 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.939936 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.939954 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.939979 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:03 crc kubenswrapper[4755]: I0317 00:24:03.940001 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:03Z","lastTransitionTime":"2026-03-17T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.043284 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.043365 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.043391 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.043418 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.043473 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:04Z","lastTransitionTime":"2026-03-17T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.146175 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.146242 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.146263 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.146295 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.146313 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:04Z","lastTransitionTime":"2026-03-17T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.247880 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:04 crc kubenswrapper[4755]: E0317 00:24:04.248052 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.248189 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.248199 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.248346 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:04 crc kubenswrapper[4755]: E0317 00:24:04.248586 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:04 crc kubenswrapper[4755]: E0317 00:24:04.248742 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:04 crc kubenswrapper[4755]: E0317 00:24:04.248874 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.250326 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.250384 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.250406 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.250428 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.250479 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:04Z","lastTransitionTime":"2026-03-17T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.353969 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.354013 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.354027 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.354061 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.354074 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:04Z","lastTransitionTime":"2026-03-17T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.456826 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.456867 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.456880 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.456894 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.456905 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:04Z","lastTransitionTime":"2026-03-17T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.560039 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.560100 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.560121 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.560150 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.560173 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:04Z","lastTransitionTime":"2026-03-17T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.662859 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.662925 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.662942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.662964 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.662980 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:04Z","lastTransitionTime":"2026-03-17T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.765922 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.766004 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.766029 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.766060 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.766085 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:04Z","lastTransitionTime":"2026-03-17T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.868981 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.869042 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.869061 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.869085 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.869103 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:04Z","lastTransitionTime":"2026-03-17T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.972476 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.972539 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.972556 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.972580 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:04 crc kubenswrapper[4755]: I0317 00:24:04.972597 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:04Z","lastTransitionTime":"2026-03-17T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.076050 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.076118 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.076140 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.076169 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.076191 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:05Z","lastTransitionTime":"2026-03-17T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.179796 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.180143 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.180229 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.180342 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.180465 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:05Z","lastTransitionTime":"2026-03-17T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.283552 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.283599 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.283616 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.283634 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.283646 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:05Z","lastTransitionTime":"2026-03-17T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.386011 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.386044 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.386053 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.386065 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.386074 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:05Z","lastTransitionTime":"2026-03-17T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.489297 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.489363 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.489376 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.489395 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.489406 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:05Z","lastTransitionTime":"2026-03-17T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.592120 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.592164 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.592178 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.592193 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.592205 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:05Z","lastTransitionTime":"2026-03-17T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.694737 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.694803 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.694825 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.694852 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.694875 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:05Z","lastTransitionTime":"2026-03-17T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.797727 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.797780 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.797798 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.797824 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.797840 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:05Z","lastTransitionTime":"2026-03-17T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.900984 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.901064 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.901086 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.901115 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.901136 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:05Z","lastTransitionTime":"2026-03-17T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.916702 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.916833 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:05 crc kubenswrapper[4755]: E0317 00:24:05.916897 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:13.916859167 +0000 UTC m=+128.676311470 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:05 crc kubenswrapper[4755]: E0317 00:24:05.916945 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.916960 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:05 crc kubenswrapper[4755]: E0317 00:24:05.917011 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:13.91698832 +0000 UTC m=+128.676440643 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 00:24:05 crc kubenswrapper[4755]: I0317 00:24:05.917109 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:05 crc kubenswrapper[4755]: E0317 00:24:05.917245 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 00:24:05 crc kubenswrapper[4755]: E0317 00:24:05.917292 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 00:24:05 crc kubenswrapper[4755]: E0317 00:24:05.917320 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:24:05 crc kubenswrapper[4755]: E0317 00:24:05.917313 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 00:24:05 crc kubenswrapper[4755]: E0317 00:24:05.917421 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:13.917393949 +0000 UTC m=+128.676846322 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:24:05 crc kubenswrapper[4755]: E0317 00:24:05.917512 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:13.917489352 +0000 UTC m=+128.676941715 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.004876 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.004981 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.005000 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.005024 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.005043 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:06Z","lastTransitionTime":"2026-03-17T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.017755 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.017890 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs\") pod \"network-metrics-daemon-4v74b\" (UID: \"f7291e3d-2994-409e-972a-59394140b3ad\") " pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:06 crc kubenswrapper[4755]: E0317 00:24:06.017968 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 00:24:06 crc kubenswrapper[4755]: E0317 00:24:06.018016 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 00:24:06 crc kubenswrapper[4755]: E0317 00:24:06.018026 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 00:24:06 crc kubenswrapper[4755]: E0317 00:24:06.018054 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:24:06 crc kubenswrapper[4755]: E0317 00:24:06.018101 4755 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs podName:f7291e3d-2994-409e-972a-59394140b3ad nodeName:}" failed. No retries permitted until 2026-03-17 00:24:14.018082824 +0000 UTC m=+128.777535117 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs") pod "network-metrics-daemon-4v74b" (UID: "f7291e3d-2994-409e-972a-59394140b3ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 00:24:06 crc kubenswrapper[4755]: E0317 00:24:06.018136 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:14.018111564 +0000 UTC m=+128.777563887 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.108409 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.108518 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.108540 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.108567 4755 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.108590 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:06Z","lastTransitionTime":"2026-03-17T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 00:24:06 crc kubenswrapper[4755]: E0317 00:24:06.209599 4755 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.247363 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:06 crc kubenswrapper[4755]: E0317 00:24:06.247541 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.247619 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.247715 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:06 crc kubenswrapper[4755]: E0317 00:24:06.247748 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.247885 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:06 crc kubenswrapper[4755]: E0317 00:24:06.247930 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:06 crc kubenswrapper[4755]: E0317 00:24:06.247984 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.263048 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.275879 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.285922 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.299745 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.317663 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.332561 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 
00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:06 crc kubenswrapper[4755]: E0317 00:24:06.341930 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.347643 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.357280 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.373185 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.382972 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.395026 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.405103 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.427819 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.439648 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:06 crc kubenswrapper[4755]: I0317 00:24:06.452428 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:08 crc kubenswrapper[4755]: I0317 00:24:08.247590 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:08 crc kubenswrapper[4755]: I0317 00:24:08.247720 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:08 crc kubenswrapper[4755]: I0317 00:24:08.247594 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:08 crc kubenswrapper[4755]: E0317 00:24:08.247845 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:08 crc kubenswrapper[4755]: E0317 00:24:08.248014 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:08 crc kubenswrapper[4755]: I0317 00:24:08.248079 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:08 crc kubenswrapper[4755]: E0317 00:24:08.248310 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:08 crc kubenswrapper[4755]: E0317 00:24:08.248137 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:09 crc kubenswrapper[4755]: E0317 00:24:09.250613 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 00:24:09 crc kubenswrapper[4755]: E0317 00:24:09.253175 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 17 00:24:09 crc kubenswrapper[4755]: I0317 00:24:09.261207 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 17 00:24:10 crc kubenswrapper[4755]: I0317 00:24:10.247266 
4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:10 crc kubenswrapper[4755]: I0317 00:24:10.247381 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:10 crc kubenswrapper[4755]: E0317 00:24:10.247528 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:10 crc kubenswrapper[4755]: I0317 00:24:10.247546 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:10 crc kubenswrapper[4755]: I0317 00:24:10.247615 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:10 crc kubenswrapper[4755]: E0317 00:24:10.247818 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:10 crc kubenswrapper[4755]: E0317 00:24:10.248039 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:10 crc kubenswrapper[4755]: E0317 00:24:10.248719 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:10 crc kubenswrapper[4755]: E0317 00:24:10.250675 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:24:10 crc kubenswrapper[4755]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 17 00:24:10 crc kubenswrapper[4755]: set -euo pipefail Mar 17 00:24:10 crc kubenswrapper[4755]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 17 00:24:10 crc kubenswrapper[4755]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 17 00:24:10 crc kubenswrapper[4755]: # As the secret mount is optional we must wait for the files to be present. Mar 17 00:24:10 crc kubenswrapper[4755]: # The service is created in monitor.yaml and this is created in sdn.yaml. 
Mar 17 00:24:10 crc kubenswrapper[4755]: TS=$(date +%s) Mar 17 00:24:10 crc kubenswrapper[4755]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 17 00:24:10 crc kubenswrapper[4755]: HAS_LOGGED_INFO=0 Mar 17 00:24:10 crc kubenswrapper[4755]: Mar 17 00:24:10 crc kubenswrapper[4755]: log_missing_certs(){ Mar 17 00:24:10 crc kubenswrapper[4755]: CUR_TS=$(date +%s) Mar 17 00:24:10 crc kubenswrapper[4755]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 17 00:24:10 crc kubenswrapper[4755]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 17 00:24:10 crc kubenswrapper[4755]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 17 00:24:10 crc kubenswrapper[4755]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 17 00:24:10 crc kubenswrapper[4755]: HAS_LOGGED_INFO=1 Mar 17 00:24:10 crc kubenswrapper[4755]: fi Mar 17 00:24:10 crc kubenswrapper[4755]: } Mar 17 00:24:10 crc kubenswrapper[4755]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 17 00:24:10 crc kubenswrapper[4755]: log_missing_certs Mar 17 00:24:10 crc kubenswrapper[4755]: sleep 5 Mar 17 00:24:10 crc kubenswrapper[4755]: done Mar 17 00:24:10 crc kubenswrapper[4755]: Mar 17 00:24:10 crc kubenswrapper[4755]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 17 00:24:10 crc kubenswrapper[4755]: exec /usr/bin/kube-rbac-proxy \ Mar 17 00:24:10 crc kubenswrapper[4755]: --logtostderr \ Mar 17 00:24:10 crc kubenswrapper[4755]: --secure-listen-address=:9108 \ Mar 17 00:24:10 crc kubenswrapper[4755]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 17 00:24:10 crc kubenswrapper[4755]: --upstream=http://127.0.0.1:29108/ \ Mar 17 00:24:10 crc kubenswrapper[4755]: --tls-private-key-file=${TLS_PK} \ Mar 17 00:24:10 crc kubenswrapper[4755]: --tls-cert-file=${TLS_CERT} Mar 17 00:24:10 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8bzw5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-mpdkw_openshift-ovn-kubernetes(c32c8c1b-db30-4059-97d0-ef753de5e7e0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:24:10 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:24:10 crc kubenswrapper[4755]: E0317 00:24:10.250914 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:24:10 crc kubenswrapper[4755]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 17 00:24:10 crc kubenswrapper[4755]: set -uo pipefail Mar 17 00:24:10 crc kubenswrapper[4755]: Mar 17 00:24:10 crc kubenswrapper[4755]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 17 00:24:10 crc kubenswrapper[4755]: Mar 17 00:24:10 crc kubenswrapper[4755]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 17 00:24:10 crc kubenswrapper[4755]: HOSTS_FILE="/etc/hosts" Mar 17 00:24:10 crc kubenswrapper[4755]: TEMP_FILE="/etc/hosts.tmp" Mar 17 00:24:10 crc kubenswrapper[4755]: Mar 17 00:24:10 crc kubenswrapper[4755]: IFS=', ' read -r -a services 
<<< "${SERVICES}" Mar 17 00:24:10 crc kubenswrapper[4755]: Mar 17 00:24:10 crc kubenswrapper[4755]: # Make a temporary file with the old hosts file's attributes. Mar 17 00:24:10 crc kubenswrapper[4755]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 17 00:24:10 crc kubenswrapper[4755]: echo "Failed to preserve hosts file. Exiting." Mar 17 00:24:10 crc kubenswrapper[4755]: exit 1 Mar 17 00:24:10 crc kubenswrapper[4755]: fi Mar 17 00:24:10 crc kubenswrapper[4755]: Mar 17 00:24:10 crc kubenswrapper[4755]: while true; do Mar 17 00:24:10 crc kubenswrapper[4755]: declare -A svc_ips Mar 17 00:24:10 crc kubenswrapper[4755]: for svc in "${services[@]}"; do Mar 17 00:24:10 crc kubenswrapper[4755]: # Fetch service IP from cluster dns if present. We make several tries Mar 17 00:24:10 crc kubenswrapper[4755]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 17 00:24:10 crc kubenswrapper[4755]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 17 00:24:10 crc kubenswrapper[4755]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 17 00:24:10 crc kubenswrapper[4755]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 00:24:10 crc kubenswrapper[4755]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 00:24:10 crc kubenswrapper[4755]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 00:24:10 crc kubenswrapper[4755]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 17 00:24:10 crc kubenswrapper[4755]: for i in ${!cmds[*]} Mar 17 00:24:10 crc kubenswrapper[4755]: do Mar 17 00:24:10 crc kubenswrapper[4755]: ips=($(eval "${cmds[i]}")) Mar 17 00:24:10 crc kubenswrapper[4755]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 17 00:24:10 crc kubenswrapper[4755]: svc_ips["${svc}"]="${ips[@]}" Mar 17 00:24:10 crc kubenswrapper[4755]: break Mar 17 00:24:10 crc kubenswrapper[4755]: fi Mar 17 00:24:10 crc kubenswrapper[4755]: done Mar 17 00:24:10 crc kubenswrapper[4755]: done Mar 17 00:24:10 crc kubenswrapper[4755]: Mar 17 00:24:10 crc kubenswrapper[4755]: # Update /etc/hosts only if we get valid service IPs Mar 17 00:24:10 crc kubenswrapper[4755]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 17 00:24:10 crc kubenswrapper[4755]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 17 00:24:10 crc kubenswrapper[4755]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 17 00:24:10 crc kubenswrapper[4755]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 17 00:24:10 crc kubenswrapper[4755]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 17 00:24:10 crc kubenswrapper[4755]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 17 00:24:10 crc kubenswrapper[4755]: sleep 60 & wait Mar 17 00:24:10 crc kubenswrapper[4755]: continue Mar 17 00:24:10 crc kubenswrapper[4755]: fi Mar 17 00:24:10 crc kubenswrapper[4755]: Mar 17 00:24:10 crc kubenswrapper[4755]: # Append resolver entries for services Mar 17 00:24:10 crc kubenswrapper[4755]: rc=0 Mar 17 00:24:10 crc kubenswrapper[4755]: for svc in "${!svc_ips[@]}"; do Mar 17 00:24:10 crc kubenswrapper[4755]: for ip in ${svc_ips[${svc}]}; do Mar 17 00:24:10 crc kubenswrapper[4755]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 17 00:24:10 crc kubenswrapper[4755]: done Mar 17 00:24:10 crc kubenswrapper[4755]: done Mar 17 00:24:10 crc kubenswrapper[4755]: if [[ $rc -ne 0 ]]; then Mar 17 00:24:10 crc kubenswrapper[4755]: sleep 60 & wait Mar 17 00:24:10 crc kubenswrapper[4755]: continue Mar 17 00:24:10 crc kubenswrapper[4755]: fi Mar 17 00:24:10 crc kubenswrapper[4755]: Mar 17 00:24:10 crc kubenswrapper[4755]: Mar 17 00:24:10 crc kubenswrapper[4755]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 17 00:24:10 crc kubenswrapper[4755]: # Replace /etc/hosts with our modified version if needed Mar 17 00:24:10 crc kubenswrapper[4755]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 17 00:24:10 crc kubenswrapper[4755]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 17 00:24:10 crc kubenswrapper[4755]: fi Mar 17 00:24:10 crc kubenswrapper[4755]: sleep 60 & wait Mar 17 00:24:10 crc kubenswrapper[4755]: unset svc_ips Mar 17 00:24:10 crc kubenswrapper[4755]: done Mar 17 00:24:10 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9w46z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-7cncb_openshift-dns(def35a55-2212-4d8e-8040-69fdcc95e34c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:24:10 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:24:10 crc kubenswrapper[4755]: E0317 00:24:10.251552 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:24:10 crc kubenswrapper[4755]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 17 00:24:10 crc kubenswrapper[4755]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 17 00:24:10 crc kubenswrapper[4755]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgfzg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-j6qtr_openshift-multus(de2167ca-ad7e-47ce-bf95-cebc396df145): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:24:10 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:24:10 crc kubenswrapper[4755]: E0317 00:24:10.252622 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-7cncb" podUID="def35a55-2212-4d8e-8040-69fdcc95e34c" Mar 17 00:24:10 crc kubenswrapper[4755]: E0317 00:24:10.252649 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-j6qtr" podUID="de2167ca-ad7e-47ce-bf95-cebc396df145" Mar 17 00:24:10 crc kubenswrapper[4755]: E0317 00:24:10.254093 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:24:10 crc kubenswrapper[4755]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 17 00:24:10 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 17 00:24:10 crc kubenswrapper[4755]: set -o allexport Mar 17 00:24:10 crc kubenswrapper[4755]: source "/env/_master" Mar 17 00:24:10 crc kubenswrapper[4755]: set +o allexport Mar 17 00:24:10 crc kubenswrapper[4755]: fi Mar 17 00:24:10 crc kubenswrapper[4755]: Mar 17 00:24:10 crc kubenswrapper[4755]: ovn_v4_join_subnet_opt= Mar 17 00:24:10 crc kubenswrapper[4755]: if [[ "" 
!= "" ]]; then Mar 17 00:24:10 crc kubenswrapper[4755]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 17 00:24:10 crc kubenswrapper[4755]: fi Mar 17 00:24:10 crc kubenswrapper[4755]: ovn_v6_join_subnet_opt= Mar 17 00:24:10 crc kubenswrapper[4755]: if [[ "" != "" ]]; then Mar 17 00:24:10 crc kubenswrapper[4755]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 17 00:24:10 crc kubenswrapper[4755]: fi Mar 17 00:24:10 crc kubenswrapper[4755]: Mar 17 00:24:10 crc kubenswrapper[4755]: ovn_v4_transit_switch_subnet_opt= Mar 17 00:24:10 crc kubenswrapper[4755]: if [[ "" != "" ]]; then Mar 17 00:24:10 crc kubenswrapper[4755]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 17 00:24:10 crc kubenswrapper[4755]: fi Mar 17 00:24:10 crc kubenswrapper[4755]: ovn_v6_transit_switch_subnet_opt= Mar 17 00:24:10 crc kubenswrapper[4755]: if [[ "" != "" ]]; then Mar 17 00:24:10 crc kubenswrapper[4755]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 17 00:24:10 crc kubenswrapper[4755]: fi Mar 17 00:24:10 crc kubenswrapper[4755]: Mar 17 00:24:10 crc kubenswrapper[4755]: dns_name_resolver_enabled_flag= Mar 17 00:24:10 crc kubenswrapper[4755]: if [[ "false" == "true" ]]; then Mar 17 00:24:10 crc kubenswrapper[4755]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 17 00:24:10 crc kubenswrapper[4755]: fi Mar 17 00:24:10 crc kubenswrapper[4755]: Mar 17 00:24:10 crc kubenswrapper[4755]: persistent_ips_enabled_flag= Mar 17 00:24:10 crc kubenswrapper[4755]: if [[ "true" == "true" ]]; then Mar 17 00:24:10 crc kubenswrapper[4755]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 17 00:24:10 crc kubenswrapper[4755]: fi Mar 17 00:24:10 crc kubenswrapper[4755]: Mar 17 00:24:10 crc kubenswrapper[4755]: # This is needed so that converting clusters from GA to TP Mar 17 00:24:10 crc kubenswrapper[4755]: # will rollout control plane pods as well Mar 17 00:24:10 crc kubenswrapper[4755]: 
network_segmentation_enabled_flag= Mar 17 00:24:10 crc kubenswrapper[4755]: multi_network_enabled_flag= Mar 17 00:24:10 crc kubenswrapper[4755]: if [[ "true" == "true" ]]; then Mar 17 00:24:10 crc kubenswrapper[4755]: multi_network_enabled_flag="--enable-multi-network" Mar 17 00:24:10 crc kubenswrapper[4755]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 17 00:24:10 crc kubenswrapper[4755]: fi Mar 17 00:24:10 crc kubenswrapper[4755]: Mar 17 00:24:10 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 17 00:24:10 crc kubenswrapper[4755]: exec /usr/bin/ovnkube \ Mar 17 00:24:10 crc kubenswrapper[4755]: --enable-interconnect \ Mar 17 00:24:10 crc kubenswrapper[4755]: --init-cluster-manager "${K8S_NODE}" \ Mar 17 00:24:10 crc kubenswrapper[4755]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 17 00:24:10 crc kubenswrapper[4755]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 17 00:24:10 crc kubenswrapper[4755]: --metrics-bind-address "127.0.0.1:29108" \ Mar 17 00:24:10 crc kubenswrapper[4755]: --metrics-enable-pprof \ Mar 17 00:24:10 crc kubenswrapper[4755]: --metrics-enable-config-duration \ Mar 17 00:24:10 crc kubenswrapper[4755]: ${ovn_v4_join_subnet_opt} \ Mar 17 00:24:10 crc kubenswrapper[4755]: ${ovn_v6_join_subnet_opt} \ Mar 17 00:24:10 crc kubenswrapper[4755]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 17 00:24:10 crc kubenswrapper[4755]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 17 00:24:10 crc kubenswrapper[4755]: ${dns_name_resolver_enabled_flag} \ Mar 17 00:24:10 crc kubenswrapper[4755]: ${persistent_ips_enabled_flag} \ Mar 17 00:24:10 crc kubenswrapper[4755]: ${multi_network_enabled_flag} \ Mar 17 00:24:10 crc kubenswrapper[4755]: ${network_segmentation_enabled_flag} Mar 17 00:24:10 crc kubenswrapper[4755]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8bzw5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-mpdkw_openshift-ovn-kubernetes(c32c8c1b-db30-4059-97d0-ef753de5e7e0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:24:10 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:24:10 crc kubenswrapper[4755]: E0317 00:24:10.255395 4755 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" podUID="c32c8c1b-db30-4059-97d0-ef753de5e7e0" Mar 17 00:24:11 crc kubenswrapper[4755]: E0317 00:24:11.249640 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:24:11 crc kubenswrapper[4755]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 17 00:24:11 crc kubenswrapper[4755]: set -o allexport Mar 17 00:24:11 crc kubenswrapper[4755]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 17 00:24:11 crc kubenswrapper[4755]: source /etc/kubernetes/apiserver-url.env Mar 17 00:24:11 crc kubenswrapper[4755]: else Mar 17 00:24:11 crc kubenswrapper[4755]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 17 00:24:11 crc kubenswrapper[4755]: exit 1 Mar 17 00:24:11 crc kubenswrapper[4755]: fi Mar 17 00:24:11 crc kubenswrapper[4755]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 17 00:24:11 crc kubenswrapper[4755]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:24:11 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:24:11 crc kubenswrapper[4755]: E0317 00:24:11.249934 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:24:11 crc kubenswrapper[4755]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 17 00:24:11 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 17 00:24:11 crc kubenswrapper[4755]: set -o allexport Mar 17 00:24:11 crc kubenswrapper[4755]: source "/env/_master" Mar 17 00:24:11 crc kubenswrapper[4755]: set +o allexport Mar 17 00:24:11 crc 
kubenswrapper[4755]: fi Mar 17 00:24:11 crc kubenswrapper[4755]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 17 00:24:11 crc kubenswrapper[4755]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 17 00:24:11 crc kubenswrapper[4755]: ho_enable="--enable-hybrid-overlay" Mar 17 00:24:11 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 17 00:24:11 crc kubenswrapper[4755]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 17 00:24:11 crc kubenswrapper[4755]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 17 00:24:11 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 17 00:24:11 crc kubenswrapper[4755]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 17 00:24:11 crc kubenswrapper[4755]: --webhook-host=127.0.0.1 \ Mar 17 00:24:11 crc kubenswrapper[4755]: --webhook-port=9743 \ Mar 17 00:24:11 crc kubenswrapper[4755]: ${ho_enable} \ Mar 17 00:24:11 crc kubenswrapper[4755]: --enable-interconnect \ Mar 17 00:24:11 crc kubenswrapper[4755]: --disable-approver \ Mar 17 00:24:11 crc kubenswrapper[4755]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 17 00:24:11 crc kubenswrapper[4755]: --wait-for-kubernetes-api=200s \ Mar 17 00:24:11 crc kubenswrapper[4755]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 17 00:24:11 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 17 00:24:11 crc kubenswrapper[4755]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:24:11 crc 
kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:24:11 crc kubenswrapper[4755]: E0317 00:24:11.251159 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 17 00:24:11 crc kubenswrapper[4755]: E0317 00:24:11.252618 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:24:11 crc kubenswrapper[4755]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 17 00:24:11 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 17 00:24:11 crc kubenswrapper[4755]: set -o allexport Mar 17 00:24:11 crc kubenswrapper[4755]: source "/env/_master" Mar 17 00:24:11 crc kubenswrapper[4755]: set +o allexport Mar 17 00:24:11 crc kubenswrapper[4755]: fi Mar 17 00:24:11 crc kubenswrapper[4755]: Mar 17 00:24:11 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 17 00:24:11 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 17 00:24:11 crc kubenswrapper[4755]: --disable-webhook \ Mar 17 00:24:11 crc kubenswrapper[4755]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 17 00:24:11 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 17 00:24:11 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:24:11 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:24:11 crc kubenswrapper[4755]: E0317 00:24:11.253961 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 17 00:24:11 crc kubenswrapper[4755]: E0317 00:24:11.343784 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 00:24:12 crc kubenswrapper[4755]: I0317 00:24:12.248212 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:12 crc kubenswrapper[4755]: I0317 00:24:12.248306 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:12 crc kubenswrapper[4755]: E0317 00:24:12.248589 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:12 crc kubenswrapper[4755]: I0317 00:24:12.248624 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:12 crc kubenswrapper[4755]: I0317 00:24:12.248655 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:12 crc kubenswrapper[4755]: E0317 00:24:12.248854 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:12 crc kubenswrapper[4755]: E0317 00:24:12.249025 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:12 crc kubenswrapper[4755]: E0317 00:24:12.249911 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:12 crc kubenswrapper[4755]: E0317 00:24:12.252135 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:24:12 crc kubenswrapper[4755]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 17 00:24:12 crc kubenswrapper[4755]: while [ true ]; Mar 17 00:24:12 crc kubenswrapper[4755]: do Mar 17 00:24:12 crc kubenswrapper[4755]: for f in $(ls /tmp/serviceca); do Mar 17 00:24:12 crc kubenswrapper[4755]: echo $f Mar 17 00:24:12 crc kubenswrapper[4755]: ca_file_path="/tmp/serviceca/${f}" Mar 17 00:24:12 crc kubenswrapper[4755]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 17 00:24:12 crc kubenswrapper[4755]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 17 00:24:12 crc 
kubenswrapper[4755]: if [ -e "${reg_dir_path}" ]; then Mar 17 00:24:12 crc kubenswrapper[4755]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 17 00:24:12 crc kubenswrapper[4755]: else Mar 17 00:24:12 crc kubenswrapper[4755]: mkdir $reg_dir_path Mar 17 00:24:12 crc kubenswrapper[4755]: cp $ca_file_path $reg_dir_path/ca.crt Mar 17 00:24:12 crc kubenswrapper[4755]: fi Mar 17 00:24:12 crc kubenswrapper[4755]: done Mar 17 00:24:12 crc kubenswrapper[4755]: for d in $(ls /etc/docker/certs.d); do Mar 17 00:24:12 crc kubenswrapper[4755]: echo $d Mar 17 00:24:12 crc kubenswrapper[4755]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 17 00:24:12 crc kubenswrapper[4755]: reg_conf_path="/tmp/serviceca/${dp}" Mar 17 00:24:12 crc kubenswrapper[4755]: if [ ! -e "${reg_conf_path}" ]; then Mar 17 00:24:12 crc kubenswrapper[4755]: rm -rf /etc/docker/certs.d/$d Mar 17 00:24:12 crc kubenswrapper[4755]: fi Mar 17 00:24:12 crc kubenswrapper[4755]: done Mar 17 00:24:12 crc kubenswrapper[4755]: sleep 60 & wait ${!} Mar 17 00:24:12 crc kubenswrapper[4755]: done Mar 17 00:24:12 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gpscc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-hgfhx_openshift-image-registry(becc22f3-961c-4ce7-b97f-6d40e28c9373): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:24:12 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:24:12 crc kubenswrapper[4755]: E0317 00:24:12.252249 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:24:12 crc kubenswrapper[4755]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 17 00:24:12 crc kubenswrapper[4755]: apiVersion: v1 Mar 17 00:24:12 crc kubenswrapper[4755]: clusters: Mar 17 00:24:12 crc kubenswrapper[4755]: - cluster: Mar 17 00:24:12 crc kubenswrapper[4755]: certificate-authority: 
/var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 17 00:24:12 crc kubenswrapper[4755]: server: https://api-int.crc.testing:6443 Mar 17 00:24:12 crc kubenswrapper[4755]: name: default-cluster Mar 17 00:24:12 crc kubenswrapper[4755]: contexts: Mar 17 00:24:12 crc kubenswrapper[4755]: - context: Mar 17 00:24:12 crc kubenswrapper[4755]: cluster: default-cluster Mar 17 00:24:12 crc kubenswrapper[4755]: namespace: default Mar 17 00:24:12 crc kubenswrapper[4755]: user: default-auth Mar 17 00:24:12 crc kubenswrapper[4755]: name: default-context Mar 17 00:24:12 crc kubenswrapper[4755]: current-context: default-context Mar 17 00:24:12 crc kubenswrapper[4755]: kind: Config Mar 17 00:24:12 crc kubenswrapper[4755]: preferences: {} Mar 17 00:24:12 crc kubenswrapper[4755]: users: Mar 17 00:24:12 crc kubenswrapper[4755]: - name: default-auth Mar 17 00:24:12 crc kubenswrapper[4755]: user: Mar 17 00:24:12 crc kubenswrapper[4755]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 17 00:24:12 crc kubenswrapper[4755]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 17 00:24:12 crc kubenswrapper[4755]: EOF Mar 17 00:24:12 crc kubenswrapper[4755]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v6cxf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-mvdzt_openshift-ovn-kubernetes(44d329be-573d-4143-97fb-d07ed343c898): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 00:24:12 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:24:12 crc kubenswrapper[4755]: E0317 00:24:12.253612 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-hgfhx" podUID="becc22f3-961c-4ce7-b97f-6d40e28c9373" Mar 17 00:24:12 crc kubenswrapper[4755]: E0317 00:24:12.253661 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" podUID="44d329be-573d-4143-97fb-d07ed343c898" Mar 17 00:24:12 crc kubenswrapper[4755]: I0317 00:24:12.351291 4755 reflector.go:368] Caches populated for 
*v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.005298 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.005395 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.005421 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.005489 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.005586 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.005632 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:30.005619122 +0000 UTC m=+144.765071405 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.005933 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:30.005925779 +0000 UTC m=+144.765378062 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.005973 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.006009 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:30.006003861 +0000 UTC m=+144.765456144 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.006332 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.006416 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.006483 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.006632 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:30.006590415 +0000 UTC m=+144.766042738 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.106354 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.106481 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs\") pod \"network-metrics-daemon-4v74b\" (UID: \"f7291e3d-2994-409e-972a-59394140b3ad\") " pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.107294 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.107509 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.107548 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.107663 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:30.107631067 +0000 UTC m=+144.867083380 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.107782 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.107916 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs podName:f7291e3d-2994-409e-972a-59394140b3ad nodeName:}" failed. No retries permitted until 2026-03-17 00:24:30.107869984 +0000 UTC m=+144.867322307 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs") pod "network-metrics-daemon-4v74b" (UID: "f7291e3d-2994-409e-972a-59394140b3ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.247994 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.248087 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.248207 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.248381 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.248075 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.248566 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.249026 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.249590 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.339958 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.340014 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.340032 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.340055 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.340072 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:14Z","lastTransitionTime":"2026-03-17T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.357302 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.363141 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.363193 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.363205 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.363224 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.363238 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:14Z","lastTransitionTime":"2026-03-17T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.375255 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.379955 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.379998 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.380010 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.380027 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.380038 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:14Z","lastTransitionTime":"2026-03-17T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.389635 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.393548 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.393580 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.393588 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.393601 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.393610 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:14Z","lastTransitionTime":"2026-03-17T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.404478 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.409251 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.409305 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.409322 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.409344 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.409361 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:14Z","lastTransitionTime":"2026-03-17T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.424725 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: E0317 00:24:14.424875 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.748486 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" event={"ID":"225816b5-e7fe-4d29-84ad-37187e904104","Type":"ContainerStarted","Data":"41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44"} Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.751093 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"db4619c924ae823c9a384924e501c93d8af914f90cfab080d9a9898415897907"} Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.751167 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"5e4bcf70529050e2d5a4eb77278af6ddc216afe724345c57887569e664d73b74"} Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.764896 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.773952 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.787357 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.801863 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.814135 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 
00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.825485 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.839515 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.849521 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.862196 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.879193 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"221aa22e-24d2-40bf-8f35-eff4ded929df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a2bd1740ec33c208460149e12831eb8f9c548b51efbc6815ae993666a27407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82466c34a485d12f48aec22116d1906d7b482ad52bd75bb7b732ae6a05be3117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d644f6e95a9b7d33090c3b9754daea0bc31480d1342727a4a9e628064efcb55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.887640 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.897266 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.923115 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.937153 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.948204 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.957823 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.972694 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:14 crc kubenswrapper[4755]: I0317 00:24:14.986822 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.003269 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.014205 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.025962 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.035224 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.048431 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.062139 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.074350 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"221aa22e-24d2-40bf-8f35-eff4ded929df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a2bd1740ec33c208460149e12831eb8f9c548b51efbc6815ae993666a27407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82466c34a485d12f48aec22116d1906d7b482ad52bd75bb7b732ae6a05be3117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d644f6e95a9b7d33090c3b9754daea0bc31480d1342727a4a9e628064efcb55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.085290 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.094254 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.109175 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.119060 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.129898 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4619c924ae823c9a384924e501c93d8af914f90cfab080d9a9898415897907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bcf70529050e2d5a4eb77278af6ddc216afe724345c57887569e664d73b74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.140341 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.156940 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.757747 4755 generic.go:334] "Generic (PLEG): container finished" podID="225816b5-e7fe-4d29-84ad-37187e904104" containerID="41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44" exitCode=0 Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.757805 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" event={"ID":"225816b5-e7fe-4d29-84ad-37187e904104","Type":"ContainerDied","Data":"41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44"} Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.777941 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.799596 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.817252 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 
00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.827971 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.843209 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.853855 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.869967 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.883694 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.891921 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"221aa22e-24d2-40bf-8f35-eff4ded929df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a2bd1740ec33c208460149e12831eb8f9c548b51efbc6815ae993666a27407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82466c34a485d12f48aec22116d1906d7b482ad52bd75bb7b732ae6a05be3117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d644f6e95a9b7d33090c3b9754daea0bc31480d1342727a4a9e628064efcb55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.900632 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.909329 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.919367 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.927559 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.935541 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4619c924ae823c9a384924e501c93d8af914f90cfab080d9a9898415897907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bcf70529050e2d5a4eb77278af6ddc216afe724345c57887569e664d73b74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.942763 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:15 crc kubenswrapper[4755]: I0317 00:24:15.965052 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.247687 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.247811 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:16 crc kubenswrapper[4755]: E0317 00:24:16.247936 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.247974 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.248074 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:16 crc kubenswrapper[4755]: E0317 00:24:16.248250 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:16 crc kubenswrapper[4755]: E0317 00:24:16.248576 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:16 crc kubenswrapper[4755]: E0317 00:24:16.248740 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.265733 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.288969 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.298662 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"221aa22e-24d2-40bf-8f35-eff4ded929df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a2bd1740ec33c208460149e12831eb8f9c548b51efbc6815ae993666a27407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82466c34a485d12f48aec22116d1906d7b482ad52bd75bb7b732ae6a05be3117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d644f6e95a9b7d33090c3b9754daea0bc31480d1342727a4a9e628064efcb55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.314583 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.326630 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: E0317 00:24:16.345396 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.346105 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.357300 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.370057 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4619c924ae823c9a384924e501c93d8af914f90cfab080d9a9898415897907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bcf70529050e2d5a4eb77278af6ddc216afe724345c57887569e664d73b74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.378367 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.406965 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.424456 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.440020 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.450406 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.459474 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.473168 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.490646 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.763199 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" event={"ID":"225816b5-e7fe-4d29-84ad-37187e904104","Type":"ContainerStarted","Data":"19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99"} Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.780344 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.792138 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.806913 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4619c924ae823c9a384924e501c93d8af914f90cfab080d9a9898415897907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bcf70529050e2d5a4eb77278af6ddc216afe724345c57887569e664d73b74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.822811 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.851970 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.870108 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.888694 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.906905 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 
00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.925131 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.936499 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.948970 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.964644 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.979959 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:16 crc kubenswrapper[4755]: I0317 00:24:16.993775 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"221aa22e-24d2-40bf-8f35-eff4ded929df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a2bd1740ec33c208460149e12831eb8f9c548b51efbc6815ae993666a27407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82466c34a485d12f48aec22116d1906d7b482ad52bd75bb7b732ae6a05be3117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d644f6e95a9b7d33090c3b9754daea0bc31480d1342727a4a9e628064efcb55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.011053 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.023552 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.767421 4755 generic.go:334] "Generic (PLEG): container finished" podID="225816b5-e7fe-4d29-84ad-37187e904104" containerID="19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99" exitCode=0 Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.767499 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" event={"ID":"225816b5-e7fe-4d29-84ad-37187e904104","Type":"ContainerDied","Data":"19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99"} Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.781416 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 
00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.794107 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.803547 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.813612 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.824912 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.845020 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc 
kubenswrapper[4755]: I0317 00:24:17.857732 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.868705 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.878830 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"221aa22e-24d2-40bf-8f35-eff4ded929df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a2bd1740ec33c208460149e12831eb8f9c548b51efbc6815ae993666a27407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82466c34a485d12f48aec22116d1906d7b482ad52bd75bb7b732ae6a05be3117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d644f6e95a9b7d33090c3b9754daea0bc31480d1342727a4a9e628064efcb55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.889021 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.897575 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.908931 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.917406 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.928306 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4619c924ae823c9a384924e501c93d8af914f90cfab080d9a9898415897907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bcf70529050e2d5a4eb77278af6ddc216afe724345c57887569e664d73b74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.935862 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:17 crc kubenswrapper[4755]: I0317 00:24:17.951185 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.248140 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.248306 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:18 crc kubenswrapper[4755]: E0317 00:24:18.248319 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.248343 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:18 crc kubenswrapper[4755]: E0317 00:24:18.248394 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.248402 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:18 crc kubenswrapper[4755]: E0317 00:24:18.248757 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:18 crc kubenswrapper[4755]: E0317 00:24:18.248466 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.248917 4755 scope.go:117] "RemoveContainer" containerID="f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1" Mar 17 00:24:18 crc kubenswrapper[4755]: E0317 00:24:18.250021 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.772390 4755 generic.go:334] "Generic (PLEG): container finished" podID="225816b5-e7fe-4d29-84ad-37187e904104" containerID="3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232" exitCode=0 Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.772460 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" event={"ID":"225816b5-e7fe-4d29-84ad-37187e904104","Type":"ContainerDied","Data":"3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232"} Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.785502 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been 
read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.797673 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"221aa22e-24d2-40bf-8f35-eff4ded929df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a2bd1740ec33c208460149e12831eb8f9c548b51efbc6815ae993666a27407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82466c34a485d12f48aec22116d1906d7b482ad52bd75bb7b732ae6a05be3117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d644f6e95a9b7d33090c3b9754daea0bc31480d1342727a4a9e628064efcb55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.811408 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.822481 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.848618 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.865529 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.883131 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.894602 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4619c924ae823c9a384924e501c93d8af914f90cfab080d9a9898415897907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bcf70529050e2d5a4eb77278af6ddc216afe724345c57887569e664d73b74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.903570 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.916350 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.925596 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.938763 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.950339 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.960794 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.971958 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:18 crc kubenswrapper[4755]: I0317 00:24:18.982270 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.778840 4755 generic.go:334] "Generic (PLEG): container finished" podID="225816b5-e7fe-4d29-84ad-37187e904104" containerID="d60f0b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460" exitCode=0 Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.778888 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" event={"ID":"225816b5-e7fe-4d29-84ad-37187e904104","Type":"ContainerDied","Data":"d60f0b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460"} Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.792841 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.814130 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f0b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d60f0b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.833014 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.847466 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.863337 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.873810 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.887324 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.900713 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.913616 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"221aa22e-24d2-40bf-8f35-eff4ded929df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a2bd1740ec33c208460149e12831eb8f9c548b51efbc6815ae993666a27407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82466c34a485d12f48aec22116d1906d7b482ad52bd75bb7b732ae6a05be3117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d644f6e95a9b7d33090c3b9754daea0bc31480d1342727a4a9e628064efcb55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.922980 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.930061 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.939061 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.946847 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.953912 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4619c924ae823c9a384924e501c93d8af914f90cfab080d9a9898415897907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bcf70529050e2d5a4eb77278af6ddc216afe724345c57887569e664d73b74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.961658 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:19 crc kubenswrapper[4755]: I0317 00:24:19.977185 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.247875 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.247985 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.248088 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.247924 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:20 crc kubenswrapper[4755]: E0317 00:24:20.248246 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:20 crc kubenswrapper[4755]: E0317 00:24:20.248550 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:20 crc kubenswrapper[4755]: E0317 00:24:20.248701 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:20 crc kubenswrapper[4755]: E0317 00:24:20.248794 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.785117 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" event={"ID":"225816b5-e7fe-4d29-84ad-37187e904104","Type":"ContainerStarted","Data":"0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b"} Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.798846 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f0b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d60f0b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76m
dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.815496 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.828420 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.841851 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.852259 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.865951 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.878786 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.891124 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.903887 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"221aa22e-24d2-40bf-8f35-eff4ded929df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a2bd1740ec33c208460149e12831eb8f9c548b51efbc6815ae993666a27407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82466c34a485d12f48aec22116d1906d7b482ad52bd75bb7b732ae6a05be3117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d644f6e95a9b7d33090c3b9754daea0bc31480d1342727a4a9e628064efcb55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.919291 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.928953 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.942871 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.960039 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.973059 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4619c924ae823c9a384924e501c93d8af914f90cfab080d9a9898415897907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bcf70529050e2d5a4eb77278af6ddc216afe724345c57887569e664d73b74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:20 crc kubenswrapper[4755]: I0317 00:24:20.985291 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.012226 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:21 crc kubenswrapper[4755]: E0317 00:24:21.347691 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.793222 4755 generic.go:334] "Generic (PLEG): container finished" podID="225816b5-e7fe-4d29-84ad-37187e904104" containerID="0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b" exitCode=0 Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.793267 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" event={"ID":"225816b5-e7fe-4d29-84ad-37187e904104","Type":"ContainerDied","Data":"0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b"} Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.807637 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"221aa22e-24d2-40bf-8f35-eff4ded929df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a2bd1740ec33c208460149e12831eb8f9c548b51efbc6815ae993666a27407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82466c34a485d12f48aec22116d1906d7b482ad52bd75bb7b732ae6a05be3117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d644f6e95a9b7d33090c3b9754daea0bc31480d1342727a4a9e628064efcb55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.820424 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.830575 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.850119 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.864018 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.876554 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.890215 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4619c924ae823c9a384924e501c93d8af914f90cfab080d9a9898415897907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bcf70529050e2d5a4eb77278af6ddc216afe724345c57887569e664d73b74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.906892 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.916617 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-7cncb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.932981 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.947664 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f0b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d60f0b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.960865 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5
81c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.976624 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.986073 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:21 crc kubenswrapper[4755]: I0317 00:24:21.996431 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:22 crc kubenswrapper[4755]: I0317 00:24:22.006040 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:22 crc kubenswrapper[4755]: I0317 00:24:22.248773 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:22 crc kubenswrapper[4755]: I0317 00:24:22.248851 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:22 crc kubenswrapper[4755]: I0317 00:24:22.248891 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:22 crc kubenswrapper[4755]: E0317 00:24:22.248966 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:22 crc kubenswrapper[4755]: I0317 00:24:22.249045 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:22 crc kubenswrapper[4755]: E0317 00:24:22.249193 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:22 crc kubenswrapper[4755]: E0317 00:24:22.249366 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:22 crc kubenswrapper[4755]: E0317 00:24:22.249518 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:22 crc kubenswrapper[4755]: I0317 00:24:22.801865 4755 generic.go:334] "Generic (PLEG): container finished" podID="225816b5-e7fe-4d29-84ad-37187e904104" containerID="2e7c0fa45cdeafdec7b7d4f26dbd4dab51722186ca170d7661d841fa62633ad8" exitCode=0 Mar 17 00:24:22 crc kubenswrapper[4755]: I0317 00:24:22.801920 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" event={"ID":"225816b5-e7fe-4d29-84ad-37187e904104","Type":"ContainerDied","Data":"2e7c0fa45cdeafdec7b7d4f26dbd4dab51722186ca170d7661d841fa62633ad8"} Mar 17 00:24:22 crc kubenswrapper[4755]: I0317 00:24:22.820686 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:22 crc kubenswrapper[4755]: I0317 00:24:22.830789 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:22 crc kubenswrapper[4755]: I0317 00:24:22.845319 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4619c924ae823c9a384924e501c93d8af914f90cfab080d9a9898415897907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bcf70529050e2d5a4eb77278af6ddc216afe724345c57887569e664d73b74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:22 crc kubenswrapper[4755]: I0317 00:24:22.857914 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:22 crc kubenswrapper[4755]: I0317 00:24:22.887144 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:22 crc kubenswrapper[4755]: I0317 00:24:22.905824 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:22 crc kubenswrapper[4755]: I0317 00:24:22.919035 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:22 crc kubenswrapper[4755]: I0317 00:24:22.928804 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:22 crc kubenswrapper[4755]: I0317 00:24:22.937992 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:22 crc kubenswrapper[4755]: I0317 00:24:22.955104 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.010462 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5
a094f85079ca90d1ce536f7c232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f0b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d60f0b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-17T00:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e7c0fa45cdeafdec7b7d4f26dbd4dab51722186ca170d7661d841fa62633ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e7c0fa45cdeafdec7b7d4f26dbd4dab51722186ca170d7661d841fa62633ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.019501 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.029862 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.039067 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"221aa22e-24d2-40bf-8f35-eff4ded929df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a2bd1740ec33c208460149e12831eb8f9c548b51efbc6815ae993666a27407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82466c34a485d12f48aec22116d1906d7b482ad52bd75bb7b732ae6a05be3117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d644f6e95a9b7d33090c3b9754daea0bc31480d1342727a4a9e628064efcb55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.047338 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.055118 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.807799 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" event={"ID":"c32c8c1b-db30-4059-97d0-ef753de5e7e0","Type":"ContainerStarted","Data":"6c6827196a2d3fb479d55cb14f43a5603cce7db3f42c61a06561fe96dd55d90f"} Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.807880 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" event={"ID":"c32c8c1b-db30-4059-97d0-ef753de5e7e0","Type":"ContainerStarted","Data":"5674bfc85dbfdf5463ab6133b41ad1b9cd018f7dab5ac24740f944853c960b6c"} Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.809978 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5fbf6becd860f7304bd32dff2545a8c5003ab2b12ecf27b79a023566c6cac0d5"} Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 
00:24:23.817976 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" event={"ID":"225816b5-e7fe-4d29-84ad-37187e904104","Type":"ContainerStarted","Data":"bc566b46275beabd86a34c464ec115a7c46ac980355af9923c0ce3d7a7b30570"} Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.826398 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"221aa22e-24d2-40bf-8f35-eff4ded929df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a2bd1740ec33c208460149e12831eb8f9c548b51efbc6815ae993666a27407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82466c34a485d12f48aec22116d1906d7b482ad52bd75bb7b732ae6a05be3117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d644f6e95a9b7d33090c3b9754daea0bc31480d1342727a4a9e628064efcb55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907
df8de5c956003f22af0843242b5b921a87499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.838606 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.847582 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.859038 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.869059 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5674bfc85dbfdf5463ab6133b41ad1b9cd018f7dab5ac24740f944853c960b6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c6827196a2d3fb479d55cb14f43a5603cce7
db3f42c61a06561fe96dd55d90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.879310 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4619c924ae823c9a384924e501c93d8af914f90cfab080d9a9898415897907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bcf70529050e2d5a4eb77278af6ddc216afe7
24345c57887569e664d73b74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.887292 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.902571 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.913366 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f0b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d60f0b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e7c0fa45cdeafdec7b7d4f26dbd4dab51722186ca170d7661d841fa62633ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e7c0fa45cdeafdec7b7d4f26dbd4dab51722186ca170d7661d841fa62633ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.925326 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5
81c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.934014 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.942335 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.949792 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.960057 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.968801 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.978710 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.990567 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:23 crc kubenswrapper[4755]: I0317 00:24:23.999645 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5674bfc85dbfdf5463ab6133b41ad1b9cd018f7dab5ac24740f944853c960b6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c6827196a2d3fb479d55cb14f43a5603cce7
db3f42c61a06561fe96dd55d90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.010158 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4619c924ae823c9a384924e501c93d8af914f90cfab080d9a9898415897907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bcf70529050e2d5a4eb77278af6ddc216afe7
24345c57887569e664d73b74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.018688 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.032684 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.044004 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.058895 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbf6becd860f7304bd32dff2545a8c5003ab2b12ecf27b79a023566c6cac0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.071751 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.080100 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.094567 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.112898 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc566b46275beabd86a34c464ec115a7c46ac980355af9923c0ce3d7a7b30570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f0
b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d60f0b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:20Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e7c0fa45cdeafdec7b7d4f26dbd4dab51722186ca170d7661d841fa62633ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e7c0fa45cdeafdec7b7d4f26dbd4dab51722186ca170d7661d841fa62633ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.129166 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.139583 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.149318 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"221aa22e-24d2-40bf-8f35-eff4ded929df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a2bd1740ec33c208460149e12831eb8f9c548b51efbc6815ae993666a27407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82466c34a485d12f48aec22116d1906d7b482ad52bd75bb7b732ae6a05be3117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d644f6e95a9b7d33090c3b9754daea0bc31480d1342727a4a9e628064efcb55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.165882 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.176603 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.247666 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.247733 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.248051 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.248054 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:24 crc kubenswrapper[4755]: E0317 00:24:24.248169 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:24 crc kubenswrapper[4755]: E0317 00:24:24.248280 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:24 crc kubenswrapper[4755]: E0317 00:24:24.248423 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:24 crc kubenswrapper[4755]: E0317 00:24:24.248578 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.648138 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.648507 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.648525 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.648550 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.648567 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:24Z","lastTransitionTime":"2026-03-17T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:24 crc kubenswrapper[4755]: E0317 00:24:24.666182 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.671857 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.671921 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.671938 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.671965 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.671981 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:24Z","lastTransitionTime":"2026-03-17T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:24 crc kubenswrapper[4755]: E0317 00:24:24.693146 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.698239 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.698306 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.698330 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.698358 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.698382 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:24Z","lastTransitionTime":"2026-03-17T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:24 crc kubenswrapper[4755]: E0317 00:24:24.714525 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.719037 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.719134 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.719153 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.719177 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.719194 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:24Z","lastTransitionTime":"2026-03-17T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:24 crc kubenswrapper[4755]: E0317 00:24:24.730096 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.733416 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.733466 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.733477 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.733495 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.733508 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:24Z","lastTransitionTime":"2026-03-17T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 00:24:24 crc kubenswrapper[4755]: E0317 00:24:24.742896 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1691cfa1-2188-4028-9d19-13bfee907928\\\",\\\"systemUUID\\\":\\\"5f993bf0-a659-4d33-851e-45b2886560a8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: E0317 00:24:24.743044 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.823640 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hgfhx" event={"ID":"becc22f3-961c-4ce7-b97f-6d40e28c9373","Type":"ContainerStarted","Data":"cbbab504247269eea977bcd801dcd40c64b3f83c2ab4a70ec7e836b34c7e7518"} Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.824995 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j6qtr" event={"ID":"de2167ca-ad7e-47ce-bf95-cebc396df145","Type":"ContainerStarted","Data":"ed1b171a612cb7fd08acee7f0e0e954a81b0f06f559944540278904d38764193"} Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.843029 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5
81c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.859697 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbf6becd860f7304bd32dff2545a8c5003ab2b12ecf27b79a023566c6cac0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa
38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.871266 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.882573 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.893617 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.907304 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc566b46275beabd86a34c464ec115a7c46ac980355af9923c0ce3d7a7b30570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f0
b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d60f0b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:20Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e7c0fa45cdeafdec7b7d4f26dbd4dab51722186ca170d7661d841fa62633ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e7c0fa45cdeafdec7b7d4f26dbd4dab51722186ca170d7661d841fa62633ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.922223 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.934246 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.945117 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"221aa22e-24d2-40bf-8f35-eff4ded929df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a2bd1740ec33c208460149e12831eb8f9c548b51efbc6815ae993666a27407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82466c34a485d12f48aec22116d1906d7b482ad52bd75bb7b732ae6a05be3117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d644f6e95a9b7d33090c3b9754daea0bc31480d1342727a4a9e628064efcb55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.958280 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.967849 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbab504247269eea977bcd801dcd40c64b3f83c2ab4a70ec7e836b34c7e7518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.979165 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:24 crc kubenswrapper[4755]: I0317 00:24:24.991802 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5674bfc85dbfdf5463ab6133b41ad1b9cd018f7dab5ac24740f944853c960b6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c6827196a2d3fb479d55cb14f43a5603cce7
db3f42c61a06561fe96dd55d90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.000614 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4619c924ae823c9a384924e501c93d8af914f90cfab080d9a9898415897907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bcf70529050e2d5a4eb77278af6ddc216afe7
24345c57887569e664d73b74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.013007 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.033413 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.045289 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.059511 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.070992 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"221aa22e-24d2-40bf-8f35-eff4ded929df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a2bd1740ec33c208460149e12831eb8f9c548b51efbc6815ae993666a27407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82466c34a485d12f48aec22116d1906d7b482ad52bd75bb7b732ae6a05be3117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d644f6e95a9b7d33090c3b9754daea0bc31480d1342727a4a9e628064efcb55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.080679 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.091097 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbab504247269eea977bcd801dcd40c64b3f83c2ab4a70ec7e836b34c7e7518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.104223 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.117673 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5674bfc85dbfdf5463ab6133b41ad1b9cd018f7dab5ac24740f944853c960b6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c6827196a2d3fb479d55cb14f43a5603cce7
db3f42c61a06561fe96dd55d90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.126820 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4619c924ae823c9a384924e501c93d8af914f90cfab080d9a9898415897907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bcf70529050e2d5a4eb77278af6ddc216afe7
24345c57887569e664d73b74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.136456 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.154726 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.166827 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.181496 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbf6becd860f7304bd32dff2545a8c5003ab2b12ecf27b79a023566c6cac0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.190930 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.200885 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.209253 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed1b171a612cb7fd08acee7f0e0e954a81b0f06f559944540278904d38764193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.217554 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc566b46275beabd86a34c464ec115a7c46ac980355af9923c0ce3d7a7b30570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f0
b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d60f0b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:20Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e7c0fa45cdeafdec7b7d4f26dbd4dab51722186ca170d7661d841fa62633ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e7c0fa45cdeafdec7b7d4f26dbd4dab51722186ca170d7661d841fa62633ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.829857 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7cncb" event={"ID":"def35a55-2212-4d8e-8040-69fdcc95e34c","Type":"ContainerStarted","Data":"50510abdc6861bdba37eb4b96d5784c3c0171c6e92ae4ba2843973506c930a94"} Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.831564 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0877cf839564dabd8dcf5a7889f008922f8ac3da69f3875a931dff387527af31"} Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.831632 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5c9aa9ece4895b8b587828ddd1dd32c944cbc2448d521100abbeccb2ac7a39c3"} Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.841824 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"221aa22e-24d2-40bf-8f35-eff4ded929df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a2bd1740ec33c208460149e12831eb8f9c548b51efbc6815ae993666a27407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82466c34a485d12f48aec22116d1906d7b482ad52bd75bb7b732ae6a05be3117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d644f6e95a9b7d33090c3b9754daea0bc31480d1342727a4a9e628064efcb55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e191713d463d1d1e962e5981907df8de5c956003f22af0843242b5b921a87499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.850971 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.862357 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hgfhx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"becc22f3-961c-4ce7-b97f-6d40e28c9373\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbbab504247269eea977bcd801dcd40c64b3f83c2ab4a70ec7e836b34c7e7518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hgfhx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.872210 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c32c8c1b-db30-4059-97d0-ef753de5e7e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5674bfc85dbfdf5463ab6133b41ad1b9cd018f7dab5ac24740f944853c960b6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c
08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c6827196a2d3fb479d55cb14f43a5603cce7db3f42c61a06561fe96dd55d90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bzw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mpdkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.884379 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2de863ac-0be1-45c8-9e03-56aa0fe9a23d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db4619c924ae823c9a384924e501c93d8af914f90cfab080d9a9898415897907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\
\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bcf70529050e2d5a4eb77278af6ddc216afe724345c57887569e664d73b74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48586\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bhh2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.893284 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4v74b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7291e3d-2994-409e-972a-59394140b3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b2pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4v74b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.918225 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44d329be-573d-4143-97fb-d07ed343c898\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6cxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.934510 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.948606 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94eb6cf8-35a8-49fc-acc6-92cab54f2710\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:22:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T00:23:56Z\\\",\\\"message\\\":\\\"le observer\\\\nW0317 00:23:55.910009 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0317 00:23:55.910238 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 00:23:55.911251 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3726039813/tls.crt::/tmp/serving-cert-3726039813/tls.key\\\\\\\"\\\\nI0317 00:23:56.361176 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0317 00:23:56.364650 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0317 00:23:56.364688 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0317 00:23:56.364724 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0317 00:23:56.364734 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0317 00:23:56.371482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0317 00:23:56.371529 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0317 00:23:56.371526 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0317 00:23:56.371541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0317 00:23:56.371552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0317 00:23:56.371560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0317 00:23:56.371567 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0317 00:23:56.371574 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0317 00:23:56.372957 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T00:23:55Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:22:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5
81c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:22:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:22:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:22:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.960824 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbf6becd860f7304bd32dff2545a8c5003ab2b12ecf27b79a023566c6cac0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa
38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.972625 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.984182 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7cncb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"def35a55-2212-4d8e-8040-69fdcc95e34c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50510abdc6861bdba37eb4b96d5784c3c0171c6e92ae4ba2843973506c930a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9w46z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7cncb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:25 crc kubenswrapper[4755]: I0317 00:24:25.999603 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-j6qtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de2167ca-ad7e-47ce-bf95-cebc396df145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed1b171a612cb7fd08acee7f0e0e954a81b0f06f559944540278904d38764193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T00:24:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgfzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-j6qtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:26 crc kubenswrapper[4755]: I0317 00:24:26.018603 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfp88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"225816b5-e7fe-4d29-84ad-37187e904104\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T00:24:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc566b46275beabd86a34c464ec115a7c46ac980355af9923c0ce3d7a7b30570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\
"startedAt\\\":\\\"2026-03-17T00:24:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41abdfa7d451cc71f0b48dca6ff97ed10bc14d311939fd08055306c691573f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6
87fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19837fab8541d21e5243fc2a376f410be555b89a4435e54ea7aac10d88466a99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ef8317d203bddc2f4e5f240907325ed3fee5a094f85079ca90d1ce536f7c232\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60f0b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d60f0b1fb9db671875c8e5a7df4c6ef258b11a51dc5412dcf98c1867bf3bd460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f3604d4ea9aa584f5950497afff02fa5c99c2ab00685ac1d3538c766351578b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e7c0fa45cdeafdec7b7d4f26dbd4dab51722186ca170d7661d841fa62633ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e7c0fa45cdeafdec7b7d4f26dbd4dab51722186ca170d7661d841fa62633ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T00:24:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T00:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cn
i-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-76mdp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T00:23:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfp88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:26 crc kubenswrapper[4755]: I0317 00:24:26.034036 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T00:23:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 00:24:26 crc kubenswrapper[4755]: I0317 00:24:26.079684 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=17.07966611 podStartE2EDuration="17.07966611s" podCreationTimestamp="2026-03-17 00:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:26.079141922 +0000 UTC m=+140.838594205" watchObservedRunningTime="2026-03-17 00:24:26.07966611 +0000 UTC m=+140.839118393" Mar 17 00:24:26 crc kubenswrapper[4755]: I0317 00:24:26.127215 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hgfhx" podStartSLOduration=81.127195692 podStartE2EDuration="1m21.127195692s" podCreationTimestamp="2026-03-17 00:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:26.106870253 +0000 UTC m=+140.866322536" watchObservedRunningTime="2026-03-17 00:24:26.127195692 +0000 UTC m=+140.886647985" Mar 17 00:24:26 crc kubenswrapper[4755]: I0317 00:24:26.146043 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mpdkw" podStartSLOduration=80.146026438 podStartE2EDuration="1m20.146026438s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:26.145922324 +0000 UTC m=+140.905374607" watchObservedRunningTime="2026-03-17 00:24:26.146026438 +0000 UTC m=+140.905478741" Mar 17 00:24:26 crc kubenswrapper[4755]: I0317 00:24:26.176109 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podStartSLOduration=80.17608487 podStartE2EDuration="1m20.17608487s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:26.162733741 +0000 UTC m=+140.922186064" watchObservedRunningTime="2026-03-17 00:24:26.17608487 +0000 UTC m=+140.935537183" Mar 17 00:24:26 crc kubenswrapper[4755]: I0317 00:24:26.248176 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:26 crc kubenswrapper[4755]: I0317 00:24:26.248189 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:26 crc kubenswrapper[4755]: I0317 00:24:26.248283 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:26 crc kubenswrapper[4755]: I0317 00:24:26.248296 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:26 crc kubenswrapper[4755]: E0317 00:24:26.249276 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:26 crc kubenswrapper[4755]: E0317 00:24:26.249348 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:26 crc kubenswrapper[4755]: E0317 00:24:26.249398 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:26 crc kubenswrapper[4755]: E0317 00:24:26.249471 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:26 crc kubenswrapper[4755]: I0317 00:24:26.251191 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nfp88" podStartSLOduration=80.251164117 podStartE2EDuration="1m20.251164117s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:26.229011467 +0000 UTC m=+140.988463750" watchObservedRunningTime="2026-03-17 00:24:26.251164117 +0000 UTC m=+141.010616490" Mar 17 00:24:26 crc kubenswrapper[4755]: I0317 00:24:26.319023 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7cncb" podStartSLOduration=81.318992936 podStartE2EDuration="1m21.318992936s" podCreationTimestamp="2026-03-17 00:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:26.318858272 +0000 UTC m=+141.078310595" watchObservedRunningTime="2026-03-17 00:24:26.318992936 +0000 UTC m=+141.078445259" Mar 17 00:24:26 crc kubenswrapper[4755]: I0317 00:24:26.337693 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-j6qtr" podStartSLOduration=80.337670727 podStartE2EDuration="1m20.337670727s" 
podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:26.337332686 +0000 UTC m=+141.096784969" watchObservedRunningTime="2026-03-17 00:24:26.337670727 +0000 UTC m=+141.097123040" Mar 17 00:24:26 crc kubenswrapper[4755]: E0317 00:24:26.348946 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 00:24:27 crc kubenswrapper[4755]: I0317 00:24:27.841717 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"65d6593ee882ac58349e0037606dd1fe2ae2472621df081718b63e03fee5403f"} Mar 17 00:24:27 crc kubenswrapper[4755]: I0317 00:24:27.843340 4755 generic.go:334] "Generic (PLEG): container finished" podID="44d329be-573d-4143-97fb-d07ed343c898" containerID="73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e" exitCode=0 Mar 17 00:24:27 crc kubenswrapper[4755]: I0317 00:24:27.843372 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerDied","Data":"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e"} Mar 17 00:24:28 crc kubenswrapper[4755]: I0317 00:24:28.248701 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:28 crc kubenswrapper[4755]: I0317 00:24:28.248763 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:28 crc kubenswrapper[4755]: E0317 00:24:28.249009 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:28 crc kubenswrapper[4755]: I0317 00:24:28.249059 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:28 crc kubenswrapper[4755]: I0317 00:24:28.249134 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:28 crc kubenswrapper[4755]: E0317 00:24:28.249232 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:28 crc kubenswrapper[4755]: E0317 00:24:28.249165 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:28 crc kubenswrapper[4755]: E0317 00:24:28.249374 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:28 crc kubenswrapper[4755]: I0317 00:24:28.270829 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 17 00:24:28 crc kubenswrapper[4755]: I0317 00:24:28.852641 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerStarted","Data":"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03"} Mar 17 00:24:28 crc kubenswrapper[4755]: I0317 00:24:28.852956 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerStarted","Data":"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd"} Mar 17 00:24:28 crc kubenswrapper[4755]: I0317 00:24:28.852976 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerStarted","Data":"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f"} Mar 17 00:24:28 crc kubenswrapper[4755]: I0317 00:24:28.852995 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" 
event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerStarted","Data":"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c"} Mar 17 00:24:28 crc kubenswrapper[4755]: I0317 00:24:28.853012 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerStarted","Data":"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4"} Mar 17 00:24:28 crc kubenswrapper[4755]: I0317 00:24:28.853031 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerStarted","Data":"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750"} Mar 17 00:24:30 crc kubenswrapper[4755]: I0317 00:24:30.088214 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:30 crc kubenswrapper[4755]: I0317 00:24:30.088485 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.088532 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-17 00:25:02.088496645 +0000 UTC m=+176.847948938 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:30 crc kubenswrapper[4755]: I0317 00:24:30.088682 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.088718 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.088757 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.088782 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.088832 4755 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.088867 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 00:25:02.088838737 +0000 UTC m=+176.848291070 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:24:30 crc kubenswrapper[4755]: I0317 00:24:30.088746 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.088899 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 00:25:02.088884789 +0000 UTC m=+176.848337172 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.089037 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.089153 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 00:25:02.089128988 +0000 UTC m=+176.848581351 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 00:24:30 crc kubenswrapper[4755]: I0317 00:24:30.189414 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:30 crc kubenswrapper[4755]: I0317 00:24:30.189484 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs\") pod 
\"network-metrics-daemon-4v74b\" (UID: \"f7291e3d-2994-409e-972a-59394140b3ad\") " pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.189662 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.189689 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.189699 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.189801 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.189809 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs podName:f7291e3d-2994-409e-972a-59394140b3ad nodeName:}" failed. No retries permitted until 2026-03-17 00:25:02.189777622 +0000 UTC m=+176.949229935 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs") pod "network-metrics-daemon-4v74b" (UID: "f7291e3d-2994-409e-972a-59394140b3ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.189865 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 00:25:02.189844486 +0000 UTC m=+176.949296809 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 00:24:30 crc kubenswrapper[4755]: I0317 00:24:30.248008 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:30 crc kubenswrapper[4755]: I0317 00:24:30.248104 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:30 crc kubenswrapper[4755]: I0317 00:24:30.248113 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:30 crc kubenswrapper[4755]: I0317 00:24:30.248041 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.248197 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.248365 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.248509 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:30 crc kubenswrapper[4755]: E0317 00:24:30.248651 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:30 crc kubenswrapper[4755]: I0317 00:24:30.863811 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerStarted","Data":"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1"} Mar 17 00:24:31 crc kubenswrapper[4755]: I0317 00:24:31.248732 4755 scope.go:117] "RemoveContainer" containerID="f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1" Mar 17 00:24:31 crc kubenswrapper[4755]: E0317 00:24:31.249015 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:24:31 crc kubenswrapper[4755]: E0317 00:24:31.350021 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 00:24:32 crc kubenswrapper[4755]: I0317 00:24:32.248105 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:32 crc kubenswrapper[4755]: I0317 00:24:32.248220 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:32 crc kubenswrapper[4755]: E0317 00:24:32.248321 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:32 crc kubenswrapper[4755]: E0317 00:24:32.248417 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:32 crc kubenswrapper[4755]: I0317 00:24:32.248650 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:32 crc kubenswrapper[4755]: E0317 00:24:32.248751 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:32 crc kubenswrapper[4755]: I0317 00:24:32.249036 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:32 crc kubenswrapper[4755]: E0317 00:24:32.249232 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:33 crc kubenswrapper[4755]: I0317 00:24:33.879664 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerStarted","Data":"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56"} Mar 17 00:24:33 crc kubenswrapper[4755]: I0317 00:24:33.880103 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:24:33 crc kubenswrapper[4755]: I0317 00:24:33.880153 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:24:33 crc kubenswrapper[4755]: I0317 00:24:33.922935 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" podStartSLOduration=87.922874773 podStartE2EDuration="1m27.922874773s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:33.920998178 +0000 UTC m=+148.680450501" watchObservedRunningTime="2026-03-17 00:24:33.922874773 +0000 UTC m=+148.682327116" Mar 17 00:24:33 crc kubenswrapper[4755]: I0317 00:24:33.936283 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 
00:24:33 crc kubenswrapper[4755]: I0317 00:24:33.947552 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=5.947531699 podStartE2EDuration="5.947531699s" podCreationTimestamp="2026-03-17 00:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:33.94698897 +0000 UTC m=+148.706441293" watchObservedRunningTime="2026-03-17 00:24:33.947531699 +0000 UTC m=+148.706984022" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.248245 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.248245 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.248317 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:34 crc kubenswrapper[4755]: E0317 00:24:34.248424 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.248505 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:34 crc kubenswrapper[4755]: E0317 00:24:34.248509 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:34 crc kubenswrapper[4755]: E0317 00:24:34.248607 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:34 crc kubenswrapper[4755]: E0317 00:24:34.248701 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.755888 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.755949 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.755968 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.755993 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.756010 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T00:24:34Z","lastTransitionTime":"2026-03-17T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.810568 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p"] Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.810892 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.812870 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.813918 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.813931 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.813985 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.878577 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0be6ac4d-072c-49f8-9f45-d24706f0e08f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9bw2p\" (UID: \"0be6ac4d-072c-49f8-9f45-d24706f0e08f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.878729 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0be6ac4d-072c-49f8-9f45-d24706f0e08f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9bw2p\" (UID: \"0be6ac4d-072c-49f8-9f45-d24706f0e08f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.878781 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/0be6ac4d-072c-49f8-9f45-d24706f0e08f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9bw2p\" (UID: \"0be6ac4d-072c-49f8-9f45-d24706f0e08f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.878830 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0be6ac4d-072c-49f8-9f45-d24706f0e08f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9bw2p\" (UID: \"0be6ac4d-072c-49f8-9f45-d24706f0e08f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.878890 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0be6ac4d-072c-49f8-9f45-d24706f0e08f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9bw2p\" (UID: \"0be6ac4d-072c-49f8-9f45-d24706f0e08f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.883417 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.910362 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.980325 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0be6ac4d-072c-49f8-9f45-d24706f0e08f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9bw2p\" (UID: \"0be6ac4d-072c-49f8-9f45-d24706f0e08f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" Mar 17 00:24:34 crc kubenswrapper[4755]: 
I0317 00:24:34.980404 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0be6ac4d-072c-49f8-9f45-d24706f0e08f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9bw2p\" (UID: \"0be6ac4d-072c-49f8-9f45-d24706f0e08f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.980496 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0be6ac4d-072c-49f8-9f45-d24706f0e08f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9bw2p\" (UID: \"0be6ac4d-072c-49f8-9f45-d24706f0e08f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.980566 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0be6ac4d-072c-49f8-9f45-d24706f0e08f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9bw2p\" (UID: \"0be6ac4d-072c-49f8-9f45-d24706f0e08f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.980752 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0be6ac4d-072c-49f8-9f45-d24706f0e08f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9bw2p\" (UID: \"0be6ac4d-072c-49f8-9f45-d24706f0e08f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.980559 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0be6ac4d-072c-49f8-9f45-d24706f0e08f-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-9bw2p\" (UID: \"0be6ac4d-072c-49f8-9f45-d24706f0e08f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.980503 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0be6ac4d-072c-49f8-9f45-d24706f0e08f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9bw2p\" (UID: \"0be6ac4d-072c-49f8-9f45-d24706f0e08f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.982227 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0be6ac4d-072c-49f8-9f45-d24706f0e08f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9bw2p\" (UID: \"0be6ac4d-072c-49f8-9f45-d24706f0e08f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" Mar 17 00:24:34 crc kubenswrapper[4755]: I0317 00:24:34.991629 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0be6ac4d-072c-49f8-9f45-d24706f0e08f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9bw2p\" (UID: \"0be6ac4d-072c-49f8-9f45-d24706f0e08f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" Mar 17 00:24:35 crc kubenswrapper[4755]: I0317 00:24:35.000406 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0be6ac4d-072c-49f8-9f45-d24706f0e08f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9bw2p\" (UID: \"0be6ac4d-072c-49f8-9f45-d24706f0e08f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" Mar 17 00:24:35 crc kubenswrapper[4755]: I0317 00:24:35.126839 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" Mar 17 00:24:35 crc kubenswrapper[4755]: W0317 00:24:35.148042 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0be6ac4d_072c_49f8_9f45_d24706f0e08f.slice/crio-77288d1c41435a353fbf26c5210f41613a315d31b97a78610295c68811e9f71b WatchSource:0}: Error finding container 77288d1c41435a353fbf26c5210f41613a315d31b97a78610295c68811e9f71b: Status 404 returned error can't find the container with id 77288d1c41435a353fbf26c5210f41613a315d31b97a78610295c68811e9f71b Mar 17 00:24:35 crc kubenswrapper[4755]: I0317 00:24:35.260833 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 17 00:24:35 crc kubenswrapper[4755]: I0317 00:24:35.269696 4755 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 17 00:24:35 crc kubenswrapper[4755]: I0317 00:24:35.887614 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" event={"ID":"0be6ac4d-072c-49f8-9f45-d24706f0e08f","Type":"ContainerStarted","Data":"af871ca25ee84aedd02bddb3818afbcaf0472c08b00473fcc6e711227cc9622b"} Mar 17 00:24:35 crc kubenswrapper[4755]: I0317 00:24:35.887676 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" event={"ID":"0be6ac4d-072c-49f8-9f45-d24706f0e08f","Type":"ContainerStarted","Data":"77288d1c41435a353fbf26c5210f41613a315d31b97a78610295c68811e9f71b"} Mar 17 00:24:36 crc kubenswrapper[4755]: I0317 00:24:36.247315 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:36 crc kubenswrapper[4755]: I0317 00:24:36.247496 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:36 crc kubenswrapper[4755]: I0317 00:24:36.247492 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:36 crc kubenswrapper[4755]: I0317 00:24:36.249790 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:36 crc kubenswrapper[4755]: E0317 00:24:36.249783 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:36 crc kubenswrapper[4755]: E0317 00:24:36.249900 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:36 crc kubenswrapper[4755]: E0317 00:24:36.249961 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:36 crc kubenswrapper[4755]: E0317 00:24:36.250028 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:36 crc kubenswrapper[4755]: E0317 00:24:36.350830 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 00:24:36 crc kubenswrapper[4755]: I0317 00:24:36.477254 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9bw2p" podStartSLOduration=90.477227775 podStartE2EDuration="1m30.477227775s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:35.921284009 +0000 UTC m=+150.680736312" watchObservedRunningTime="2026-03-17 00:24:36.477227775 +0000 UTC m=+151.236680078" Mar 17 00:24:36 crc kubenswrapper[4755]: I0317 00:24:36.477465 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4v74b"] Mar 17 00:24:36 crc kubenswrapper[4755]: I0317 00:24:36.890617 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:36 crc kubenswrapper[4755]: E0317 00:24:36.891222 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:38 crc kubenswrapper[4755]: I0317 00:24:38.248146 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:38 crc kubenswrapper[4755]: E0317 00:24:38.248575 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:38 crc kubenswrapper[4755]: I0317 00:24:38.248270 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:38 crc kubenswrapper[4755]: I0317 00:24:38.248148 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:38 crc kubenswrapper[4755]: E0317 00:24:38.248664 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:38 crc kubenswrapper[4755]: E0317 00:24:38.248794 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:39 crc kubenswrapper[4755]: I0317 00:24:39.247205 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:39 crc kubenswrapper[4755]: E0317 00:24:39.247419 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:40 crc kubenswrapper[4755]: I0317 00:24:40.248187 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:40 crc kubenswrapper[4755]: I0317 00:24:40.248332 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:40 crc kubenswrapper[4755]: E0317 00:24:40.248484 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 00:24:40 crc kubenswrapper[4755]: I0317 00:24:40.248529 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:40 crc kubenswrapper[4755]: E0317 00:24:40.248722 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 00:24:40 crc kubenswrapper[4755]: E0317 00:24:40.248845 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 00:24:41 crc kubenswrapper[4755]: I0317 00:24:41.247859 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:41 crc kubenswrapper[4755]: E0317 00:24:41.248037 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4v74b" podUID="f7291e3d-2994-409e-972a-59394140b3ad" Mar 17 00:24:42 crc kubenswrapper[4755]: I0317 00:24:42.247908 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:24:42 crc kubenswrapper[4755]: I0317 00:24:42.248002 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:24:42 crc kubenswrapper[4755]: I0317 00:24:42.247918 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:24:42 crc kubenswrapper[4755]: I0317 00:24:42.249850 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 17 00:24:42 crc kubenswrapper[4755]: I0317 00:24:42.250373 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 17 00:24:42 crc kubenswrapper[4755]: I0317 00:24:42.251261 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 17 00:24:42 crc kubenswrapper[4755]: I0317 00:24:42.252096 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 17 00:24:43 crc kubenswrapper[4755]: I0317 00:24:43.247592 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:24:43 crc kubenswrapper[4755]: I0317 00:24:43.248657 4755 scope.go:117] "RemoveContainer" containerID="f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1" Mar 17 00:24:43 crc kubenswrapper[4755]: E0317 00:24:43.248980 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:24:43 crc kubenswrapper[4755]: I0317 00:24:43.250525 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 17 00:24:43 crc kubenswrapper[4755]: I0317 00:24:43.250532 4755 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-multus"/"metrics-daemon-secret" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.156613 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.251874 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lc64n"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.253062 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.267235 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8rqm6"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.267519 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.267600 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.267659 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.267690 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.268182 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.268373 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.268654 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 17 00:24:45 crc kubenswrapper[4755]: 
I0317 00:24:45.268767 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.270205 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.270581 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.273036 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.272862 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6bvf8"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.275371 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.275995 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bnrkl"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.276556 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.277378 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.277709 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.277889 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.279711 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.280808 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6bvf8"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.281202 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.281305 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bnrkl" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.281921 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.283996 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.284282 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.298220 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.298648 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.298717 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.298845 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.298882 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.298959 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.299071 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.299095 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 
00:24:45.299191 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.299269 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.299308 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.299469 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.299554 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.299583 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.299664 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.299708 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.299742 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.299814 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.301823 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8rqm6"] 
Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.302952 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.303917 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.305488 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.310047 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.310495 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.310671 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.310888 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.311184 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.311222 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.311394 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 
00:24:45.311931 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d71eafe9-d84b-4a1c-a976-839e5d80bad4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.311982 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-etcd-client\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312004 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d71eafe9-d84b-4a1c-a976-839e5d80bad4-encryption-config\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312030 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d71eafe9-d84b-4a1c-a976-839e5d80bad4-audit-policies\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312049 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d71eafe9-d84b-4a1c-a976-839e5d80bad4-audit-dir\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312070 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-config\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312149 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d71eafe9-d84b-4a1c-a976-839e5d80bad4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312175 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-client-ca\") pod \"controller-manager-879f6c89f-6bvf8\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312236 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbqtp\" (UniqueName: \"kubernetes.io/projected/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-kube-api-access-sbqtp\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312261 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-bnrkl\" (UID: \"cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bnrkl" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312283 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e12abed-0865-4f85-b563-ff72e5a05722-config\") pod \"machine-api-operator-5694c8668f-8rqm6\" (UID: \"9e12abed-0865-4f85-b563-ff72e5a05722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312307 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrrzk\" (UniqueName: \"kubernetes.io/projected/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-kube-api-access-wrrzk\") pod \"controller-manager-879f6c89f-6bvf8\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312330 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e12abed-0865-4f85-b563-ff72e5a05722-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8rqm6\" (UID: \"9e12abed-0865-4f85-b563-ff72e5a05722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312352 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312374 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b78bcc96-a38d-4734-8b94-25cddc46b289-client-ca\") pod \"route-controller-manager-6576b87f9c-tjxpj\" (UID: \"b78bcc96-a38d-4734-8b94-25cddc46b289\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312404 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r78k2\" (UniqueName: \"kubernetes.io/projected/9e12abed-0865-4f85-b563-ff72e5a05722-kube-api-access-r78k2\") pod \"machine-api-operator-5694c8668f-8rqm6\" (UID: \"9e12abed-0865-4f85-b563-ff72e5a05722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312464 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b78bcc96-a38d-4734-8b94-25cddc46b289-serving-cert\") pod \"route-controller-manager-6576b87f9c-tjxpj\" (UID: \"b78bcc96-a38d-4734-8b94-25cddc46b289\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312486 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bnrkl\" (UID: \"cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bnrkl" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312508 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwtxs\" (UniqueName: 
\"kubernetes.io/projected/d71eafe9-d84b-4a1c-a976-839e5d80bad4-kube-api-access-gwtxs\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312528 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-config\") pod \"controller-manager-879f6c89f-6bvf8\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312551 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d71eafe9-d84b-4a1c-a976-839e5d80bad4-etcd-client\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312574 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmmg6\" (UniqueName: \"kubernetes.io/projected/b78bcc96-a38d-4734-8b94-25cddc46b289-kube-api-access-pmmg6\") pod \"route-controller-manager-6576b87f9c-tjxpj\" (UID: \"b78bcc96-a38d-4734-8b94-25cddc46b289\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312595 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-image-import-ca\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: 
I0317 00:24:45.312614 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-encryption-config\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312636 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9e12abed-0865-4f85-b563-ff72e5a05722-images\") pod \"machine-api-operator-5694c8668f-8rqm6\" (UID: \"9e12abed-0865-4f85-b563-ff72e5a05722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312700 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-node-pullsecrets\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312745 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-serving-cert\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312779 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-audit-dir\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " 
pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312837 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6bvf8\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312867 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-audit\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312893 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-etcd-serving-ca\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312926 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlr5s\" (UniqueName: \"kubernetes.io/projected/cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a-kube-api-access-tlr5s\") pod \"openshift-apiserver-operator-796bbdcf4f-bnrkl\" (UID: \"cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bnrkl" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.312967 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b78bcc96-a38d-4734-8b94-25cddc46b289-config\") pod \"route-controller-manager-6576b87f9c-tjxpj\" (UID: \"b78bcc96-a38d-4734-8b94-25cddc46b289\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.313010 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d71eafe9-d84b-4a1c-a976-839e5d80bad4-serving-cert\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.313048 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-serving-cert\") pod \"controller-manager-879f6c89f-6bvf8\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.327368 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.327236 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.328187 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.327693 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.330385 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.330981 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.331678 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.349967 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.352188 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7vn7g"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.352597 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-grcxd"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.352873 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.352979 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.353521 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pgc8x"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.353996 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pgc8x" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.354339 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.354983 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-8gjxd"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.355421 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvsbw"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.355782 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvsbw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.355892 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-8gjxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.363100 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.363638 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.363685 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.364181 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.364333 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.364402 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.364493 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.364570 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.364692 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.364800 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 
00:24:45.363647 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.364989 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.365039 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.365152 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.366278 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.365003 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.366506 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.366530 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.366719 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.367378 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.368128 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5wk8"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.368338 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.368570 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5wk8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.368771 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dl89g"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.368882 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.369189 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.369221 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dl89g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.369233 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.369398 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.369539 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.369551 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.369563 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.369579 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.370476 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.370897 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.371046 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.371122 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.371468 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.371540 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p4wgk"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.372243 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-p4wgk" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.373129 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.375673 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4b88z"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.376308 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.376727 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-2fcmx"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.377038 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.377316 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4b88z" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.377494 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.378275 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-28bml"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.378644 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.399494 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.418985 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.419253 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.419431 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.419657 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.419948 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.420579 4755 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"oauth-serving-cert" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.420674 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.421379 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.421530 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29561760-zz9vv"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.421819 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7v8ts"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.421891 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9e12abed-0865-4f85-b563-ff72e5a05722-images\") pod \"machine-api-operator-5694c8668f-8rqm6\" (UID: \"9e12abed-0865-4f85-b563-ff72e5a05722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.422203 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6nx7f"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.422922 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6nx7f" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.422968 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9e12abed-0865-4f85-b563-ff72e5a05722-images\") pod \"machine-api-operator-5694c8668f-8rqm6\" (UID: \"9e12abed-0865-4f85-b563-ff72e5a05722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.423004 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-node-pullsecrets\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.423132 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.423178 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-node-pullsecrets\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.423246 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6bvf8\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.423272 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29561760-zz9vv" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.423348 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-audit\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.423358 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ktkt5"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.423373 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-serving-cert\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.423379 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7v8ts" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.423464 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-audit-dir\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.424103 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-audit-dir\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.424174 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlr5s\" (UniqueName: \"kubernetes.io/projected/cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a-kube-api-access-tlr5s\") pod \"openshift-apiserver-operator-796bbdcf4f-bnrkl\" (UID: \"cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bnrkl" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.424234 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-etcd-serving-ca\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.424346 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ktkt5" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.424485 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425168 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.424687 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-audit\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425283 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6bvf8\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425128 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-etcd-serving-ca\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.424824 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.424358 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b78bcc96-a38d-4734-8b94-25cddc46b289-config\") pod \"route-controller-manager-6576b87f9c-tjxpj\" (UID: \"b78bcc96-a38d-4734-8b94-25cddc46b289\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425377 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b78bcc96-a38d-4734-8b94-25cddc46b289-config\") pod \"route-controller-manager-6576b87f9c-tjxpj\" (UID: \"b78bcc96-a38d-4734-8b94-25cddc46b289\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.424901 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425399 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d71eafe9-d84b-4a1c-a976-839e5d80bad4-serving-cert\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425424 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-serving-cert\") pod \"controller-manager-879f6c89f-6bvf8\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425459 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d71eafe9-d84b-4a1c-a976-839e5d80bad4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: 
\"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425473 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-etcd-client\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425490 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d71eafe9-d84b-4a1c-a976-839e5d80bad4-encryption-config\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.424964 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425508 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d71eafe9-d84b-4a1c-a976-839e5d80bad4-audit-policies\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425531 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d71eafe9-d84b-4a1c-a976-839e5d80bad4-audit-dir\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425550 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-config\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425568 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d71eafe9-d84b-4a1c-a976-839e5d80bad4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425582 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-client-ca\") pod \"controller-manager-879f6c89f-6bvf8\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425607 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbqtp\" (UniqueName: \"kubernetes.io/projected/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-kube-api-access-sbqtp\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425629 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bnrkl\" (UID: \"cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bnrkl" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425648 
4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e12abed-0865-4f85-b563-ff72e5a05722-config\") pod \"machine-api-operator-5694c8668f-8rqm6\" (UID: \"9e12abed-0865-4f85-b563-ff72e5a05722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425669 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e12abed-0865-4f85-b563-ff72e5a05722-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8rqm6\" (UID: \"9e12abed-0865-4f85-b563-ff72e5a05722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425684 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrrzk\" (UniqueName: \"kubernetes.io/projected/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-kube-api-access-wrrzk\") pod \"controller-manager-879f6c89f-6bvf8\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425700 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425716 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b78bcc96-a38d-4734-8b94-25cddc46b289-client-ca\") pod \"route-controller-manager-6576b87f9c-tjxpj\" (UID: \"b78bcc96-a38d-4734-8b94-25cddc46b289\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425740 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r78k2\" (UniqueName: \"kubernetes.io/projected/9e12abed-0865-4f85-b563-ff72e5a05722-kube-api-access-r78k2\") pod \"machine-api-operator-5694c8668f-8rqm6\" (UID: \"9e12abed-0865-4f85-b563-ff72e5a05722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425755 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b78bcc96-a38d-4734-8b94-25cddc46b289-serving-cert\") pod \"route-controller-manager-6576b87f9c-tjxpj\" (UID: \"b78bcc96-a38d-4734-8b94-25cddc46b289\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425772 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bnrkl\" (UID: \"cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bnrkl" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425791 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwtxs\" (UniqueName: \"kubernetes.io/projected/d71eafe9-d84b-4a1c-a976-839e5d80bad4-kube-api-access-gwtxs\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425806 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-config\") pod \"controller-manager-879f6c89f-6bvf8\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425825 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d71eafe9-d84b-4a1c-a976-839e5d80bad4-etcd-client\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425842 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmmg6\" (UniqueName: \"kubernetes.io/projected/b78bcc96-a38d-4734-8b94-25cddc46b289-kube-api-access-pmmg6\") pod \"route-controller-manager-6576b87f9c-tjxpj\" (UID: \"b78bcc96-a38d-4734-8b94-25cddc46b289\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425857 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-image-import-ca\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425873 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-encryption-config\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.426056 4755 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d71eafe9-d84b-4a1c-a976-839e5d80bad4-audit-dir\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.426512 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-config\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425041 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.426624 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.426921 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d71eafe9-d84b-4a1c-a976-839e5d80bad4-audit-policies\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.427073 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d71eafe9-d84b-4a1c-a976-839e5d80bad4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.427158 4755 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.427662 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.427752 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-client-ca\") pod \"controller-manager-879f6c89f-6bvf8\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.428372 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bnrkl\" (UID: \"cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bnrkl" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.429041 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e12abed-0865-4f85-b563-ff72e5a05722-config\") pod \"machine-api-operator-5694c8668f-8rqm6\" (UID: \"9e12abed-0865-4f85-b563-ff72e5a05722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.429552 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-encryption-config\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.429596 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/d71eafe9-d84b-4a1c-a976-839e5d80bad4-serving-cert\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.429664 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425071 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425583 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.425640 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.430085 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-serving-cert\") pod \"controller-manager-879f6c89f-6bvf8\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.430144 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-7fxgg"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.430466 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.430575 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d71eafe9-d84b-4a1c-a976-839e5d80bad4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.430661 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hg2fb"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.431199 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tddtz"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.431670 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.431853 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-config\") pod \"controller-manager-879f6c89f-6bvf8\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.432135 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-7fxgg" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.432306 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.432329 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bnrkl\" (UID: \"cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bnrkl" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.432803 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-image-import-ca\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.433537 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b78bcc96-a38d-4734-8b94-25cddc46b289-client-ca\") pod \"route-controller-manager-6576b87f9c-tjxpj\" (UID: \"b78bcc96-a38d-4734-8b94-25cddc46b289\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.433722 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.434660 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gc6g2"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.434880 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e12abed-0865-4f85-b563-ff72e5a05722-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8rqm6\" (UID: \"9e12abed-0865-4f85-b563-ff72e5a05722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.435186 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.435650 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.435860 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gc6g2" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.436242 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.436729 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.437036 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.437500 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.437966 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561784-zznp2"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.438376 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561784-zznp2" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.439419 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.439639 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.440281 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.440896 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.441287 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-serving-cert\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.441983 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.442556 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.442578 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d71eafe9-d84b-4a1c-a976-839e5d80bad4-encryption-config\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.442781 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-etcd-client\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.442846 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/b78bcc96-a38d-4734-8b94-25cddc46b289-serving-cert\") pod \"route-controller-manager-6576b87f9c-tjxpj\" (UID: \"b78bcc96-a38d-4734-8b94-25cddc46b289\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.442898 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d71eafe9-d84b-4a1c-a976-839e5d80bad4-etcd-client\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.443149 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-g4dtp"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.443431 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.443651 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-g4dtp" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.446104 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.446395 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.447151 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-flgvl"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.447570 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-flgvl" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.447916 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.448083 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.449821 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bnrkl"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.452900 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.456128 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.462840 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lc64n"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.462876 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5wk8"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.462944 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.462978 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.464071 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pgc8x"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.465231 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gc6g2"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.470042 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8gjxd"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.473308 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvsbw"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.473351 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7vn7g"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.473592 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.475018 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.477183 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561784-zznp2"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.478576 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-28bml"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.481602 4755 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hg2fb"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.482498 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.484094 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.486003 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4b88z"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.486906 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.488548 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dl89g"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.489373 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ktkt5"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.490522 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.491588 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-w595z"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.492245 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-w595z" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.493045 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.493720 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6nx7f"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.494740 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p4wgk"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.495640 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-2fcmx"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.495885 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.496830 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.498454 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-574nd"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.498954 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.501220 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29561760-zz9vv"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.502303 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-grcxd"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.503259 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7v8ts"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.504187 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.505179 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tddtz"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.506067 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-g4dtp"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.506989 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-flgvl"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.508854 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w595z"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.509946 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.510775 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 
00:24:45.511694 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.512903 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.513983 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-d2qpt"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.515262 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-x466w"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.515409 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.515768 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x466w" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.516158 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-d2qpt"] Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.522919 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.526402 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/08e826b2-3275-49e6-b833-5494037aac5b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dl89g\" (UID: \"08e826b2-3275-49e6-b833-5494037aac5b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dl89g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 
00:24:45.526426 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.526481 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqndp\" (UniqueName: \"kubernetes.io/projected/64e28e17-1dd0-401e-9b26-7eefc1c54f5f-kube-api-access-nqndp\") pod \"downloads-7954f5f757-8gjxd\" (UID: \"64e28e17-1dd0-401e-9b26-7eefc1c54f5f\") " pod="openshift-console/downloads-7954f5f757-8gjxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.526506 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnjcd\" (UniqueName: \"kubernetes.io/projected/18513cd3-96dd-41ef-94ba-9f4a629015fb-kube-api-access-nnjcd\") pod \"machine-config-server-x466w\" (UID: \"18513cd3-96dd-41ef-94ba-9f4a629015fb\") " pod="openshift-machine-config-operator/machine-config-server-x466w" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.526563 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d755c0ff-4b40-451e-9fb4-4807c6297aaa-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qd8g6\" (UID: \"d755c0ff-4b40-451e-9fb4-4807c6297aaa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.526588 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/d755c0ff-4b40-451e-9fb4-4807c6297aaa-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qd8g6\" (UID: \"d755c0ff-4b40-451e-9fb4-4807c6297aaa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.526617 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j594d\" (UniqueName: \"kubernetes.io/projected/d755c0ff-4b40-451e-9fb4-4807c6297aaa-kube-api-access-j594d\") pod \"cluster-image-registry-operator-dc59b4c8b-qd8g6\" (UID: \"d755c0ff-4b40-451e-9fb4-4807c6297aaa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.526635 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2ec378f-6a5f-48cd-ad14-1445d110a829-config\") pod \"console-operator-58897d9998-pgc8x\" (UID: \"b2ec378f-6a5f-48cd-ad14-1445d110a829\") " pod="openshift-console-operator/console-operator-58897d9998-pgc8x" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.526660 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1c0bace3-bc39-44fa-9f39-d8f18c07675d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d4wtw\" (UID: \"1c0bace3-bc39-44fa-9f39-d8f18c07675d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.526680 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: 
\"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.526703 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a39a0363-2d49-4e67-ac77-a381064d06a0-metrics-tls\") pod \"ingress-operator-5b745b69d9-zm5fc\" (UID: \"a39a0363-2d49-4e67-ac77-a381064d06a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.526784 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/18513cd3-96dd-41ef-94ba-9f4a629015fb-certs\") pod \"machine-config-server-x466w\" (UID: \"18513cd3-96dd-41ef-94ba-9f4a629015fb\") " pod="openshift-machine-config-operator/machine-config-server-x466w" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.526876 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpxgq\" (UniqueName: \"kubernetes.io/projected/54a2312d-9a82-4642-aee1-83e701a54908-kube-api-access-dpxgq\") pod \"machine-approver-56656f9798-7p9vc\" (UID: \"54a2312d-9a82-4642-aee1-83e701a54908\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.526906 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69234b24-fdb7-40bf-828b-104d8d43891c-config\") pod \"authentication-operator-69f744f599-7vn7g\" (UID: \"69234b24-fdb7-40bf-828b-104d8d43891c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527053 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/44c47b93-07a7-414f-9548-ea1dd505c452-cert\") pod \"ingress-canary-g4dtp\" (UID: \"44c47b93-07a7-414f-9548-ea1dd505c452\") " pod="openshift-ingress-canary/ingress-canary-g4dtp" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527126 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-audit-policies\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527221 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527248 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6173ecdb-39f4-4772-9dbd-5fa6e2908971-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7v8ts\" (UID: \"6173ecdb-39f4-4772-9dbd-5fa6e2908971\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7v8ts" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527264 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m64nd\" (UniqueName: \"kubernetes.io/projected/65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8-kube-api-access-m64nd\") pod \"kube-storage-version-migrator-operator-b67b599dd-cx96g\" (UID: \"65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527288 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed-config\") pod \"kube-apiserver-operator-766d6c64bb-w5wk8\" (UID: \"bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5wk8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527305 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d755c0ff-4b40-451e-9fb4-4807c6297aaa-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qd8g6\" (UID: \"d755c0ff-4b40-451e-9fb4-4807c6297aaa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527319 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jcp6\" (UniqueName: \"kubernetes.io/projected/0771adb9-eee6-469f-b734-7d28e8a50d41-kube-api-access-4jcp6\") pod \"multus-admission-controller-857f4d67dd-flgvl\" (UID: \"0771adb9-eee6-469f-b734-7d28e8a50d41\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-flgvl" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527334 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8-default-certificate\") pod \"router-default-5444994796-7fxgg\" (UID: \"ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8\") " pod="openshift-ingress/router-default-5444994796-7fxgg" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527351 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/54a2312d-9a82-4642-aee1-83e701a54908-machine-approver-tls\") pod \"machine-approver-56656f9798-7p9vc\" (UID: \"54a2312d-9a82-4642-aee1-83e701a54908\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527372 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zrx8\" (UniqueName: \"kubernetes.io/projected/1f07cb15-0d7a-4412-a5e4-5489e61bb8ef-kube-api-access-2zrx8\") pod \"service-ca-operator-777779d784-z4wp4\" (UID: \"1f07cb15-0d7a-4412-a5e4-5489e61bb8ef\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527445 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-serving-cert\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527534 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/54a2312d-9a82-4642-aee1-83e701a54908-auth-proxy-config\") pod \"machine-approver-56656f9798-7p9vc\" (UID: \"54a2312d-9a82-4642-aee1-83e701a54908\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527553 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-login\") pod 
\"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527572 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls958\" (UniqueName: \"kubernetes.io/projected/689a5612-54fa-44b9-8b91-1b03d916a159-kube-api-access-ls958\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvsbw\" (UID: \"689a5612-54fa-44b9-8b91-1b03d916a159\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvsbw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527590 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cx96g\" (UID: \"65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527604 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvlsg\" (UniqueName: \"kubernetes.io/projected/d46784e1-420c-4d3b-aca7-65271a898c44-kube-api-access-jvlsg\") pod \"auto-csr-approver-29561784-zznp2\" (UID: \"d46784e1-420c-4d3b-aca7-65271a898c44\") " pod="openshift-infra/auto-csr-approver-29561784-zznp2" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527635 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527658 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6bb8\" (UniqueName: \"kubernetes.io/projected/659184e0-0620-4930-8faa-22e586cf403a-kube-api-access-c6bb8\") pod \"dns-operator-744455d44c-p4wgk\" (UID: \"659184e0-0620-4930-8faa-22e586cf403a\") " pod="openshift-dns-operator/dns-operator-744455d44c-p4wgk" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527673 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9jlr\" (UniqueName: \"kubernetes.io/projected/44c47b93-07a7-414f-9548-ea1dd505c452-kube-api-access-c9jlr\") pod \"ingress-canary-g4dtp\" (UID: \"44c47b93-07a7-414f-9548-ea1dd505c452\") " pod="openshift-ingress-canary/ingress-canary-g4dtp" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527689 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a39a0363-2d49-4e67-ac77-a381064d06a0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-zm5fc\" (UID: \"a39a0363-2d49-4e67-ac77-a381064d06a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527705 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/183616f4-5ea5-4c31-a465-edb5b837ca8f-ready\") pod \"cni-sysctl-allowlist-ds-574nd\" (UID: \"183616f4-5ea5-4c31-a465-edb5b837ca8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527730 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-etcd-client\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527745 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w67sr\" (UniqueName: \"kubernetes.io/projected/b2ec378f-6a5f-48cd-ad14-1445d110a829-kube-api-access-w67sr\") pod \"console-operator-58897d9998-pgc8x\" (UID: \"b2ec378f-6a5f-48cd-ad14-1445d110a829\") " pod="openshift-console-operator/console-operator-58897d9998-pgc8x" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527760 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/18513cd3-96dd-41ef-94ba-9f4a629015fb-node-bootstrap-token\") pod \"machine-config-server-x466w\" (UID: \"18513cd3-96dd-41ef-94ba-9f4a629015fb\") " pod="openshift-machine-config-operator/machine-config-server-x466w" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527794 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2ec378f-6a5f-48cd-ad14-1445d110a829-serving-cert\") pod \"console-operator-58897d9998-pgc8x\" (UID: \"b2ec378f-6a5f-48cd-ad14-1445d110a829\") " pod="openshift-console-operator/console-operator-58897d9998-pgc8x" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527815 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13d9a24a-129b-40b0-9fa1-1bd7d595f109-config-volume\") pod \"dns-default-w595z\" (UID: \"13d9a24a-129b-40b0-9fa1-1bd7d595f109\") " pod="openshift-dns/dns-default-w595z" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527863 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4fwn\" (UniqueName: \"kubernetes.io/projected/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-kube-api-access-s4fwn\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527914 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69234b24-fdb7-40bf-828b-104d8d43891c-serving-cert\") pod \"authentication-operator-69f744f599-7vn7g\" (UID: \"69234b24-fdb7-40bf-828b-104d8d43891c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527959 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.527984 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528027 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a39a0363-2d49-4e67-ac77-a381064d06a0-trusted-ca\") pod \"ingress-operator-5b745b69d9-zm5fc\" (UID: \"a39a0363-2d49-4e67-ac77-a381064d06a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528045 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54a2312d-9a82-4642-aee1-83e701a54908-config\") pod \"machine-approver-56656f9798-7p9vc\" (UID: \"54a2312d-9a82-4642-aee1-83e701a54908\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528082 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13d9a24a-129b-40b0-9fa1-1bd7d595f109-metrics-tls\") pod \"dns-default-w595z\" (UID: \"13d9a24a-129b-40b0-9fa1-1bd7d595f109\") " pod="openshift-dns/dns-default-w595z" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528144 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-trusted-ca-bundle\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528185 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f07cb15-0d7a-4412-a5e4-5489e61bb8ef-config\") pod \"service-ca-operator-777779d784-z4wp4\" (UID: \"1f07cb15-0d7a-4412-a5e4-5489e61bb8ef\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528205 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqs4z\" (UniqueName: \"kubernetes.io/projected/18736749-34d6-4ce5-a0ff-e8af0ca22cdc-kube-api-access-jqs4z\") pod \"cluster-samples-operator-665b6dd947-4b88z\" (UID: \"18736749-34d6-4ce5-a0ff-e8af0ca22cdc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4b88z" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528221 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87c9n\" (UniqueName: \"kubernetes.io/projected/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-kube-api-access-87c9n\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528257 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cx96g\" (UID: \"65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528277 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8-stats-auth\") pod \"router-default-5444994796-7fxgg\" (UID: \"ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8\") " pod="openshift-ingress/router-default-5444994796-7fxgg" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528294 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c0bace3-bc39-44fa-9f39-d8f18c07675d-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-d4wtw\" (UID: \"1c0bace3-bc39-44fa-9f39-d8f18c07675d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528310 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnb4f\" (UniqueName: \"kubernetes.io/projected/50008856-a0a3-4ec3-a48f-5f90891d777e-kube-api-access-rnb4f\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528346 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cf64e3f-992b-4f75-9762-839e4a23633a-config\") pod \"kube-controller-manager-operator-78b949d7b-6nx7f\" (UID: \"4cf64e3f-992b-4f75-9762-839e4a23633a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6nx7f" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528360 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5jrp\" (UniqueName: \"kubernetes.io/projected/998d29bf-29f2-40ff-abd0-6730667f11f6-kube-api-access-j5jrp\") pod \"machine-config-controller-84d6567774-tpxbx\" (UID: \"998d29bf-29f2-40ff-abd0-6730667f11f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528375 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f07cb15-0d7a-4412-a5e4-5489e61bb8ef-serving-cert\") pod \"service-ca-operator-777779d784-z4wp4\" (UID: \"1f07cb15-0d7a-4412-a5e4-5489e61bb8ef\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4" Mar 17 
00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528409 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50008856-a0a3-4ec3-a48f-5f90891d777e-audit-dir\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528447 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8-service-ca-bundle\") pod \"router-default-5444994796-7fxgg\" (UID: \"ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8\") " pod="openshift-ingress/router-default-5444994796-7fxgg" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528468 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69234b24-fdb7-40bf-828b-104d8d43891c-service-ca-bundle\") pod \"authentication-operator-69f744f599-7vn7g\" (UID: \"69234b24-fdb7-40bf-828b-104d8d43891c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528486 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6173ecdb-39f4-4772-9dbd-5fa6e2908971-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7v8ts\" (UID: \"6173ecdb-39f4-4772-9dbd-5fa6e2908971\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7v8ts" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528504 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/689a5612-54fa-44b9-8b91-1b03d916a159-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvsbw\" (UID: \"689a5612-54fa-44b9-8b91-1b03d916a159\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvsbw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528519 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/998d29bf-29f2-40ff-abd0-6730667f11f6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tpxbx\" (UID: \"998d29bf-29f2-40ff-abd0-6730667f11f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528538 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z54f5\" (UniqueName: \"kubernetes.io/projected/13d9a24a-129b-40b0-9fa1-1bd7d595f109-kube-api-access-z54f5\") pod \"dns-default-w595z\" (UID: \"13d9a24a-129b-40b0-9fa1-1bd7d595f109\") " pod="openshift-dns/dns-default-w595z" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528556 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/183616f4-5ea5-4c31-a465-edb5b837ca8f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-574nd\" (UID: \"183616f4-5ea5-4c31-a465-edb5b837ca8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528571 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cf64e3f-992b-4f75-9762-839e4a23633a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6nx7f\" (UID: \"4cf64e3f-992b-4f75-9762-839e4a23633a\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6nx7f" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528587 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8f55\" (UniqueName: \"kubernetes.io/projected/4474997a-d442-4f8a-b50e-c6ecab9393f5-kube-api-access-f8f55\") pod \"olm-operator-6b444d44fb-jcw6v\" (UID: \"4474997a-d442-4f8a-b50e-c6ecab9393f5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528607 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8mvb\" (UniqueName: \"kubernetes.io/projected/69234b24-fdb7-40bf-828b-104d8d43891c-kube-api-access-g8mvb\") pod \"authentication-operator-69f744f599-7vn7g\" (UID: \"69234b24-fdb7-40bf-828b-104d8d43891c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528625 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0771adb9-eee6-469f-b734-7d28e8a50d41-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-flgvl\" (UID: \"0771adb9-eee6-469f-b734-7d28e8a50d41\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-flgvl" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528663 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-oauth-serving-cert\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528692 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8-metrics-certs\") pod \"router-default-5444994796-7fxgg\" (UID: \"ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8\") " pod="openshift-ingress/router-default-5444994796-7fxgg" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528728 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-serving-cert\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528764 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2ec378f-6a5f-48cd-ad14-1445d110a829-trusted-ca\") pod \"console-operator-58897d9998-pgc8x\" (UID: \"b2ec378f-6a5f-48cd-ad14-1445d110a829\") " pod="openshift-console-operator/console-operator-58897d9998-pgc8x" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528787 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528819 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6173ecdb-39f4-4772-9dbd-5fa6e2908971-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7v8ts\" (UID: \"6173ecdb-39f4-4772-9dbd-5fa6e2908971\") 
" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7v8ts" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528904 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmwtq\" (UniqueName: \"kubernetes.io/projected/1c0bace3-bc39-44fa-9f39-d8f18c07675d-kube-api-access-pmwtq\") pod \"openshift-config-operator-7777fb866f-d4wtw\" (UID: \"1c0bace3-bc39-44fa-9f39-d8f18c07675d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528961 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18736749-34d6-4ce5-a0ff-e8af0ca22cdc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4b88z\" (UID: \"18736749-34d6-4ce5-a0ff-e8af0ca22cdc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4b88z" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.528993 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-w5wk8\" (UID: \"bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5wk8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529019 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 
00:24:45.529119 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmdmd\" (UniqueName: \"kubernetes.io/projected/ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8-kube-api-access-gmdmd\") pod \"router-default-5444994796-7fxgg\" (UID: \"ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8\") " pod="openshift-ingress/router-default-5444994796-7fxgg" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529139 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-config\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529169 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/998d29bf-29f2-40ff-abd0-6730667f11f6-proxy-tls\") pod \"machine-config-controller-84d6567774-tpxbx\" (UID: \"998d29bf-29f2-40ff-abd0-6730667f11f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529211 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/183616f4-5ea5-4c31-a465-edb5b837ca8f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-574nd\" (UID: \"183616f4-5ea5-4c31-a465-edb5b837ca8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529278 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/659184e0-0620-4930-8faa-22e586cf403a-metrics-tls\") pod \"dns-operator-744455d44c-p4wgk\" (UID: 
\"659184e0-0620-4930-8faa-22e586cf403a\") " pod="openshift-dns-operator/dns-operator-744455d44c-p4wgk" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529354 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-etcd-ca\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529415 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-etcd-service-ca\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529470 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh76n\" (UniqueName: \"kubernetes.io/projected/08e826b2-3275-49e6-b833-5494037aac5b-kube-api-access-fh76n\") pod \"control-plane-machine-set-operator-78cbb6b69f-dl89g\" (UID: \"08e826b2-3275-49e6-b833-5494037aac5b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dl89g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529503 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-oauth-config\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529554 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-service-ca\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529586 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-config\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529620 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69234b24-fdb7-40bf-828b-104d8d43891c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7vn7g\" (UID: \"69234b24-fdb7-40bf-828b-104d8d43891c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529656 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h684j\" (UniqueName: \"kubernetes.io/projected/183616f4-5ea5-4c31-a465-edb5b837ca8f-kube-api-access-h684j\") pod \"cni-sysctl-allowlist-ds-574nd\" (UID: \"183616f4-5ea5-4c31-a465-edb5b837ca8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529679 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4cf64e3f-992b-4f75-9762-839e4a23633a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6nx7f\" (UID: \"4cf64e3f-992b-4f75-9762-839e4a23633a\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6nx7f" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529704 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529749 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf7jt\" (UniqueName: \"kubernetes.io/projected/a39a0363-2d49-4e67-ac77-a381064d06a0-kube-api-access-pf7jt\") pod \"ingress-operator-5b745b69d9-zm5fc\" (UID: \"a39a0363-2d49-4e67-ac77-a381064d06a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529777 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4474997a-d442-4f8a-b50e-c6ecab9393f5-srv-cert\") pod \"olm-operator-6b444d44fb-jcw6v\" (UID: \"4474997a-d442-4f8a-b50e-c6ecab9393f5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529801 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/689a5612-54fa-44b9-8b91-1b03d916a159-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvsbw\" (UID: \"689a5612-54fa-44b9-8b91-1b03d916a159\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvsbw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529829 
4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-w5wk8\" (UID: \"bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5wk8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529849 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.529869 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4474997a-d442-4f8a-b50e-c6ecab9393f5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jcw6v\" (UID: \"4474997a-d442-4f8a-b50e-c6ecab9393f5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.535885 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.555171 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.575817 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.595243 4755 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.615914 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.630684 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/689a5612-54fa-44b9-8b91-1b03d916a159-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvsbw\" (UID: \"689a5612-54fa-44b9-8b91-1b03d916a159\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvsbw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.630731 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-w5wk8\" (UID: \"bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5wk8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.630756 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.630779 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4474997a-d442-4f8a-b50e-c6ecab9393f5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jcw6v\" (UID: \"4474997a-d442-4f8a-b50e-c6ecab9393f5\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.630808 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/08e826b2-3275-49e6-b833-5494037aac5b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dl89g\" (UID: \"08e826b2-3275-49e6-b833-5494037aac5b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dl89g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.630833 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.630853 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqndp\" (UniqueName: \"kubernetes.io/projected/64e28e17-1dd0-401e-9b26-7eefc1c54f5f-kube-api-access-nqndp\") pod \"downloads-7954f5f757-8gjxd\" (UID: \"64e28e17-1dd0-401e-9b26-7eefc1c54f5f\") " pod="openshift-console/downloads-7954f5f757-8gjxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.630873 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnjcd\" (UniqueName: \"kubernetes.io/projected/18513cd3-96dd-41ef-94ba-9f4a629015fb-kube-api-access-nnjcd\") pod \"machine-config-server-x466w\" (UID: \"18513cd3-96dd-41ef-94ba-9f4a629015fb\") " pod="openshift-machine-config-operator/machine-config-server-x466w" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.630894 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d755c0ff-4b40-451e-9fb4-4807c6297aaa-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qd8g6\" (UID: \"d755c0ff-4b40-451e-9fb4-4807c6297aaa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.630914 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d755c0ff-4b40-451e-9fb4-4807c6297aaa-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qd8g6\" (UID: \"d755c0ff-4b40-451e-9fb4-4807c6297aaa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.630937 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdxk4\" (UniqueName: \"kubernetes.io/projected/886294ef-7db0-41dd-854e-76ef9db63e9f-kube-api-access-qdxk4\") pod \"packageserver-d55dfcdfc-qghn4\" (UID: \"886294ef-7db0-41dd-854e-76ef9db63e9f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.630966 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j594d\" (UniqueName: \"kubernetes.io/projected/d755c0ff-4b40-451e-9fb4-4807c6297aaa-kube-api-access-j594d\") pod \"cluster-image-registry-operator-dc59b4c8b-qd8g6\" (UID: \"d755c0ff-4b40-451e-9fb4-4807c6297aaa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.630989 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1c0bace3-bc39-44fa-9f39-d8f18c07675d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d4wtw\" (UID: 
\"1c0bace3-bc39-44fa-9f39-d8f18c07675d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631012 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631033 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2ec378f-6a5f-48cd-ad14-1445d110a829-config\") pod \"console-operator-58897d9998-pgc8x\" (UID: \"b2ec378f-6a5f-48cd-ad14-1445d110a829\") " pod="openshift-console-operator/console-operator-58897d9998-pgc8x" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631054 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a39a0363-2d49-4e67-ac77-a381064d06a0-metrics-tls\") pod \"ingress-operator-5b745b69d9-zm5fc\" (UID: \"a39a0363-2d49-4e67-ac77-a381064d06a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631074 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/18513cd3-96dd-41ef-94ba-9f4a629015fb-certs\") pod \"machine-config-server-x466w\" (UID: \"18513cd3-96dd-41ef-94ba-9f4a629015fb\") " pod="openshift-machine-config-operator/machine-config-server-x466w" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631099 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpxgq\" (UniqueName: 
\"kubernetes.io/projected/54a2312d-9a82-4642-aee1-83e701a54908-kube-api-access-dpxgq\") pod \"machine-approver-56656f9798-7p9vc\" (UID: \"54a2312d-9a82-4642-aee1-83e701a54908\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631122 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69234b24-fdb7-40bf-828b-104d8d43891c-config\") pod \"authentication-operator-69f744f599-7vn7g\" (UID: \"69234b24-fdb7-40bf-828b-104d8d43891c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631155 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44c47b93-07a7-414f-9548-ea1dd505c452-cert\") pod \"ingress-canary-g4dtp\" (UID: \"44c47b93-07a7-414f-9548-ea1dd505c452\") " pod="openshift-ingress-canary/ingress-canary-g4dtp" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631176 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631200 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-audit-policies\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631274 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6173ecdb-39f4-4772-9dbd-5fa6e2908971-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7v8ts\" (UID: \"6173ecdb-39f4-4772-9dbd-5fa6e2908971\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7v8ts" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631298 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m64nd\" (UniqueName: \"kubernetes.io/projected/65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8-kube-api-access-m64nd\") pod \"kube-storage-version-migrator-operator-b67b599dd-cx96g\" (UID: \"65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631326 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed-config\") pod \"kube-apiserver-operator-766d6c64bb-w5wk8\" (UID: \"bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5wk8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631345 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d755c0ff-4b40-451e-9fb4-4807c6297aaa-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qd8g6\" (UID: \"d755c0ff-4b40-451e-9fb4-4807c6297aaa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631367 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jcp6\" (UniqueName: \"kubernetes.io/projected/0771adb9-eee6-469f-b734-7d28e8a50d41-kube-api-access-4jcp6\") pod 
\"multus-admission-controller-857f4d67dd-flgvl\" (UID: \"0771adb9-eee6-469f-b734-7d28e8a50d41\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-flgvl" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631388 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8-default-certificate\") pod \"router-default-5444994796-7fxgg\" (UID: \"ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8\") " pod="openshift-ingress/router-default-5444994796-7fxgg" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631408 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/54a2312d-9a82-4642-aee1-83e701a54908-machine-approver-tls\") pod \"machine-approver-56656f9798-7p9vc\" (UID: \"54a2312d-9a82-4642-aee1-83e701a54908\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631430 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zrx8\" (UniqueName: \"kubernetes.io/projected/1f07cb15-0d7a-4412-a5e4-5489e61bb8ef-kube-api-access-2zrx8\") pod \"service-ca-operator-777779d784-z4wp4\" (UID: \"1f07cb15-0d7a-4412-a5e4-5489e61bb8ef\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631473 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2bc15dc4-de67-4b17-8f95-9d8772166b35-images\") pod \"machine-config-operator-74547568cd-vt4qh\" (UID: \"2bc15dc4-de67-4b17-8f95-9d8772166b35\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631636 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1c0bace3-bc39-44fa-9f39-d8f18c07675d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d4wtw\" (UID: \"1c0bace3-bc39-44fa-9f39-d8f18c07675d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.631499 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2bc15dc4-de67-4b17-8f95-9d8772166b35-proxy-tls\") pod \"machine-config-operator-74547568cd-vt4qh\" (UID: \"2bc15dc4-de67-4b17-8f95-9d8772166b35\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.632241 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2ec378f-6a5f-48cd-ad14-1445d110a829-config\") pod \"console-operator-58897d9998-pgc8x\" (UID: \"b2ec378f-6a5f-48cd-ad14-1445d110a829\") " pod="openshift-console-operator/console-operator-58897d9998-pgc8x" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.632316 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.632431 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-audit-policies\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.632688 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed-config\") pod \"kube-apiserver-operator-766d6c64bb-w5wk8\" (UID: \"bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5wk8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.632723 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-serving-cert\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633010 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69234b24-fdb7-40bf-828b-104d8d43891c-config\") pod \"authentication-operator-69f744f599-7vn7g\" (UID: \"69234b24-fdb7-40bf-828b-104d8d43891c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633233 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/54a2312d-9a82-4642-aee1-83e701a54908-auth-proxy-config\") pod \"machine-approver-56656f9798-7p9vc\" (UID: \"54a2312d-9a82-4642-aee1-83e701a54908\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633274 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633298 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/92c4ceac-e07e-407f-a2d7-5202cc06c29d-serviceca\") pod \"image-pruner-29561760-zz9vv\" (UID: \"92c4ceac-e07e-407f-a2d7-5202cc06c29d\") " pod="openshift-image-registry/image-pruner-29561760-zz9vv" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633319 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-config-volume\") pod \"collect-profiles-29561775-fpcjj\" (UID: \"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633342 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wfmk\" (UniqueName: \"kubernetes.io/projected/f0a4ebae-87f9-4329-bcd7-62cc2e3898fd-kube-api-access-6wfmk\") pod \"service-ca-9c57cc56f-gc6g2\" (UID: \"f0a4ebae-87f9-4329-bcd7-62cc2e3898fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-gc6g2" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633403 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls958\" (UniqueName: \"kubernetes.io/projected/689a5612-54fa-44b9-8b91-1b03d916a159-kube-api-access-ls958\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvsbw\" (UID: \"689a5612-54fa-44b9-8b91-1b03d916a159\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvsbw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633428 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cx96g\" (UID: \"65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633469 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvlsg\" (UniqueName: \"kubernetes.io/projected/d46784e1-420c-4d3b-aca7-65271a898c44-kube-api-access-jvlsg\") pod \"auto-csr-approver-29561784-zznp2\" (UID: \"d46784e1-420c-4d3b-aca7-65271a898c44\") " pod="openshift-infra/auto-csr-approver-29561784-zznp2" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633491 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/886294ef-7db0-41dd-854e-76ef9db63e9f-tmpfs\") pod \"packageserver-d55dfcdfc-qghn4\" (UID: \"886294ef-7db0-41dd-854e-76ef9db63e9f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633514 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633536 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx7c4\" (UniqueName: \"kubernetes.io/projected/d48d9cab-c4a2-4727-ad9e-61ae13fe0dff-kube-api-access-xx7c4\") pod \"package-server-manager-789f6589d5-m6qr7\" (UID: \"d48d9cab-c4a2-4727-ad9e-61ae13fe0dff\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633542 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/54a2312d-9a82-4642-aee1-83e701a54908-auth-proxy-config\") pod \"machine-approver-56656f9798-7p9vc\" (UID: \"54a2312d-9a82-4642-aee1-83e701a54908\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633561 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6bb8\" (UniqueName: \"kubernetes.io/projected/659184e0-0620-4930-8faa-22e586cf403a-kube-api-access-c6bb8\") pod \"dns-operator-744455d44c-p4wgk\" (UID: \"659184e0-0620-4930-8faa-22e586cf403a\") " pod="openshift-dns-operator/dns-operator-744455d44c-p4wgk" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633586 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9jlr\" (UniqueName: \"kubernetes.io/projected/44c47b93-07a7-414f-9548-ea1dd505c452-kube-api-access-c9jlr\") pod \"ingress-canary-g4dtp\" (UID: \"44c47b93-07a7-414f-9548-ea1dd505c452\") " pod="openshift-ingress-canary/ingress-canary-g4dtp" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633635 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a39a0363-2d49-4e67-ac77-a381064d06a0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-zm5fc\" (UID: \"a39a0363-2d49-4e67-ac77-a381064d06a0\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633664 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/183616f4-5ea5-4c31-a465-edb5b837ca8f-ready\") pod \"cni-sysctl-allowlist-ds-574nd\" (UID: \"183616f4-5ea5-4c31-a465-edb5b837ca8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633690 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-etcd-client\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633716 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w67sr\" (UniqueName: \"kubernetes.io/projected/b2ec378f-6a5f-48cd-ad14-1445d110a829-kube-api-access-w67sr\") pod \"console-operator-58897d9998-pgc8x\" (UID: \"b2ec378f-6a5f-48cd-ad14-1445d110a829\") " pod="openshift-console-operator/console-operator-58897d9998-pgc8x" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633740 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/18513cd3-96dd-41ef-94ba-9f4a629015fb-node-bootstrap-token\") pod \"machine-config-server-x466w\" (UID: \"18513cd3-96dd-41ef-94ba-9f4a629015fb\") " pod="openshift-machine-config-operator/machine-config-server-x466w" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633768 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7d67a491-1c7f-4898-bc78-a2a7d75278dc-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-tddtz\" (UID: \"7d67a491-1c7f-4898-bc78-a2a7d75278dc\") " pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633812 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2ec378f-6a5f-48cd-ad14-1445d110a829-serving-cert\") pod \"console-operator-58897d9998-pgc8x\" (UID: \"b2ec378f-6a5f-48cd-ad14-1445d110a829\") " pod="openshift-console-operator/console-operator-58897d9998-pgc8x" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633835 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13d9a24a-129b-40b0-9fa1-1bd7d595f109-config-volume\") pod \"dns-default-w595z\" (UID: \"13d9a24a-129b-40b0-9fa1-1bd7d595f109\") " pod="openshift-dns/dns-default-w595z" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633862 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4fwn\" (UniqueName: \"kubernetes.io/projected/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-kube-api-access-s4fwn\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633883 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69234b24-fdb7-40bf-828b-104d8d43891c-serving-cert\") pod \"authentication-operator-69f744f599-7vn7g\" (UID: \"69234b24-fdb7-40bf-828b-104d8d43891c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633912 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633936 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633962 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a39a0363-2d49-4e67-ac77-a381064d06a0-trusted-ca\") pod \"ingress-operator-5b745b69d9-zm5fc\" (UID: \"a39a0363-2d49-4e67-ac77-a381064d06a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.633987 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54a2312d-9a82-4642-aee1-83e701a54908-config\") pod \"machine-approver-56656f9798-7p9vc\" (UID: \"54a2312d-9a82-4642-aee1-83e701a54908\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634013 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13d9a24a-129b-40b0-9fa1-1bd7d595f109-metrics-tls\") pod \"dns-default-w595z\" (UID: \"13d9a24a-129b-40b0-9fa1-1bd7d595f109\") " pod="openshift-dns/dns-default-w595z" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634035 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-trusted-ca-bundle\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634059 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f07cb15-0d7a-4412-a5e4-5489e61bb8ef-config\") pod \"service-ca-operator-777779d784-z4wp4\" (UID: \"1f07cb15-0d7a-4412-a5e4-5489e61bb8ef\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634096 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqs4z\" (UniqueName: \"kubernetes.io/projected/18736749-34d6-4ce5-a0ff-e8af0ca22cdc-kube-api-access-jqs4z\") pod \"cluster-samples-operator-665b6dd947-4b88z\" (UID: \"18736749-34d6-4ce5-a0ff-e8af0ca22cdc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4b88z" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634124 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87c9n\" (UniqueName: \"kubernetes.io/projected/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-kube-api-access-87c9n\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634150 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cx96g\" (UID: \"65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634172 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8-stats-auth\") pod \"router-default-5444994796-7fxgg\" (UID: \"ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8\") " pod="openshift-ingress/router-default-5444994796-7fxgg" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634196 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c0bace3-bc39-44fa-9f39-d8f18c07675d-serving-cert\") pod \"openshift-config-operator-7777fb866f-d4wtw\" (UID: \"1c0bace3-bc39-44fa-9f39-d8f18c07675d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634223 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnv7q\" (UniqueName: \"kubernetes.io/projected/1347078a-8c1c-4e33-aee1-8aaad00829f2-kube-api-access-qnv7q\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634257 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnb4f\" (UniqueName: \"kubernetes.io/projected/50008856-a0a3-4ec3-a48f-5f90891d777e-kube-api-access-rnb4f\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634282 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4cf64e3f-992b-4f75-9762-839e4a23633a-config\") pod \"kube-controller-manager-operator-78b949d7b-6nx7f\" (UID: \"4cf64e3f-992b-4f75-9762-839e4a23633a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6nx7f" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634307 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f07cb15-0d7a-4412-a5e4-5489e61bb8ef-serving-cert\") pod \"service-ca-operator-777779d784-z4wp4\" (UID: \"1f07cb15-0d7a-4412-a5e4-5489e61bb8ef\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634334 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d48d9cab-c4a2-4727-ad9e-61ae13fe0dff-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-m6qr7\" (UID: \"d48d9cab-c4a2-4727-ad9e-61ae13fe0dff\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634356 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1347078a-8c1c-4e33-aee1-8aaad00829f2-plugins-dir\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634376 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-secret-volume\") pod \"collect-profiles-29561775-fpcjj\" (UID: \"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634402 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5jrp\" (UniqueName: \"kubernetes.io/projected/998d29bf-29f2-40ff-abd0-6730667f11f6-kube-api-access-j5jrp\") pod \"machine-config-controller-84d6567774-tpxbx\" (UID: \"998d29bf-29f2-40ff-abd0-6730667f11f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634425 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50008856-a0a3-4ec3-a48f-5f90891d777e-audit-dir\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634479 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8-service-ca-bundle\") pod \"router-default-5444994796-7fxgg\" (UID: \"ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8\") " pod="openshift-ingress/router-default-5444994796-7fxgg" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634482 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634507 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8w7m\" (UniqueName: 
\"kubernetes.io/projected/92c4ceac-e07e-407f-a2d7-5202cc06c29d-kube-api-access-v8w7m\") pod \"image-pruner-29561760-zz9vv\" (UID: \"92c4ceac-e07e-407f-a2d7-5202cc06c29d\") " pod="openshift-image-registry/image-pruner-29561760-zz9vv" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634537 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69234b24-fdb7-40bf-828b-104d8d43891c-service-ca-bundle\") pod \"authentication-operator-69f744f599-7vn7g\" (UID: \"69234b24-fdb7-40bf-828b-104d8d43891c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634560 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6173ecdb-39f4-4772-9dbd-5fa6e2908971-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7v8ts\" (UID: \"6173ecdb-39f4-4772-9dbd-5fa6e2908971\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7v8ts" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634583 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2bc15dc4-de67-4b17-8f95-9d8772166b35-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vt4qh\" (UID: \"2bc15dc4-de67-4b17-8f95-9d8772166b35\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634606 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/689a5612-54fa-44b9-8b91-1b03d916a159-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvsbw\" (UID: \"689a5612-54fa-44b9-8b91-1b03d916a159\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvsbw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634628 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z54f5\" (UniqueName: \"kubernetes.io/projected/13d9a24a-129b-40b0-9fa1-1bd7d595f109-kube-api-access-z54f5\") pod \"dns-default-w595z\" (UID: \"13d9a24a-129b-40b0-9fa1-1bd7d595f109\") " pod="openshift-dns/dns-default-w595z" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634656 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/998d29bf-29f2-40ff-abd0-6730667f11f6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tpxbx\" (UID: \"998d29bf-29f2-40ff-abd0-6730667f11f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634680 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/183616f4-5ea5-4c31-a465-edb5b837ca8f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-574nd\" (UID: \"183616f4-5ea5-4c31-a465-edb5b837ca8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634701 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cf64e3f-992b-4f75-9762-839e4a23633a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6nx7f\" (UID: \"4cf64e3f-992b-4f75-9762-839e4a23633a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6nx7f" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634724 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8f55\" (UniqueName: 
\"kubernetes.io/projected/4474997a-d442-4f8a-b50e-c6ecab9393f5-kube-api-access-f8f55\") pod \"olm-operator-6b444d44fb-jcw6v\" (UID: \"4474997a-d442-4f8a-b50e-c6ecab9393f5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634748 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8mvb\" (UniqueName: \"kubernetes.io/projected/69234b24-fdb7-40bf-828b-104d8d43891c-kube-api-access-g8mvb\") pod \"authentication-operator-69f744f599-7vn7g\" (UID: \"69234b24-fdb7-40bf-828b-104d8d43891c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634773 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0771adb9-eee6-469f-b734-7d28e8a50d41-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-flgvl\" (UID: \"0771adb9-eee6-469f-b734-7d28e8a50d41\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-flgvl" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634798 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56j24\" (UniqueName: \"kubernetes.io/projected/64301027-d188-47a4-a4a3-72b00d2d100a-kube-api-access-56j24\") pod \"migrator-59844c95c7-ktkt5\" (UID: \"64301027-d188-47a4-a4a3-72b00d2d100a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ktkt5" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634831 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-oauth-serving-cert\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc 
kubenswrapper[4755]: I0317 00:24:45.634853 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1347078a-8c1c-4e33-aee1-8aaad00829f2-csi-data-dir\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634857 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634875 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d67a491-1c7f-4898-bc78-a2a7d75278dc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tddtz\" (UID: \"7d67a491-1c7f-4898-bc78-a2a7d75278dc\") " pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634905 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8-metrics-certs\") pod \"router-default-5444994796-7fxgg\" (UID: \"ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8\") " pod="openshift-ingress/router-default-5444994796-7fxgg" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634931 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwflc\" (UniqueName: \"kubernetes.io/projected/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-kube-api-access-qwflc\") pod \"collect-profiles-29561775-fpcjj\" (UID: 
\"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634957 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qm8w\" (UniqueName: \"kubernetes.io/projected/2bc15dc4-de67-4b17-8f95-9d8772166b35-kube-api-access-4qm8w\") pod \"machine-config-operator-74547568cd-vt4qh\" (UID: \"2bc15dc4-de67-4b17-8f95-9d8772166b35\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634987 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-serving-cert\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635010 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2ec378f-6a5f-48cd-ad14-1445d110a829-trusted-ca\") pod \"console-operator-58897d9998-pgc8x\" (UID: \"b2ec378f-6a5f-48cd-ad14-1445d110a829\") " pod="openshift-console-operator/console-operator-58897d9998-pgc8x" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635034 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635061 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/6173ecdb-39f4-4772-9dbd-5fa6e2908971-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7v8ts\" (UID: \"6173ecdb-39f4-4772-9dbd-5fa6e2908971\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7v8ts" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635095 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmwtq\" (UniqueName: \"kubernetes.io/projected/1c0bace3-bc39-44fa-9f39-d8f18c07675d-kube-api-access-pmwtq\") pod \"openshift-config-operator-7777fb866f-d4wtw\" (UID: \"1c0bace3-bc39-44fa-9f39-d8f18c07675d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635114 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50008856-a0a3-4ec3-a48f-5f90891d777e-audit-dir\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635127 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1347078a-8c1c-4e33-aee1-8aaad00829f2-socket-dir\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635154 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhhdw\" (UniqueName: \"kubernetes.io/projected/7d67a491-1c7f-4898-bc78-a2a7d75278dc-kube-api-access-nhhdw\") pod \"marketplace-operator-79b997595-tddtz\" (UID: \"7d67a491-1c7f-4898-bc78-a2a7d75278dc\") " pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" Mar 17 00:24:45 
crc kubenswrapper[4755]: I0317 00:24:45.635183 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18736749-34d6-4ce5-a0ff-e8af0ca22cdc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4b88z\" (UID: \"18736749-34d6-4ce5-a0ff-e8af0ca22cdc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4b88z" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635217 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635227 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1347078a-8c1c-4e33-aee1-8aaad00829f2-registration-dir\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635300 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-w5wk8\" (UID: \"bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5wk8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635342 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635373 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmdmd\" (UniqueName: \"kubernetes.io/projected/ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8-kube-api-access-gmdmd\") pod \"router-default-5444994796-7fxgg\" (UID: \"ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8\") " pod="openshift-ingress/router-default-5444994796-7fxgg" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635399 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f0a4ebae-87f9-4329-bcd7-62cc2e3898fd-signing-cabundle\") pod \"service-ca-9c57cc56f-gc6g2\" (UID: \"f0a4ebae-87f9-4329-bcd7-62cc2e3898fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-gc6g2" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635426 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-config\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635472 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/998d29bf-29f2-40ff-abd0-6730667f11f6-proxy-tls\") pod \"machine-config-controller-84d6567774-tpxbx\" (UID: \"998d29bf-29f2-40ff-abd0-6730667f11f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635498 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/183616f4-5ea5-4c31-a465-edb5b837ca8f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-574nd\" (UID: \"183616f4-5ea5-4c31-a465-edb5b837ca8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635521 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f0a4ebae-87f9-4329-bcd7-62cc2e3898fd-signing-key\") pod \"service-ca-9c57cc56f-gc6g2\" (UID: \"f0a4ebae-87f9-4329-bcd7-62cc2e3898fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-gc6g2" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635549 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/659184e0-0620-4930-8faa-22e586cf403a-metrics-tls\") pod \"dns-operator-744455d44c-p4wgk\" (UID: \"659184e0-0620-4930-8faa-22e586cf403a\") " pod="openshift-dns-operator/dns-operator-744455d44c-p4wgk" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635570 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/886294ef-7db0-41dd-854e-76ef9db63e9f-webhook-cert\") pod \"packageserver-d55dfcdfc-qghn4\" (UID: \"886294ef-7db0-41dd-854e-76ef9db63e9f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635593 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-etcd-service-ca\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc kubenswrapper[4755]: 
I0317 00:24:45.635616 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh76n\" (UniqueName: \"kubernetes.io/projected/08e826b2-3275-49e6-b833-5494037aac5b-kube-api-access-fh76n\") pod \"control-plane-machine-set-operator-78cbb6b69f-dl89g\" (UID: \"08e826b2-3275-49e6-b833-5494037aac5b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dl89g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635638 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/886294ef-7db0-41dd-854e-76ef9db63e9f-apiservice-cert\") pod \"packageserver-d55dfcdfc-qghn4\" (UID: \"886294ef-7db0-41dd-854e-76ef9db63e9f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635647 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69234b24-fdb7-40bf-828b-104d8d43891c-service-ca-bundle\") pod \"authentication-operator-69f744f599-7vn7g\" (UID: \"69234b24-fdb7-40bf-828b-104d8d43891c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635662 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-etcd-ca\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635712 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-oauth-config\") pod \"console-f9d7485db-2fcmx\" (UID: 
\"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635738 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-service-ca\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635763 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-config\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635785 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69234b24-fdb7-40bf-828b-104d8d43891c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7vn7g\" (UID: \"69234b24-fdb7-40bf-828b-104d8d43891c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635779 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/54a2312d-9a82-4642-aee1-83e701a54908-machine-approver-tls\") pod \"machine-approver-56656f9798-7p9vc\" (UID: \"54a2312d-9a82-4642-aee1-83e701a54908\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635814 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h684j\" (UniqueName: \"kubernetes.io/projected/183616f4-5ea5-4c31-a465-edb5b837ca8f-kube-api-access-h684j\") 
pod \"cni-sysctl-allowlist-ds-574nd\" (UID: \"183616f4-5ea5-4c31-a465-edb5b837ca8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635839 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4cf64e3f-992b-4f75-9762-839e4a23633a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6nx7f\" (UID: \"4cf64e3f-992b-4f75-9762-839e4a23633a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6nx7f" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635863 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635897 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf7jt\" (UniqueName: \"kubernetes.io/projected/a39a0363-2d49-4e67-ac77-a381064d06a0-kube-api-access-pf7jt\") pod \"ingress-operator-5b745b69d9-zm5fc\" (UID: \"a39a0363-2d49-4e67-ac77-a381064d06a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635920 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4474997a-d442-4f8a-b50e-c6ecab9393f5-srv-cert\") pod \"olm-operator-6b444d44fb-jcw6v\" (UID: \"4474997a-d442-4f8a-b50e-c6ecab9393f5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.635955 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1347078a-8c1c-4e33-aee1-8aaad00829f2-mountpoint-dir\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.636148 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-oauth-serving-cert\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.636308 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.636573 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.636797 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/689a5612-54fa-44b9-8b91-1b03d916a159-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvsbw\" (UID: \"689a5612-54fa-44b9-8b91-1b03d916a159\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvsbw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.637266 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.637398 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2ec378f-6a5f-48cd-ad14-1445d110a829-trusted-ca\") pod \"console-operator-58897d9998-pgc8x\" (UID: \"b2ec378f-6a5f-48cd-ad14-1445d110a829\") " pod="openshift-console-operator/console-operator-58897d9998-pgc8x" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.637715 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/998d29bf-29f2-40ff-abd0-6730667f11f6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tpxbx\" (UID: \"998d29bf-29f2-40ff-abd0-6730667f11f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.637978 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-config\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.638098 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/183616f4-5ea5-4c31-a465-edb5b837ca8f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-574nd\" (UID: \"183616f4-5ea5-4c31-a465-edb5b837ca8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.638175 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.638353 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69234b24-fdb7-40bf-828b-104d8d43891c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7vn7g\" (UID: \"69234b24-fdb7-40bf-828b-104d8d43891c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.638626 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69234b24-fdb7-40bf-828b-104d8d43891c-serving-cert\") pod \"authentication-operator-69f744f599-7vn7g\" (UID: \"69234b24-fdb7-40bf-828b-104d8d43891c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.638818 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 
00:24:45.639373 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2ec378f-6a5f-48cd-ad14-1445d110a829-serving-cert\") pod \"console-operator-58897d9998-pgc8x\" (UID: \"b2ec378f-6a5f-48cd-ad14-1445d110a829\") " pod="openshift-console-operator/console-operator-58897d9998-pgc8x" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.639475 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-service-ca\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.634228 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/183616f4-5ea5-4c31-a465-edb5b837ca8f-ready\") pod \"cni-sysctl-allowlist-ds-574nd\" (UID: \"183616f4-5ea5-4c31-a465-edb5b837ca8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.639523 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/689a5612-54fa-44b9-8b91-1b03d916a159-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvsbw\" (UID: \"689a5612-54fa-44b9-8b91-1b03d916a159\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvsbw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.640017 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-trusted-ca-bundle\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.640113 
4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/998d29bf-29f2-40ff-abd0-6730667f11f6-proxy-tls\") pod \"machine-config-controller-84d6567774-tpxbx\" (UID: \"998d29bf-29f2-40ff-abd0-6730667f11f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.641231 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18736749-34d6-4ce5-a0ff-e8af0ca22cdc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4b88z\" (UID: \"18736749-34d6-4ce5-a0ff-e8af0ca22cdc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4b88z" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.641478 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.642422 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-serving-cert\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.642678 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a39a0363-2d49-4e67-ac77-a381064d06a0-metrics-tls\") pod \"ingress-operator-5b745b69d9-zm5fc\" (UID: \"a39a0363-2d49-4e67-ac77-a381064d06a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.642810 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1c0bace3-bc39-44fa-9f39-d8f18c07675d-serving-cert\") pod \"openshift-config-operator-7777fb866f-d4wtw\" (UID: \"1c0bace3-bc39-44fa-9f39-d8f18c07675d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.642906 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-oauth-config\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.643125 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/659184e0-0620-4930-8faa-22e586cf403a-metrics-tls\") pod \"dns-operator-744455d44c-p4wgk\" (UID: \"659184e0-0620-4930-8faa-22e586cf403a\") " pod="openshift-dns-operator/dns-operator-744455d44c-p4wgk" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.643340 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-w5wk8\" (UID: \"bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5wk8" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.645208 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.646002 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.647050 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a39a0363-2d49-4e67-ac77-a381064d06a0-trusted-ca\") pod \"ingress-operator-5b745b69d9-zm5fc\" (UID: \"a39a0363-2d49-4e67-ac77-a381064d06a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.647603 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/08e826b2-3275-49e6-b833-5494037aac5b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dl89g\" (UID: \"08e826b2-3275-49e6-b833-5494037aac5b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dl89g" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.648813 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54a2312d-9a82-4642-aee1-83e701a54908-config\") pod \"machine-approver-56656f9798-7p9vc\" (UID: \"54a2312d-9a82-4642-aee1-83e701a54908\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.657703 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.675290 4755 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.695382 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.699228 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-etcd-service-ca\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.715304 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.735510 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.736796 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2bc15dc4-de67-4b17-8f95-9d8772166b35-images\") pod \"machine-config-operator-74547568cd-vt4qh\" (UID: \"2bc15dc4-de67-4b17-8f95-9d8772166b35\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.736837 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2bc15dc4-de67-4b17-8f95-9d8772166b35-proxy-tls\") pod \"machine-config-operator-74547568cd-vt4qh\" (UID: \"2bc15dc4-de67-4b17-8f95-9d8772166b35\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.736864 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serviceca\" (UniqueName: \"kubernetes.io/configmap/92c4ceac-e07e-407f-a2d7-5202cc06c29d-serviceca\") pod \"image-pruner-29561760-zz9vv\" (UID: \"92c4ceac-e07e-407f-a2d7-5202cc06c29d\") " pod="openshift-image-registry/image-pruner-29561760-zz9vv" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.736894 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-config-volume\") pod \"collect-profiles-29561775-fpcjj\" (UID: \"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.736917 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wfmk\" (UniqueName: \"kubernetes.io/projected/f0a4ebae-87f9-4329-bcd7-62cc2e3898fd-kube-api-access-6wfmk\") pod \"service-ca-9c57cc56f-gc6g2\" (UID: \"f0a4ebae-87f9-4329-bcd7-62cc2e3898fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-gc6g2" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.736954 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/886294ef-7db0-41dd-854e-76ef9db63e9f-tmpfs\") pod \"packageserver-d55dfcdfc-qghn4\" (UID: \"886294ef-7db0-41dd-854e-76ef9db63e9f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.736979 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx7c4\" (UniqueName: \"kubernetes.io/projected/d48d9cab-c4a2-4727-ad9e-61ae13fe0dff-kube-api-access-xx7c4\") pod \"package-server-manager-789f6589d5-m6qr7\" (UID: \"d48d9cab-c4a2-4727-ad9e-61ae13fe0dff\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737087 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7d67a491-1c7f-4898-bc78-a2a7d75278dc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tddtz\" (UID: \"7d67a491-1c7f-4898-bc78-a2a7d75278dc\") " pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737201 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnv7q\" (UniqueName: \"kubernetes.io/projected/1347078a-8c1c-4e33-aee1-8aaad00829f2-kube-api-access-qnv7q\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737265 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d48d9cab-c4a2-4727-ad9e-61ae13fe0dff-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-m6qr7\" (UID: \"d48d9cab-c4a2-4727-ad9e-61ae13fe0dff\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737295 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1347078a-8c1c-4e33-aee1-8aaad00829f2-plugins-dir\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737316 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-secret-volume\") pod \"collect-profiles-29561775-fpcjj\" (UID: \"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737346 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8w7m\" (UniqueName: \"kubernetes.io/projected/92c4ceac-e07e-407f-a2d7-5202cc06c29d-kube-api-access-v8w7m\") pod \"image-pruner-29561760-zz9vv\" (UID: \"92c4ceac-e07e-407f-a2d7-5202cc06c29d\") " pod="openshift-image-registry/image-pruner-29561760-zz9vv" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737376 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2bc15dc4-de67-4b17-8f95-9d8772166b35-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vt4qh\" (UID: \"2bc15dc4-de67-4b17-8f95-9d8772166b35\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737374 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/886294ef-7db0-41dd-854e-76ef9db63e9f-tmpfs\") pod \"packageserver-d55dfcdfc-qghn4\" (UID: \"886294ef-7db0-41dd-854e-76ef9db63e9f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737458 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56j24\" (UniqueName: \"kubernetes.io/projected/64301027-d188-47a4-a4a3-72b00d2d100a-kube-api-access-56j24\") pod \"migrator-59844c95c7-ktkt5\" (UID: \"64301027-d188-47a4-a4a3-72b00d2d100a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ktkt5" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737523 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/1347078a-8c1c-4e33-aee1-8aaad00829f2-csi-data-dir\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737540 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1347078a-8c1c-4e33-aee1-8aaad00829f2-plugins-dir\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737556 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d67a491-1c7f-4898-bc78-a2a7d75278dc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tddtz\" (UID: \"7d67a491-1c7f-4898-bc78-a2a7d75278dc\") " pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737595 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwflc\" (UniqueName: \"kubernetes.io/projected/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-kube-api-access-qwflc\") pod \"collect-profiles-29561775-fpcjj\" (UID: \"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737615 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1347078a-8c1c-4e33-aee1-8aaad00829f2-csi-data-dir\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737617 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4qm8w\" (UniqueName: \"kubernetes.io/projected/2bc15dc4-de67-4b17-8f95-9d8772166b35-kube-api-access-4qm8w\") pod \"machine-config-operator-74547568cd-vt4qh\" (UID: \"2bc15dc4-de67-4b17-8f95-9d8772166b35\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737695 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1347078a-8c1c-4e33-aee1-8aaad00829f2-socket-dir\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737721 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhhdw\" (UniqueName: \"kubernetes.io/projected/7d67a491-1c7f-4898-bc78-a2a7d75278dc-kube-api-access-nhhdw\") pod \"marketplace-operator-79b997595-tddtz\" (UID: \"7d67a491-1c7f-4898-bc78-a2a7d75278dc\") " pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737771 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1347078a-8c1c-4e33-aee1-8aaad00829f2-registration-dir\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737817 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f0a4ebae-87f9-4329-bcd7-62cc2e3898fd-signing-cabundle\") pod \"service-ca-9c57cc56f-gc6g2\" (UID: \"f0a4ebae-87f9-4329-bcd7-62cc2e3898fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-gc6g2" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737849 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f0a4ebae-87f9-4329-bcd7-62cc2e3898fd-signing-key\") pod \"service-ca-9c57cc56f-gc6g2\" (UID: \"f0a4ebae-87f9-4329-bcd7-62cc2e3898fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-gc6g2" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737817 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1347078a-8c1c-4e33-aee1-8aaad00829f2-socket-dir\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737877 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/886294ef-7db0-41dd-854e-76ef9db63e9f-webhook-cert\") pod \"packageserver-d55dfcdfc-qghn4\" (UID: \"886294ef-7db0-41dd-854e-76ef9db63e9f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737890 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1347078a-8c1c-4e33-aee1-8aaad00829f2-registration-dir\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.737918 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/886294ef-7db0-41dd-854e-76ef9db63e9f-apiservice-cert\") pod \"packageserver-d55dfcdfc-qghn4\" (UID: \"886294ef-7db0-41dd-854e-76ef9db63e9f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.738045 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1347078a-8c1c-4e33-aee1-8aaad00829f2-mountpoint-dir\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.738153 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdxk4\" (UniqueName: \"kubernetes.io/projected/886294ef-7db0-41dd-854e-76ef9db63e9f-kube-api-access-qdxk4\") pod \"packageserver-d55dfcdfc-qghn4\" (UID: \"886294ef-7db0-41dd-854e-76ef9db63e9f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.738169 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2bc15dc4-de67-4b17-8f95-9d8772166b35-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vt4qh\" (UID: \"2bc15dc4-de67-4b17-8f95-9d8772166b35\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.738224 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1347078a-8c1c-4e33-aee1-8aaad00829f2-mountpoint-dir\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.739725 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-serving-cert\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc 
kubenswrapper[4755]: I0317 00:24:45.760425 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.768592 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-etcd-client\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.775239 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.794851 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.814824 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.820205 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-config\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.835772 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.836447 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-etcd-ca\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.856691 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.875463 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.903012 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.913902 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d755c0ff-4b40-451e-9fb4-4807c6297aaa-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qd8g6\" (UID: \"d755c0ff-4b40-451e-9fb4-4807c6297aaa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.921203 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.928869 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/92c4ceac-e07e-407f-a2d7-5202cc06c29d-serviceca\") pod \"image-pruner-29561760-zz9vv\" (UID: \"92c4ceac-e07e-407f-a2d7-5202cc06c29d\") " pod="openshift-image-registry/image-pruner-29561760-zz9vv" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.935622 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.955973 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.976281 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.983569 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cf64e3f-992b-4f75-9762-839e4a23633a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6nx7f\" (UID: \"4cf64e3f-992b-4f75-9762-839e4a23633a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6nx7f" Mar 17 00:24:45 crc kubenswrapper[4755]: I0317 00:24:45.995929 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.000742 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cf64e3f-992b-4f75-9762-839e4a23633a-config\") pod \"kube-controller-manager-operator-78b949d7b-6nx7f\" (UID: \"4cf64e3f-992b-4f75-9762-839e4a23633a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6nx7f" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.015736 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.035019 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.056490 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.063025 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6173ecdb-39f4-4772-9dbd-5fa6e2908971-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7v8ts\" (UID: \"6173ecdb-39f4-4772-9dbd-5fa6e2908971\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7v8ts" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.075886 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.083030 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6173ecdb-39f4-4772-9dbd-5fa6e2908971-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7v8ts\" (UID: \"6173ecdb-39f4-4772-9dbd-5fa6e2908971\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7v8ts" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.095064 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.106471 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d755c0ff-4b40-451e-9fb4-4807c6297aaa-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qd8g6\" (UID: \"d755c0ff-4b40-451e-9fb4-4807c6297aaa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.115536 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Mar 
17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.156154 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.163314 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlr5s\" (UniqueName: \"kubernetes.io/projected/cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a-kube-api-access-tlr5s\") pod \"openshift-apiserver-operator-796bbdcf4f-bnrkl\" (UID: \"cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bnrkl" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.175975 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.196547 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.263417 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrrzk\" (UniqueName: \"kubernetes.io/projected/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-kube-api-access-wrrzk\") pod \"controller-manager-879f6c89f-6bvf8\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.277056 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.285258 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbqtp\" (UniqueName: \"kubernetes.io/projected/4faedbc4-02bf-41be-9eaa-2974c1a6b8d3-kube-api-access-sbqtp\") pod \"apiserver-76f77b778f-lc64n\" (UID: \"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3\") " pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.294158 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwtxs\" (UniqueName: \"kubernetes.io/projected/d71eafe9-d84b-4a1c-a976-839e5d80bad4-kube-api-access-gwtxs\") pod \"apiserver-7bbb656c7d-fnm9k\" (UID: \"d71eafe9-d84b-4a1c-a976-839e5d80bad4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.316925 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.322615 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmmg6\" (UniqueName: \"kubernetes.io/projected/b78bcc96-a38d-4734-8b94-25cddc46b289-kube-api-access-pmmg6\") pod \"route-controller-manager-6576b87f9c-tjxpj\" (UID: \"b78bcc96-a38d-4734-8b94-25cddc46b289\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.324607 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bnrkl" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.336064 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.355096 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.362796 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.375683 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.383907 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.396962 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.402257 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7d67a491-1c7f-4898-bc78-a2a7d75278dc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tddtz\" (UID: \"7d67a491-1c7f-4898-bc78-a2a7d75278dc\") " pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.422497 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.429119 4755 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d67a491-1c7f-4898-bc78-a2a7d75278dc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tddtz\" (UID: \"7d67a491-1c7f-4898-bc78-a2a7d75278dc\") " pod="openshift-marketplace/marketplace-operator-79b997595-tddtz"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.433789 4755 request.go:700] Waited for 1.001208323s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.462261 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.476314 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.479112 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8-service-ca-bundle\") pod \"router-default-5444994796-7fxgg\" (UID: \"ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8\") " pod="openshift-ingress/router-default-5444994796-7fxgg"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.479478 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r78k2\" (UniqueName: \"kubernetes.io/projected/9e12abed-0865-4f85-b563-ff72e5a05722-kube-api-access-r78k2\") pod \"machine-api-operator-5694c8668f-8rqm6\" (UID: \"9e12abed-0865-4f85-b563-ff72e5a05722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.497018 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.511366 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8-metrics-certs\") pod \"router-default-5444994796-7fxgg\" (UID: \"ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8\") " pod="openshift-ingress/router-default-5444994796-7fxgg"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.512174 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lc64n"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.515567 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.535884 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.548088 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8-default-certificate\") pod \"router-default-5444994796-7fxgg\" (UID: \"ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8\") " pod="openshift-ingress/router-default-5444994796-7fxgg"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.555470 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.560346 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.569084 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8-stats-auth\") pod \"router-default-5444994796-7fxgg\" (UID: \"ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8\") " pod="openshift-ingress/router-default-5444994796-7fxgg"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.575757 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.590044 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6bvf8"]
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.592207 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bnrkl"]
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.596462 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 17 00:24:46 crc kubenswrapper[4755]: W0317 00:24:46.604575 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca30bd23_64c8_4399_bb0d_cbe5a92a2b5b.slice/crio-4b162a7529418ecd58d450f0972f348088429cfb5cc285f1a82ab8b4813349bf WatchSource:0}: Error finding container 4b162a7529418ecd58d450f0972f348088429cfb5cc285f1a82ab8b4813349bf: Status 404 returned error can't find the container with id 4b162a7529418ecd58d450f0972f348088429cfb5cc285f1a82ab8b4813349bf
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.615953 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.629697 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj"]
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.632592 4755 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.632692 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18513cd3-96dd-41ef-94ba-9f4a629015fb-certs podName:18513cd3-96dd-41ef-94ba-9f4a629015fb nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.132645367 +0000 UTC m=+161.892097650 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/18513cd3-96dd-41ef-94ba-9f4a629015fb-certs") pod "machine-config-server-x466w" (UID: "18513cd3-96dd-41ef-94ba-9f4a629015fb") : failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.632761 4755 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.632798 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4474997a-d442-4f8a-b50e-c6ecab9393f5-profile-collector-cert podName:4474997a-d442-4f8a-b50e-c6ecab9393f5 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.132788862 +0000 UTC m=+161.892241145 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/4474997a-d442-4f8a-b50e-c6ecab9393f5-profile-collector-cert") pod "olm-operator-6b444d44fb-jcw6v" (UID: "4474997a-d442-4f8a-b50e-c6ecab9393f5") : failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.632835 4755 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.634474 4755 secret.go:188] Couldn't get secret openshift-kube-storage-version-migrator-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.635227 4755 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.635399 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.635503 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44c47b93-07a7-414f-9548-ea1dd505c452-cert podName:44c47b93-07a7-414f-9548-ea1dd505c452 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.132912506 +0000 UTC m=+161.892364869 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/44c47b93-07a7-414f-9548-ea1dd505c452-cert") pod "ingress-canary-g4dtp" (UID: "44c47b93-07a7-414f-9548-ea1dd505c452") : failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.635533 4755 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.635561 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8-serving-cert podName:65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.135530106 +0000 UTC m=+161.894982389 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8-serving-cert") pod "kube-storage-version-migrator-operator-b67b599dd-cx96g" (UID: "65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8") : failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.635577 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18513cd3-96dd-41ef-94ba-9f4a629015fb-node-bootstrap-token podName:18513cd3-96dd-41ef-94ba-9f4a629015fb nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.135570977 +0000 UTC m=+161.895023260 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/18513cd3-96dd-41ef-94ba-9f4a629015fb-node-bootstrap-token") pod "machine-config-server-x466w" (UID: "18513cd3-96dd-41ef-94ba-9f4a629015fb") : failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.635584 4755 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.635592 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0771adb9-eee6-469f-b734-7d28e8a50d41-webhook-certs podName:0771adb9-eee6-469f-b734-7d28e8a50d41 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.135587148 +0000 UTC m=+161.895039431 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0771adb9-eee6-469f-b734-7d28e8a50d41-webhook-certs") pod "multus-admission-controller-857f4d67dd-flgvl" (UID: "0771adb9-eee6-469f-b734-7d28e8a50d41") : failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.635608 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/13d9a24a-129b-40b0-9fa1-1bd7d595f109-config-volume podName:13d9a24a-129b-40b0-9fa1-1bd7d595f109 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.135598488 +0000 UTC m=+161.895050771 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/13d9a24a-129b-40b0-9fa1-1bd7d595f109-config-volume") pod "dns-default-w595z" (UID: "13d9a24a-129b-40b0-9fa1-1bd7d595f109") : failed to sync configmap cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.636464 4755 configmap.go:193] Couldn't get configMap openshift-multus/cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.636505 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/183616f4-5ea5-4c31-a465-edb5b837ca8f-cni-sysctl-allowlist podName:183616f4-5ea5-4c31-a465-edb5b837ca8f nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.136495279 +0000 UTC m=+161.895947562 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/183616f4-5ea5-4c31-a465-edb5b837ca8f-cni-sysctl-allowlist") pod "cni-sysctl-allowlist-ds-574nd" (UID: "183616f4-5ea5-4c31-a465-edb5b837ca8f") : failed to sync configmap cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.638338 4755 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.638367 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4474997a-d442-4f8a-b50e-c6ecab9393f5-srv-cert podName:4474997a-d442-4f8a-b50e-c6ecab9393f5 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.138358863 +0000 UTC m=+161.897811146 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4474997a-d442-4f8a-b50e-c6ecab9393f5-srv-cert") pod "olm-operator-6b444d44fb-jcw6v" (UID: "4474997a-d442-4f8a-b50e-c6ecab9393f5") : failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.639788 4755 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.639820 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13d9a24a-129b-40b0-9fa1-1bd7d595f109-metrics-tls podName:13d9a24a-129b-40b0-9fa1-1bd7d595f109 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.139811303 +0000 UTC m=+161.899263586 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13d9a24a-129b-40b0-9fa1-1bd7d595f109-metrics-tls") pod "dns-default-w595z" (UID: "13d9a24a-129b-40b0-9fa1-1bd7d595f109") : failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.640804 4755 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.640877 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f07cb15-0d7a-4412-a5e4-5489e61bb8ef-serving-cert podName:1f07cb15-0d7a-4412-a5e4-5489e61bb8ef nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.140856759 +0000 UTC m=+161.900309042 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/1f07cb15-0d7a-4412-a5e4-5489e61bb8ef-serving-cert") pod "service-ca-operator-777779d784-z4wp4" (UID: "1f07cb15-0d7a-4412-a5e4-5489e61bb8ef") : failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.641972 4755 configmap.go:193] Couldn't get configMap openshift-kube-storage-version-migrator-operator/config: failed to sync configmap cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.642006 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8-config podName:65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.141997979 +0000 UTC m=+161.901450262 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8-config") pod "kube-storage-version-migrator-operator-b67b599dd-cx96g" (UID: "65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8") : failed to sync configmap cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.642186 4755 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.642208 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1f07cb15-0d7a-4412-a5e4-5489e61bb8ef-config podName:1f07cb15-0d7a-4412-a5e4-5489e61bb8ef nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.142201835 +0000 UTC m=+161.901654118 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/1f07cb15-0d7a-4412-a5e4-5489e61bb8ef-config") pod "service-ca-operator-777779d784-z4wp4" (UID: "1f07cb15-0d7a-4412-a5e4-5489e61bb8ef") : failed to sync configmap cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.659034 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.666955 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k"]
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.675377 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.696304 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.701883 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/886294ef-7db0-41dd-854e-76ef9db63e9f-apiservice-cert\") pod \"packageserver-d55dfcdfc-qghn4\" (UID: \"886294ef-7db0-41dd-854e-76ef9db63e9f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.703010 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/886294ef-7db0-41dd-854e-76ef9db63e9f-webhook-cert\") pod \"packageserver-d55dfcdfc-qghn4\" (UID: \"886294ef-7db0-41dd-854e-76ef9db63e9f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.715629 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.718106 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lc64n"]
Mar 17 00:24:46 crc kubenswrapper[4755]: W0317 00:24:46.723790 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4faedbc4_02bf_41be_9eaa_2974c1a6b8d3.slice/crio-6be78523a547af282ca471f58b4eaf63ae0ddcc1662b81c876255a18714637ad WatchSource:0}: Error finding container 6be78523a547af282ca471f58b4eaf63ae0ddcc1662b81c876255a18714637ad: Status 404 returned error can't find the container with id 6be78523a547af282ca471f58b4eaf63ae0ddcc1662b81c876255a18714637ad
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.735104 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.737017 4755 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.737084 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2bc15dc4-de67-4b17-8f95-9d8772166b35-images podName:2bc15dc4-de67-4b17-8f95-9d8772166b35 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.237067072 +0000 UTC m=+161.996519345 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/2bc15dc4-de67-4b17-8f95-9d8772166b35-images") pod "machine-config-operator-74547568cd-vt4qh" (UID: "2bc15dc4-de67-4b17-8f95-9d8772166b35") : failed to sync configmap cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.737183 4755 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.737249 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bc15dc4-de67-4b17-8f95-9d8772166b35-proxy-tls podName:2bc15dc4-de67-4b17-8f95-9d8772166b35 nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.237228368 +0000 UTC m=+161.996680731 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2bc15dc4-de67-4b17-8f95-9d8772166b35-proxy-tls") pod "machine-config-operator-74547568cd-vt4qh" (UID: "2bc15dc4-de67-4b17-8f95-9d8772166b35") : failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.737261 4755 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.737287 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-config-volume podName:cfcd13b3-4ede-42eb-8b04-b2d572f7f64c nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.237280209 +0000 UTC m=+161.996732493 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-config-volume") pod "collect-profiles-29561775-fpcjj" (UID: "cfcd13b3-4ede-42eb-8b04-b2d572f7f64c") : failed to sync configmap cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.738582 4755 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.738708 4755 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.738738 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f0a4ebae-87f9-4329-bcd7-62cc2e3898fd-signing-cabundle podName:f0a4ebae-87f9-4329-bcd7-62cc2e3898fd nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.238727089 +0000 UTC m=+161.998179372 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/f0a4ebae-87f9-4329-bcd7-62cc2e3898fd-signing-cabundle") pod "service-ca-9c57cc56f-gc6g2" (UID: "f0a4ebae-87f9-4329-bcd7-62cc2e3898fd") : failed to sync configmap cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.738755 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-secret-volume podName:cfcd13b3-4ede-42eb-8b04-b2d572f7f64c nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.23874731 +0000 UTC m=+161.998199593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-volume" (UniqueName: "kubernetes.io/secret/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-secret-volume") pod "collect-profiles-29561775-fpcjj" (UID: "cfcd13b3-4ede-42eb-8b04-b2d572f7f64c") : failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.738784 4755 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: E0317 00:24:46.738808 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d48d9cab-c4a2-4727-ad9e-61ae13fe0dff-package-server-manager-serving-cert podName:d48d9cab-c4a2-4727-ad9e-61ae13fe0dff nodeName:}" failed. No retries permitted until 2026-03-17 00:24:47.238801412 +0000 UTC m=+161.998253695 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/d48d9cab-c4a2-4727-ad9e-61ae13fe0dff-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-m6qr7" (UID: "d48d9cab-c4a2-4727-ad9e-61ae13fe0dff") : failed to sync secret cache: timed out waiting for the condition
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.745027 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f0a4ebae-87f9-4329-bcd7-62cc2e3898fd-signing-key\") pod \"service-ca-9c57cc56f-gc6g2\" (UID: \"f0a4ebae-87f9-4329-bcd7-62cc2e3898fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-gc6g2"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.758516 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.775268 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.780976 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8rqm6"]
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.794668 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 17 00:24:46 crc kubenswrapper[4755]: W0317 00:24:46.805932 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e12abed_0865_4f85_b563_ff72e5a05722.slice/crio-a67d757ebab61ae9ad943f6304c0e75b5f0c3d83b4dee39b03b41499bd543c2b WatchSource:0}: Error finding container a67d757ebab61ae9ad943f6304c0e75b5f0c3d83b4dee39b03b41499bd543c2b: Status 404 returned error can't find the container with id a67d757ebab61ae9ad943f6304c0e75b5f0c3d83b4dee39b03b41499bd543c2b
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.814915 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.835611 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.855673 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.874905 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.894936 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.915345 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.931881 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bnrkl" event={"ID":"cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a","Type":"ContainerStarted","Data":"f2b17e432767ea99a49ba9894d064cd9b4e4d6e1b57c8d39d0be2baea6307000"}
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.931923 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bnrkl" event={"ID":"cb8945e4-2fa1-43f5-94e9-b2a147dfbf9a","Type":"ContainerStarted","Data":"f3588a8e8b4da1367fd39d003363fead749cef41e3af4dafb7b8cc40dd231cb1"}
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.936313 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.937824 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lc64n" event={"ID":"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3","Type":"ContainerStarted","Data":"96d07866212aa8bcc559cba1504b665951ec5d9ff3c39525a26b76a4d1db0b11"}
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.937869 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lc64n" event={"ID":"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3","Type":"ContainerStarted","Data":"6be78523a547af282ca471f58b4eaf63ae0ddcc1662b81c876255a18714637ad"}
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.939907 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" event={"ID":"b78bcc96-a38d-4734-8b94-25cddc46b289","Type":"ContainerStarted","Data":"1ca385a8f40d698a02a08560401c13b537c243c8fdb2ec16501fa1e1aaa88ba2"}
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.939977 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.939993 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" event={"ID":"b78bcc96-a38d-4734-8b94-25cddc46b289","Type":"ContainerStarted","Data":"dfb37be85a662dd1f92fb5c50eae01d031b7d4f32345288cd9193c6a061eed20"}
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.941880 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6" event={"ID":"9e12abed-0865-4f85-b563-ff72e5a05722","Type":"ContainerStarted","Data":"3b7c4403c486a09b31c54404885efa836f776d81b6dd700da59d67e20aec2a8b"}
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.941911 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6" event={"ID":"9e12abed-0865-4f85-b563-ff72e5a05722","Type":"ContainerStarted","Data":"a67d757ebab61ae9ad943f6304c0e75b5f0c3d83b4dee39b03b41499bd543c2b"}
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.942368 4755 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-tjxpj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.942526 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" podUID="b78bcc96-a38d-4734-8b94-25cddc46b289" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.945078 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" event={"ID":"d71eafe9-d84b-4a1c-a976-839e5d80bad4","Type":"ContainerStarted","Data":"91631ccae8a573ab4df7f4c4c7be1f6d7c75fcef230f954aec25c330f4158396"}
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.945120 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" event={"ID":"d71eafe9-d84b-4a1c-a976-839e5d80bad4","Type":"ContainerStarted","Data":"ea3ea0f1b91074aec6f64980ef7f54505c6169716bf28ea5c5c0979ec9c3495f"}
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.947216 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" event={"ID":"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b","Type":"ContainerStarted","Data":"f7b6e196d6bc13445df992bb6d9d43ed640c6196844dc8ceebd8b7f4f7491538"}
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.947247 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" event={"ID":"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b","Type":"ContainerStarted","Data":"4b162a7529418ecd58d450f0972f348088429cfb5cc285f1a82ab8b4813349bf"}
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.948376 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.950658 4755 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6bvf8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.950692 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" podUID="ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.954995 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.975066 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 00:24:46 crc kubenswrapper[4755]: I0317 00:24:46.996023 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.015220 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.035564 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.055014 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.075179 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.095235 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.115567 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.136374 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.155499 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.174027 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4474997a-d442-4f8a-b50e-c6ecab9393f5-srv-cert\") pod \"olm-operator-6b444d44fb-jcw6v\" (UID: \"4474997a-d442-4f8a-b50e-c6ecab9393f5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v"
Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.174133 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4474997a-d442-4f8a-b50e-c6ecab9393f5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jcw6v\" (UID: \"4474997a-d442-4f8a-b50e-c6ecab9393f5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v"
Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.174226 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/18513cd3-96dd-41ef-94ba-9f4a629015fb-certs\") pod \"machine-config-server-x466w\" (UID: \"18513cd3-96dd-41ef-94ba-9f4a629015fb\") " pod="openshift-machine-config-operator/machine-config-server-x466w"
Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.174289 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44c47b93-07a7-414f-9548-ea1dd505c452-cert\") pod \"ingress-canary-g4dtp\" (UID: \"44c47b93-07a7-414f-9548-ea1dd505c452\") " pod="openshift-ingress-canary/ingress-canary-g4dtp"
Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.174787 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cx96g\" (UID: \"65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g"
Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.175161 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/18513cd3-96dd-41ef-94ba-9f4a629015fb-node-bootstrap-token\") pod \"machine-config-server-x466w\" (UID: \"18513cd3-96dd-41ef-94ba-9f4a629015fb\") " pod="openshift-machine-config-operator/machine-config-server-x466w"
Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.175203 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13d9a24a-129b-40b0-9fa1-1bd7d595f109-config-volume\") pod \"dns-default-w595z\" (UID: \"13d9a24a-129b-40b0-9fa1-1bd7d595f109\") " pod="openshift-dns/dns-default-w595z"
Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.175259 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13d9a24a-129b-40b0-9fa1-1bd7d595f109-metrics-tls\") pod \"dns-default-w595z\" (UID: \"13d9a24a-129b-40b0-9fa1-1bd7d595f109\") " pod="openshift-dns/dns-default-w595z"
Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.175305 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f07cb15-0d7a-4412-a5e4-5489e61bb8ef-config\") pod
\"service-ca-operator-777779d784-z4wp4\" (UID: \"1f07cb15-0d7a-4412-a5e4-5489e61bb8ef\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.175339 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cx96g\" (UID: \"65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.175479 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f07cb15-0d7a-4412-a5e4-5489e61bb8ef-serving-cert\") pod \"service-ca-operator-777779d784-z4wp4\" (UID: \"1f07cb15-0d7a-4412-a5e4-5489e61bb8ef\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.176180 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0771adb9-eee6-469f-b734-7d28e8a50d41-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-flgvl\" (UID: \"0771adb9-eee6-469f-b734-7d28e8a50d41\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-flgvl" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.176221 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cx96g\" (UID: \"65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.176382 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/183616f4-5ea5-4c31-a465-edb5b837ca8f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-574nd\" (UID: \"183616f4-5ea5-4c31-a465-edb5b837ca8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.177019 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f07cb15-0d7a-4412-a5e4-5489e61bb8ef-config\") pod \"service-ca-operator-777779d784-z4wp4\" (UID: \"1f07cb15-0d7a-4412-a5e4-5489e61bb8ef\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.177793 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.178796 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/44c47b93-07a7-414f-9548-ea1dd505c452-cert\") pod \"ingress-canary-g4dtp\" (UID: \"44c47b93-07a7-414f-9548-ea1dd505c452\") " pod="openshift-ingress-canary/ingress-canary-g4dtp" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.179115 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cx96g\" (UID: \"65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.179142 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4474997a-d442-4f8a-b50e-c6ecab9393f5-srv-cert\") pod 
\"olm-operator-6b444d44fb-jcw6v\" (UID: \"4474997a-d442-4f8a-b50e-c6ecab9393f5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.180126 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f07cb15-0d7a-4412-a5e4-5489e61bb8ef-serving-cert\") pod \"service-ca-operator-777779d784-z4wp4\" (UID: \"1f07cb15-0d7a-4412-a5e4-5489e61bb8ef\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.180746 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4474997a-d442-4f8a-b50e-c6ecab9393f5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jcw6v\" (UID: \"4474997a-d442-4f8a-b50e-c6ecab9393f5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.196031 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.201129 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0771adb9-eee6-469f-b734-7d28e8a50d41-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-flgvl\" (UID: \"0771adb9-eee6-469f-b734-7d28e8a50d41\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-flgvl" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.215323 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.235751 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 17 
00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.257632 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.276513 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.277816 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-secret-volume\") pod \"collect-profiles-29561775-fpcjj\" (UID: \"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.277981 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d48d9cab-c4a2-4727-ad9e-61ae13fe0dff-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-m6qr7\" (UID: \"d48d9cab-c4a2-4727-ad9e-61ae13fe0dff\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.280192 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f0a4ebae-87f9-4329-bcd7-62cc2e3898fd-signing-cabundle\") pod \"service-ca-9c57cc56f-gc6g2\" (UID: \"f0a4ebae-87f9-4329-bcd7-62cc2e3898fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-gc6g2" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.278429 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f0a4ebae-87f9-4329-bcd7-62cc2e3898fd-signing-cabundle\") pod \"service-ca-9c57cc56f-gc6g2\" (UID: 
\"f0a4ebae-87f9-4329-bcd7-62cc2e3898fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-gc6g2" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.280769 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2bc15dc4-de67-4b17-8f95-9d8772166b35-proxy-tls\") pod \"machine-config-operator-74547568cd-vt4qh\" (UID: \"2bc15dc4-de67-4b17-8f95-9d8772166b35\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.280838 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2bc15dc4-de67-4b17-8f95-9d8772166b35-images\") pod \"machine-config-operator-74547568cd-vt4qh\" (UID: \"2bc15dc4-de67-4b17-8f95-9d8772166b35\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.281669 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-secret-volume\") pod \"collect-profiles-29561775-fpcjj\" (UID: \"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.281895 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2bc15dc4-de67-4b17-8f95-9d8772166b35-images\") pod \"machine-config-operator-74547568cd-vt4qh\" (UID: \"2bc15dc4-de67-4b17-8f95-9d8772166b35\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.280904 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-config-volume\") pod \"collect-profiles-29561775-fpcjj\" (UID: \"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.282279 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-config-volume\") pod \"collect-profiles-29561775-fpcjj\" (UID: \"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.283121 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d48d9cab-c4a2-4727-ad9e-61ae13fe0dff-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-m6qr7\" (UID: \"d48d9cab-c4a2-4727-ad9e-61ae13fe0dff\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.296324 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.315981 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.324514 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2bc15dc4-de67-4b17-8f95-9d8772166b35-proxy-tls\") pod \"machine-config-operator-74547568cd-vt4qh\" (UID: \"2bc15dc4-de67-4b17-8f95-9d8772166b35\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.335930 4755 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.355475 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.356584 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13d9a24a-129b-40b0-9fa1-1bd7d595f109-config-volume\") pod \"dns-default-w595z\" (UID: \"13d9a24a-129b-40b0-9fa1-1bd7d595f109\") " pod="openshift-dns/dns-default-w595z" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.375649 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.395118 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.401360 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13d9a24a-129b-40b0-9fa1-1bd7d595f109-metrics-tls\") pod \"dns-default-w595z\" (UID: \"13d9a24a-129b-40b0-9fa1-1bd7d595f109\") " pod="openshift-dns/dns-default-w595z" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.415521 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.418059 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/183616f4-5ea5-4c31-a465-edb5b837ca8f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-574nd\" (UID: \"183616f4-5ea5-4c31-a465-edb5b837ca8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.434259 4755 
request.go:700] Waited for 1.918516275s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/secrets?fieldSelector=metadata.name%3Dcsi-hostpath-provisioner-sa-dockercfg-qd74k&limit=500&resourceVersion=0 Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.436716 4755 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.455780 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.475709 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.488695 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/18513cd3-96dd-41ef-94ba-9f4a629015fb-certs\") pod \"machine-config-server-x466w\" (UID: \"18513cd3-96dd-41ef-94ba-9f4a629015fb\") " pod="openshift-machine-config-operator/machine-config-server-x466w" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.496173 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.516741 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.529686 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/18513cd3-96dd-41ef-94ba-9f4a629015fb-node-bootstrap-token\") pod \"machine-config-server-x466w\" (UID: \"18513cd3-96dd-41ef-94ba-9f4a629015fb\") " 
pod="openshift-machine-config-operator/machine-config-server-x466w" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.535013 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.591803 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-w5wk8\" (UID: \"bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5wk8" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.612690 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqndp\" (UniqueName: \"kubernetes.io/projected/64e28e17-1dd0-401e-9b26-7eefc1c54f5f-kube-api-access-nqndp\") pod \"downloads-7954f5f757-8gjxd\" (UID: \"64e28e17-1dd0-401e-9b26-7eefc1c54f5f\") " pod="openshift-console/downloads-7954f5f757-8gjxd" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.635025 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j594d\" (UniqueName: \"kubernetes.io/projected/d755c0ff-4b40-451e-9fb4-4807c6297aaa-kube-api-access-j594d\") pod \"cluster-image-registry-operator-dc59b4c8b-qd8g6\" (UID: \"d755c0ff-4b40-451e-9fb4-4807c6297aaa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.641588 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8gjxd" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.649522 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5wk8" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.653980 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnjcd\" (UniqueName: \"kubernetes.io/projected/18513cd3-96dd-41ef-94ba-9f4a629015fb-kube-api-access-nnjcd\") pod \"machine-config-server-x466w\" (UID: \"18513cd3-96dd-41ef-94ba-9f4a629015fb\") " pod="openshift-machine-config-operator/machine-config-server-x466w" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.676169 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m64nd\" (UniqueName: \"kubernetes.io/projected/65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8-kube-api-access-m64nd\") pod \"kube-storage-version-migrator-operator-b67b599dd-cx96g\" (UID: \"65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.687719 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x466w" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.707082 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpxgq\" (UniqueName: \"kubernetes.io/projected/54a2312d-9a82-4642-aee1-83e701a54908-kube-api-access-dpxgq\") pod \"machine-approver-56656f9798-7p9vc\" (UID: \"54a2312d-9a82-4642-aee1-83e701a54908\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.723932 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zrx8\" (UniqueName: \"kubernetes.io/projected/1f07cb15-0d7a-4412-a5e4-5489e61bb8ef-kube-api-access-2zrx8\") pod \"service-ca-operator-777779d784-z4wp4\" (UID: \"1f07cb15-0d7a-4412-a5e4-5489e61bb8ef\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.733723 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d755c0ff-4b40-451e-9fb4-4807c6297aaa-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qd8g6\" (UID: \"d755c0ff-4b40-451e-9fb4-4807c6297aaa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.749927 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jcp6\" (UniqueName: \"kubernetes.io/projected/0771adb9-eee6-469f-b734-7d28e8a50d41-kube-api-access-4jcp6\") pod \"multus-admission-controller-857f4d67dd-flgvl\" (UID: \"0771adb9-eee6-469f-b734-7d28e8a50d41\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-flgvl" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.774174 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9jlr\" (UniqueName: 
\"kubernetes.io/projected/44c47b93-07a7-414f-9548-ea1dd505c452-kube-api-access-c9jlr\") pod \"ingress-canary-g4dtp\" (UID: \"44c47b93-07a7-414f-9548-ea1dd505c452\") " pod="openshift-ingress-canary/ingress-canary-g4dtp" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.793463 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvlsg\" (UniqueName: \"kubernetes.io/projected/d46784e1-420c-4d3b-aca7-65271a898c44-kube-api-access-jvlsg\") pod \"auto-csr-approver-29561784-zznp2\" (UID: \"d46784e1-420c-4d3b-aca7-65271a898c44\") " pod="openshift-infra/auto-csr-approver-29561784-zznp2" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.809017 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls958\" (UniqueName: \"kubernetes.io/projected/689a5612-54fa-44b9-8b91-1b03d916a159-kube-api-access-ls958\") pod \"openshift-controller-manager-operator-756b6f6bc6-jvsbw\" (UID: \"689a5612-54fa-44b9-8b91-1b03d916a159\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvsbw" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.858023 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5wk8"] Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.862926 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.863161 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a39a0363-2d49-4e67-ac77-a381064d06a0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-zm5fc\" (UID: \"a39a0363-2d49-4e67-ac77-a381064d06a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.863698 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6bb8\" (UniqueName: \"kubernetes.io/projected/659184e0-0620-4930-8faa-22e586cf403a-kube-api-access-c6bb8\") pod \"dns-operator-744455d44c-p4wgk\" (UID: \"659184e0-0620-4930-8faa-22e586cf403a\") " pod="openshift-dns-operator/dns-operator-744455d44c-p4wgk" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.890671 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561784-zznp2" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.894330 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.894667 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w67sr\" (UniqueName: \"kubernetes.io/projected/b2ec378f-6a5f-48cd-ad14-1445d110a829-kube-api-access-w67sr\") pod \"console-operator-58897d9998-pgc8x\" (UID: \"b2ec378f-6a5f-48cd-ad14-1445d110a829\") " pod="openshift-console-operator/console-operator-58897d9998-pgc8x" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.901099 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.913002 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-g4dtp" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.920685 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-flgvl" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.922859 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4fwn\" (UniqueName: \"kubernetes.io/projected/3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f-kube-api-access-s4fwn\") pod \"etcd-operator-b45778765-28bml\" (UID: \"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.922978 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pgc8x" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.933344 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvsbw" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.953105 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8mvb\" (UniqueName: \"kubernetes.io/projected/69234b24-fdb7-40bf-828b-104d8d43891c-kube-api-access-g8mvb\") pod \"authentication-operator-69f744f599-7vn7g\" (UID: \"69234b24-fdb7-40bf-828b-104d8d43891c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.953306 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8f55\" (UniqueName: \"kubernetes.io/projected/4474997a-d442-4f8a-b50e-c6ecab9393f5-kube-api-access-f8f55\") pod \"olm-operator-6b444d44fb-jcw6v\" (UID: \"4474997a-d442-4f8a-b50e-c6ecab9393f5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.967264 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6173ecdb-39f4-4772-9dbd-5fa6e2908971-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7v8ts\" (UID: \"6173ecdb-39f4-4772-9dbd-5fa6e2908971\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7v8ts" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.967661 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5wk8" event={"ID":"bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed","Type":"ContainerStarted","Data":"8d02548d51bd775f861364c73b4a12328f70b198e5805e09e40106f5781198a9"} Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.975073 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmwtq\" (UniqueName: 
\"kubernetes.io/projected/1c0bace3-bc39-44fa-9f39-d8f18c07675d-kube-api-access-pmwtq\") pod \"openshift-config-operator-7777fb866f-d4wtw\" (UID: \"1c0bace3-bc39-44fa-9f39-d8f18c07675d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.975938 4755 generic.go:334] "Generic (PLEG): container finished" podID="4faedbc4-02bf-41be-9eaa-2974c1a6b8d3" containerID="96d07866212aa8bcc559cba1504b665951ec5d9ff3c39525a26b76a4d1db0b11" exitCode=0 Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.976659 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lc64n" event={"ID":"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3","Type":"ContainerDied","Data":"96d07866212aa8bcc559cba1504b665951ec5d9ff3c39525a26b76a4d1db0b11"} Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.980574 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-p4wgk" Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.984045 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8gjxd"] Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.992391 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x466w" event={"ID":"18513cd3-96dd-41ef-94ba-9f4a629015fb","Type":"ContainerStarted","Data":"741b03cb52071803f9411189016e57a64023ef1b689d23e41f632438364ce0ab"} Mar 17 00:24:47 crc kubenswrapper[4755]: I0317 00:24:47.992426 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x466w" event={"ID":"18513cd3-96dd-41ef-94ba-9f4a629015fb","Type":"ContainerStarted","Data":"d8cded6888377f8e7ba9a0d717ed502952553db6f24006fc699d75d3606d9c55"} Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.002016 4755 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-z54f5\" (UniqueName: \"kubernetes.io/projected/13d9a24a-129b-40b0-9fa1-1bd7d595f109-kube-api-access-z54f5\") pod \"dns-default-w595z\" (UID: \"13d9a24a-129b-40b0-9fa1-1bd7d595f109\") " pod="openshift-dns/dns-default-w595z" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.007088 4755 generic.go:334] "Generic (PLEG): container finished" podID="d71eafe9-d84b-4a1c-a976-839e5d80bad4" containerID="91631ccae8a573ab4df7f4c4c7be1f6d7c75fcef230f954aec25c330f4158396" exitCode=0 Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.007154 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" event={"ID":"d71eafe9-d84b-4a1c-a976-839e5d80bad4","Type":"ContainerDied","Data":"91631ccae8a573ab4df7f4c4c7be1f6d7c75fcef230f954aec25c330f4158396"} Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.007182 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" event={"ID":"d71eafe9-d84b-4a1c-a976-839e5d80bad4","Type":"ContainerStarted","Data":"e8247cb9feafb5d216fcbb729e94d29e6f77cf753e6c6294c470a30e4bf3f70c"} Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.009112 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6" event={"ID":"9e12abed-0865-4f85-b563-ff72e5a05722","Type":"ContainerStarted","Data":"b84e80098b4683f2b816780d057367769271b5e44fda4d1e95771a78e1109b39"} Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.011064 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" event={"ID":"54a2312d-9a82-4642-aee1-83e701a54908","Type":"ContainerStarted","Data":"86027ce170d1034ce9383fcc4b8dcf11e477fa20bcad6102378382d5f8aa059c"} Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.012998 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.015667 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.018326 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.020803 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmdmd\" (UniqueName: \"kubernetes.io/projected/ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8-kube-api-access-gmdmd\") pod \"router-default-5444994796-7fxgg\" (UID: \"ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8\") " pod="openshift-ingress/router-default-5444994796-7fxgg" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.027784 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.038352 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf7jt\" (UniqueName: \"kubernetes.io/projected/a39a0363-2d49-4e67-ac77-a381064d06a0-kube-api-access-pf7jt\") pod \"ingress-operator-5b745b69d9-zm5fc\" (UID: \"a39a0363-2d49-4e67-ac77-a381064d06a0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.052180 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh76n\" (UniqueName: \"kubernetes.io/projected/08e826b2-3275-49e6-b833-5494037aac5b-kube-api-access-fh76n\") pod \"control-plane-machine-set-operator-78cbb6b69f-dl89g\" (UID: \"08e826b2-3275-49e6-b833-5494037aac5b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dl89g" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.084767 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7v8ts" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.085629 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h684j\" (UniqueName: \"kubernetes.io/projected/183616f4-5ea5-4c31-a465-edb5b837ca8f-kube-api-access-h684j\") pod \"cni-sysctl-allowlist-ds-574nd\" (UID: \"183616f4-5ea5-4c31-a465-edb5b837ca8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.092415 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4cf64e3f-992b-4f75-9762-839e4a23633a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6nx7f\" (UID: \"4cf64e3f-992b-4f75-9762-839e4a23633a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6nx7f" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.110026 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-7fxgg" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.120175 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87c9n\" (UniqueName: \"kubernetes.io/projected/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-kube-api-access-87c9n\") pod \"console-f9d7485db-2fcmx\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.132100 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqs4z\" (UniqueName: \"kubernetes.io/projected/18736749-34d6-4ce5-a0ff-e8af0ca22cdc-kube-api-access-jqs4z\") pod \"cluster-samples-operator-665b6dd947-4b88z\" (UID: \"18736749-34d6-4ce5-a0ff-e8af0ca22cdc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4b88z" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.164120 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5jrp\" (UniqueName: \"kubernetes.io/projected/998d29bf-29f2-40ff-abd0-6730667f11f6-kube-api-access-j5jrp\") pod \"machine-config-controller-84d6567774-tpxbx\" (UID: \"998d29bf-29f2-40ff-abd0-6730667f11f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.170354 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.185820 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnb4f\" (UniqueName: \"kubernetes.io/projected/50008856-a0a3-4ec3-a48f-5f90891d777e-kube-api-access-rnb4f\") pod \"oauth-openshift-558db77b4-grcxd\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.197589 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx7c4\" (UniqueName: \"kubernetes.io/projected/d48d9cab-c4a2-4727-ad9e-61ae13fe0dff-kube-api-access-xx7c4\") pod \"package-server-manager-789f6589d5-m6qr7\" (UID: \"d48d9cab-c4a2-4727-ad9e-61ae13fe0dff\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.205749 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.213265 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wfmk\" (UniqueName: \"kubernetes.io/projected/f0a4ebae-87f9-4329-bcd7-62cc2e3898fd-kube-api-access-6wfmk\") pod \"service-ca-9c57cc56f-gc6g2\" (UID: \"f0a4ebae-87f9-4329-bcd7-62cc2e3898fd\") " pod="openshift-service-ca/service-ca-9c57cc56f-gc6g2" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.213895 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.213915 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.252409 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w595z" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.273635 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.283244 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.283677 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.283793 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dl89g" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.289391 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.296472 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56j24\" (UniqueName: \"kubernetes.io/projected/64301027-d188-47a4-a4a3-72b00d2d100a-kube-api-access-56j24\") pod \"migrator-59844c95c7-ktkt5\" (UID: \"64301027-d188-47a4-a4a3-72b00d2d100a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ktkt5" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.297357 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4b88z" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.304361 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8w7m\" (UniqueName: \"kubernetes.io/projected/92c4ceac-e07e-407f-a2d7-5202cc06c29d-kube-api-access-v8w7m\") pod \"image-pruner-29561760-zz9vv\" (UID: \"92c4ceac-e07e-407f-a2d7-5202cc06c29d\") " pod="openshift-image-registry/image-pruner-29561760-zz9vv" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.305286 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.317000 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnv7q\" (UniqueName: \"kubernetes.io/projected/1347078a-8c1c-4e33-aee1-8aaad00829f2-kube-api-access-qnv7q\") pod \"csi-hostpathplugin-d2qpt\" (UID: \"1347078a-8c1c-4e33-aee1-8aaad00829f2\") " pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.320859 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6nx7f" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.342106 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4"] Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.342133 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561784-zznp2"] Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.342449 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29561760-zz9vv" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.352461 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhhdw\" (UniqueName: \"kubernetes.io/projected/7d67a491-1c7f-4898-bc78-a2a7d75278dc-kube-api-access-nhhdw\") pod \"marketplace-operator-79b997595-tddtz\" (UID: \"7d67a491-1c7f-4898-bc78-a2a7d75278dc\") " pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.355125 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qm8w\" (UniqueName: \"kubernetes.io/projected/2bc15dc4-de67-4b17-8f95-9d8772166b35-kube-api-access-4qm8w\") pod \"machine-config-operator-74547568cd-vt4qh\" (UID: \"2bc15dc4-de67-4b17-8f95-9d8772166b35\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.361103 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwflc\" (UniqueName: \"kubernetes.io/projected/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-kube-api-access-qwflc\") pod \"collect-profiles-29561775-fpcjj\" (UID: \"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.392041 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ktkt5" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.399018 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.407483 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79a4ec38-af00-43b6-bd22-d3ff75e52d71-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.407531 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-bound-sa-token\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.407554 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn8tm\" (UniqueName: \"kubernetes.io/projected/f2c60e7f-98a3-459f-bf93-a01feb772b92-kube-api-access-hn8tm\") pod \"catalog-operator-68c6474976-766pq\" (UID: \"f2c60e7f-98a3-459f-bf93-a01feb772b92\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.407578 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f2c60e7f-98a3-459f-bf93-a01feb772b92-srv-cert\") pod \"catalog-operator-68c6474976-766pq\" (UID: \"f2c60e7f-98a3-459f-bf93-a01feb772b92\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.407631 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79a4ec38-af00-43b6-bd22-d3ff75e52d71-trusted-ca\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.407652 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf8sz\" (UniqueName: \"kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-kube-api-access-nf8sz\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.407681 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79a4ec38-af00-43b6-bd22-d3ff75e52d71-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.407703 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-registry-tls\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.407732 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f2c60e7f-98a3-459f-bf93-a01feb772b92-profile-collector-cert\") pod \"catalog-operator-68c6474976-766pq\" (UID: \"f2c60e7f-98a3-459f-bf93-a01feb772b92\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.407783 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79a4ec38-af00-43b6-bd22-d3ff75e52d71-registry-certificates\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.407827 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: E0317 00:24:48.408137 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:48.90812251 +0000 UTC m=+163.667574793 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.410145 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdxk4\" (UniqueName: \"kubernetes.io/projected/886294ef-7db0-41dd-854e-76ef9db63e9f-kube-api-access-qdxk4\") pod \"packageserver-d55dfcdfc-qghn4\" (UID: \"886294ef-7db0-41dd-854e-76ef9db63e9f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.417471 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.425660 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.433243 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gc6g2" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.514019 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.514720 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79a4ec38-af00-43b6-bd22-d3ff75e52d71-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.514769 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-registry-tls\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.514881 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f2c60e7f-98a3-459f-bf93-a01feb772b92-profile-collector-cert\") pod \"catalog-operator-68c6474976-766pq\" (UID: \"f2c60e7f-98a3-459f-bf93-a01feb772b92\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq" Mar 17 00:24:48 crc kubenswrapper[4755]: E0317 00:24:48.515111 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:49.015092173 +0000 UTC m=+163.774544456 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.515266 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79a4ec38-af00-43b6-bd22-d3ff75e52d71-registry-certificates\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.515489 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.516041 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79a4ec38-af00-43b6-bd22-d3ff75e52d71-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.516141 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn8tm\" (UniqueName: \"kubernetes.io/projected/f2c60e7f-98a3-459f-bf93-a01feb772b92-kube-api-access-hn8tm\") pod \"catalog-operator-68c6474976-766pq\" (UID: \"f2c60e7f-98a3-459f-bf93-a01feb772b92\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.516167 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-bound-sa-token\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.516203 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f2c60e7f-98a3-459f-bf93-a01feb772b92-srv-cert\") pod \"catalog-operator-68c6474976-766pq\" (UID: \"f2c60e7f-98a3-459f-bf93-a01feb772b92\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.516325 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79a4ec38-af00-43b6-bd22-d3ff75e52d71-trusted-ca\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.516378 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf8sz\" (UniqueName: \"kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-kube-api-access-nf8sz\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.522109 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79a4ec38-af00-43b6-bd22-d3ff75e52d71-registry-certificates\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.549864 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79a4ec38-af00-43b6-bd22-d3ff75e52d71-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.550328 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79a4ec38-af00-43b6-bd22-d3ff75e52d71-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.550822 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-registry-tls\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.552353 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" Mar 17 00:24:48 crc kubenswrapper[4755]: E0317 00:24:48.559338 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:49.059314171 +0000 UTC m=+163.818766454 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.562561 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.617786 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.618667 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f2c60e7f-98a3-459f-bf93-a01feb772b92-profile-collector-cert\") pod \"catalog-operator-68c6474976-766pq\" (UID: \"f2c60e7f-98a3-459f-bf93-a01feb772b92\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.619607 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f2c60e7f-98a3-459f-bf93-a01feb772b92-srv-cert\") pod \"catalog-operator-68c6474976-766pq\" (UID: \"f2c60e7f-98a3-459f-bf93-a01feb772b92\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.620077 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:48 crc kubenswrapper[4755]: E0317 00:24:48.620814 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:49.120653527 +0000 UTC m=+163.880105810 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.627524 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79a4ec38-af00-43b6-bd22-d3ff75e52d71-trusted-ca\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.680904 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-bound-sa-token\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.684017 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn8tm\" (UniqueName: \"kubernetes.io/projected/f2c60e7f-98a3-459f-bf93-a01feb772b92-kube-api-access-hn8tm\") pod \"catalog-operator-68c6474976-766pq\" (UID: \"f2c60e7f-98a3-459f-bf93-a01feb772b92\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.687181 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf8sz\" (UniqueName: \"kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-kube-api-access-nf8sz\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: 
\"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.712173 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" podStartSLOduration=102.712155678 podStartE2EDuration="1m42.712155678s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:48.672377753 +0000 UTC m=+163.431830036" watchObservedRunningTime="2026-03-17 00:24:48.712155678 +0000 UTC m=+163.471607951" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.722873 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: E0317 00:24:48.723234 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:49.223217808 +0000 UTC m=+163.982670091 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.730899 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.808604 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" podStartSLOduration=102.808582668 podStartE2EDuration="1m42.808582668s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:48.806547639 +0000 UTC m=+163.565999942" watchObservedRunningTime="2026-03-17 00:24:48.808582668 +0000 UTC m=+163.568034961" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.825203 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:48 crc kubenswrapper[4755]: E0317 00:24:48.825560 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-17 00:24:49.325542901 +0000 UTC m=+164.084995184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.846534 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq" Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.928320 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:48 crc kubenswrapper[4755]: E0317 00:24:48.928690 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:49.428678742 +0000 UTC m=+164.188131025 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:48 crc kubenswrapper[4755]: I0317 00:24:48.960175 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-8rqm6" podStartSLOduration=102.960157782 podStartE2EDuration="1m42.960157782s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:48.959298763 +0000 UTC m=+163.718751046" watchObservedRunningTime="2026-03-17 00:24:48.960157782 +0000 UTC m=+163.719610075" Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.029171 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:49 crc kubenswrapper[4755]: E0317 00:24:49.029451 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:49.52940153 +0000 UTC m=+164.288853883 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.029524 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:49 crc kubenswrapper[4755]: E0317 00:24:49.029854 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:49.529845314 +0000 UTC m=+164.289297597 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.043565 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8gjxd" event={"ID":"64e28e17-1dd0-401e-9b26-7eefc1c54f5f","Type":"ContainerStarted","Data":"67928f1785d351364aaeb03c8b1bae75a49866121600c50c00034fc9db4028c7"} Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.045359 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561784-zznp2" event={"ID":"d46784e1-420c-4d3b-aca7-65271a898c44","Type":"ContainerStarted","Data":"c6c61e06012714d07cdb0579e6a2840ee42505b94ad8c4c991f7760e3568c6da"} Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.054287 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pgc8x"] Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.054701 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4" event={"ID":"1f07cb15-0d7a-4412-a5e4-5489e61bb8ef","Type":"ContainerStarted","Data":"d614623eb89e37178dc023fbfc29ecfca79eea28148040df0b8a24a53e2e6fd4"} Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.056156 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g"] Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.056964 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/router-default-5444994796-7fxgg" event={"ID":"ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8","Type":"ContainerStarted","Data":"88ed6490000dbe0b7faef2baf25ff94e25566269c2ec124aded7b4b2335b8f77"} Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.059667 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" event={"ID":"183616f4-5ea5-4c31-a465-edb5b837ca8f","Type":"ContainerStarted","Data":"fb0f068cc092d41a711feef64d91c822117bd08edd1488302303a6398a6a78ac"} Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.072653 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bnrkl" podStartSLOduration=104.072633154 podStartE2EDuration="1m44.072633154s" podCreationTimestamp="2026-03-17 00:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:49.071329979 +0000 UTC m=+163.830782262" watchObservedRunningTime="2026-03-17 00:24:49.072633154 +0000 UTC m=+163.832085437" Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.130959 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:49 crc kubenswrapper[4755]: E0317 00:24:49.131953 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:49.631936219 +0000 UTC m=+164.391388502 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.143963 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-g4dtp"] Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.169947 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-flgvl"] Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.234611 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:49 crc kubenswrapper[4755]: E0317 00:24:49.243059 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:49.743042613 +0000 UTC m=+164.502494896 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.261180 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" podStartSLOduration=103.261161865 podStartE2EDuration="1m43.261161865s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:49.223485712 +0000 UTC m=+163.982937995" watchObservedRunningTime="2026-03-17 00:24:49.261161865 +0000 UTC m=+164.020614148" Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.277757 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.336125 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:49 crc kubenswrapper[4755]: E0317 00:24:49.336946 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-17 00:24:49.836925196 +0000 UTC m=+164.596377479 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.354560 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.354542002 podStartE2EDuration="1.354542002s" podCreationTimestamp="2026-03-17 00:24:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:49.337695433 +0000 UTC m=+164.097147716" watchObservedRunningTime="2026-03-17 00:24:49.354542002 +0000 UTC m=+164.113994285" Mar 17 00:24:49 crc kubenswrapper[4755]: W0317 00:24:49.370691 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0771adb9_eee6_469f_b734_7d28e8a50d41.slice/crio-1db32026ec92851fb26ac651c1e75b4ece3ceaae696982a22c0e3f8d9cb868c7 WatchSource:0}: Error finding container 1db32026ec92851fb26ac651c1e75b4ece3ceaae696982a22c0e3f8d9cb868c7: Status 404 returned error can't find the container with id 1db32026ec92851fb26ac651c1e75b4ece3ceaae696982a22c0e3f8d9cb868c7 Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.439955 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: 
\"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:49 crc kubenswrapper[4755]: E0317 00:24:49.440767 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:49.940755672 +0000 UTC m=+164.700207955 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.544568 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:49 crc kubenswrapper[4755]: E0317 00:24:49.544864 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:50.044849365 +0000 UTC m=+164.804301648 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.646548 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:49 crc kubenswrapper[4755]: E0317 00:24:49.647239 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:50.147224959 +0000 UTC m=+164.906677242 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.748374 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:49 crc kubenswrapper[4755]: E0317 00:24:49.748727 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:50.248711554 +0000 UTC m=+165.008163837 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.849601 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:49 crc kubenswrapper[4755]: E0317 00:24:49.849917 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:50.349903988 +0000 UTC m=+165.109356271 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:49 crc kubenswrapper[4755]: I0317 00:24:49.951930 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:49 crc kubenswrapper[4755]: E0317 00:24:49.952595 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:50.452574292 +0000 UTC m=+165.212026585 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.025630 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-x466w" podStartSLOduration=5.025610929 podStartE2EDuration="5.025610929s" podCreationTimestamp="2026-03-17 00:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:49.970297011 +0000 UTC m=+164.729749294" watchObservedRunningTime="2026-03-17 00:24:50.025610929 +0000 UTC m=+164.785063212" Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.055071 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:50 crc kubenswrapper[4755]: E0317 00:24:50.055367 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:50.555356261 +0000 UTC m=+165.314808544 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.097505 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8gjxd" event={"ID":"64e28e17-1dd0-401e-9b26-7eefc1c54f5f","Type":"ContainerStarted","Data":"b6b23c43ba5c7f7be9a59ce40f92e8152f3e8853470965b9a9394f8d87bc1c96"} Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.098559 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8gjxd" Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.106591 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" event={"ID":"54a2312d-9a82-4642-aee1-83e701a54908","Type":"ContainerStarted","Data":"7631eb241bc694c6d911e7a6871deadae47860d887482da9f813603fdfc27794"} Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.106634 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" event={"ID":"54a2312d-9a82-4642-aee1-83e701a54908","Type":"ContainerStarted","Data":"9ee22d615aab1d3e5481234be7466e14771ad97e40d9bdafd516cd1fc827df5a"} Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.109365 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-g4dtp" event={"ID":"44c47b93-07a7-414f-9548-ea1dd505c452","Type":"ContainerStarted","Data":"cdf2ca6d9814ceaa20280740ff724345d4d3cd57bbbe16d9baeab7c5f144993e"} Mar 17 00:24:50 crc 
kubenswrapper[4755]: I0317 00:24:50.109402 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-g4dtp" event={"ID":"44c47b93-07a7-414f-9548-ea1dd505c452","Type":"ContainerStarted","Data":"a00b3149c7fcc62a8660a0fc907c43bd87be058f110d8e05925d63c1948d7667"} Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.110741 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-8gjxd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.110785 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8gjxd" podUID="64e28e17-1dd0-401e-9b26-7eefc1c54f5f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.111522 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-7fxgg" event={"ID":"ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8","Type":"ContainerStarted","Data":"7e8d9dc2cc8e40ce14143b3d8289a01af157d8a83c4eb23ab91d722ce42338a3"} Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.134232 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g" event={"ID":"65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8","Type":"ContainerStarted","Data":"dd83e9f3da1100dd2e55fa060d32f8c427e3e6a059a502a23baae5388c140162"} Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.134283 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g" 
event={"ID":"65cc0e73-2eb5-47ca-8043-e2cb3ecd8de8","Type":"ContainerStarted","Data":"aaa1a46d2459b04798a9a6c71fb92e880dc8ba2d1eabe174ba9d3d6b0c32b447"} Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.156112 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:50 crc kubenswrapper[4755]: E0317 00:24:50.157749 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:50.657711885 +0000 UTC m=+165.417164178 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.166804 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:50 crc kubenswrapper[4755]: E0317 00:24:50.167203 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:50.66718349 +0000 UTC m=+165.426635773 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.170798 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lc64n" event={"ID":"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3","Type":"ContainerStarted","Data":"f4e0a855988d4edfec718a253ec976c2012541c94d1829b2b7bdd399c91d38a4"} Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.170841 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lc64n" event={"ID":"4faedbc4-02bf-41be-9eaa-2974c1a6b8d3","Type":"ContainerStarted","Data":"b63f5955f3c1ffdd64e728e01a118f3b2ccd070b4ce53e7383c3900e5857dbb4"} Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.177881 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-flgvl" event={"ID":"0771adb9-eee6-469f-b734-7d28e8a50d41","Type":"ContainerStarted","Data":"1db32026ec92851fb26ac651c1e75b4ece3ceaae696982a22c0e3f8d9cb868c7"} Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.243095 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" 
event={"ID":"183616f4-5ea5-4c31-a465-edb5b837ca8f","Type":"ContainerStarted","Data":"5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2"} Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.244041 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.259469 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvsbw"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.259641 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pgc8x" event={"ID":"b2ec378f-6a5f-48cd-ad14-1445d110a829","Type":"ContainerStarted","Data":"23c3571e3825df38fc624e9bdcb0074946d3361f24f7b5c3eee3cf6eeb7bc381"} Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.259660 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pgc8x" event={"ID":"b2ec378f-6a5f-48cd-ad14-1445d110a829","Type":"ContainerStarted","Data":"acac11853c00e5846695adfaa276d27d8f6d4bf83cbbf1ad2f137ae419c95fbd"} Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.261758 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-pgc8x" Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.261796 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.267582 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4" event={"ID":"1f07cb15-0d7a-4412-a5e4-5489e61bb8ef","Type":"ContainerStarted","Data":"175f5d254e5f8ffb44fc36a7ff72b8b3aa675326b7fad936fb74da2cde3d75af"} Mar 17 00:24:50 crc 
kubenswrapper[4755]: I0317 00:24:50.272924 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:50 crc kubenswrapper[4755]: E0317 00:24:50.273358 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:50.773339434 +0000 UTC m=+165.532791717 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.274012 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:50 crc kubenswrapper[4755]: E0317 00:24:50.274324 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-17 00:24:50.774309938 +0000 UTC m=+165.533762221 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.298653 4755 patch_prober.go:28] interesting pod/console-operator-58897d9998-pgc8x container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.298709 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-pgc8x" podUID="b2ec378f-6a5f-48cd-ad14-1445d110a829" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.298741 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.305157 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5wk8" event={"ID":"bf256ee1-a9b2-4fa7-b25d-09bbf36d9aed","Type":"ContainerStarted","Data":"2603394888b8a291ab2fac258ecfba15d6d2db654ba23da3d4c84439eb7aea93"} Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.349049 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns/dns-default-w595z"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.360290 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.370324 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p4wgk"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.374515 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:50 crc kubenswrapper[4755]: E0317 00:24:50.375665 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:50.875650617 +0000 UTC m=+165.635102900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.408960 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-28bml"] Mar 17 00:24:50 crc kubenswrapper[4755]: W0317 00:24:50.447555 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod659184e0_0620_4930_8faa_22e586cf403a.slice/crio-2e39ba77599e6c6c070b5cd6d0e0e3ea1bdf8b82962b659381ed66d1f13be995 WatchSource:0}: Error finding container 2e39ba77599e6c6c070b5cd6d0e0e3ea1bdf8b82962b659381ed66d1f13be995: Status 404 returned error can't find the container with id 2e39ba77599e6c6c070b5cd6d0e0e3ea1bdf8b82962b659381ed66d1f13be995 Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.476496 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:50 crc kubenswrapper[4755]: E0317 00:24:50.476834 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:50.976821901 +0000 UTC m=+165.736274184 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.499616 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7v8ts"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.501013 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-7fxgg" podStartSLOduration=104.5009966 podStartE2EDuration="1m44.5009966s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:50.496598999 +0000 UTC m=+165.256051282" watchObservedRunningTime="2026-03-17 00:24:50.5009966 +0000 UTC m=+165.260448883" Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.525959 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wp4" podStartSLOduration=104.525939597 podStartE2EDuration="1m44.525939597s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:50.523604036 +0000 UTC m=+165.283056319" watchObservedRunningTime="2026-03-17 00:24:50.525939597 +0000 UTC m=+165.285391880" Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.571082 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cx96g" podStartSLOduration=104.571058636 podStartE2EDuration="1m44.571058636s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:50.564143638 +0000 UTC m=+165.323595921" watchObservedRunningTime="2026-03-17 00:24:50.571058636 +0000 UTC m=+165.330510939" Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.580046 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:50 crc kubenswrapper[4755]: E0317 00:24:50.580674 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:51.080654555 +0000 UTC m=+165.840106848 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.619162 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-lc64n" podStartSLOduration=105.619146566 podStartE2EDuration="1m45.619146566s" podCreationTimestamp="2026-03-17 00:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:50.616562987 +0000 UTC m=+165.376015270" watchObservedRunningTime="2026-03-17 00:24:50.619146566 +0000 UTC m=+165.378598849" Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.651725 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.666182 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.678756 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.684004 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: 
\"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:50 crc kubenswrapper[4755]: E0317 00:24:50.684293 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:51.184281573 +0000 UTC m=+165.943733856 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.685619 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7vn7g"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.718989 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-8gjxd" podStartSLOduration=104.718970734 podStartE2EDuration="1m44.718970734s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:50.671216923 +0000 UTC m=+165.430669206" watchObservedRunningTime="2026-03-17 00:24:50.718970734 +0000 UTC m=+165.478423017" Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.721691 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ktkt5"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.721750 4755 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.732654 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-d2qpt"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.745639 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.746088 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7p9vc" podStartSLOduration=105.746069904 podStartE2EDuration="1m45.746069904s" podCreationTimestamp="2026-03-17 00:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:50.714354105 +0000 UTC m=+165.473806388" watchObservedRunningTime="2026-03-17 00:24:50.746069904 +0000 UTC m=+165.505522187" Mar 17 00:24:50 crc kubenswrapper[4755]: W0317 00:24:50.752630 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bc15dc4_de67_4b17_8f95_9d8772166b35.slice/crio-12d961954a4b6e3058afea46ccaba4c59605f005f193e7050149b457162f1de3 WatchSource:0}: Error finding container 12d961954a4b6e3058afea46ccaba4c59605f005f193e7050149b457162f1de3: Status 404 returned error can't find the container with id 12d961954a4b6e3058afea46ccaba4c59605f005f193e7050149b457162f1de3 Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.753965 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4"] Mar 17 00:24:50 crc kubenswrapper[4755]: W0317 00:24:50.766906 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda39a0363_2d49_4e67_ac77_a381064d06a0.slice/crio-8259d330ad0ac3764072a303d664c20d640615609d09003d87410ae0de68cc40 WatchSource:0}: Error finding container 8259d330ad0ac3764072a303d664c20d640615609d09003d87410ae0de68cc40: Status 404 returned error can't find the container with id 8259d330ad0ac3764072a303d664c20d640615609d09003d87410ae0de68cc40 Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.781991 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dl89g"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.782045 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4b88z"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.786448 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:50 crc kubenswrapper[4755]: E0317 00:24:50.786977 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:51.286958307 +0000 UTC m=+166.046410590 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.787013 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tddtz"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.787426 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-grcxd"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.788524 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" podStartSLOduration=5.788511311 podStartE2EDuration="5.788511311s" podCreationTimestamp="2026-03-17 00:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:50.777259234 +0000 UTC m=+165.536711517" watchObservedRunningTime="2026-03-17 00:24:50.788511311 +0000 UTC m=+165.547963594" Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.796940 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-2fcmx"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.802371 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.812749 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj"] Mar 17 00:24:50 crc kubenswrapper[4755]: 
I0317 00:24:50.813373 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29561760-zz9vv"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.816627 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6nx7f"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.826925 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-pgc8x" podStartSLOduration=104.826906849 podStartE2EDuration="1m44.826906849s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:50.820751147 +0000 UTC m=+165.580203430" watchObservedRunningTime="2026-03-17 00:24:50.826906849 +0000 UTC m=+165.586359132" Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.851195 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-g4dtp" podStartSLOduration=5.851178612 podStartE2EDuration="5.851178612s" podCreationTimestamp="2026-03-17 00:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:50.84702973 +0000 UTC m=+165.606482013" watchObservedRunningTime="2026-03-17 00:24:50.851178612 +0000 UTC m=+165.610630895" Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.851227 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gc6g2"] Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.887817 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.887621 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.887600422 podStartE2EDuration="1.887600422s" podCreationTimestamp="2026-03-17 00:24:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:50.861106423 +0000 UTC m=+165.620558706" watchObservedRunningTime="2026-03-17 00:24:50.887600422 +0000 UTC m=+165.647052705" Mar 17 00:24:50 crc kubenswrapper[4755]: E0317 00:24:50.888179 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:51.388167102 +0000 UTC m=+166.147619385 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.953170 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w5wk8" podStartSLOduration=104.953122432 podStartE2EDuration="1m44.953122432s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:50.888264825 +0000 UTC m=+165.647717108" watchObservedRunningTime="2026-03-17 00:24:50.953122432 +0000 UTC m=+165.712574715" Mar 17 00:24:50 crc kubenswrapper[4755]: W0317 00:24:50.954323 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92c4ceac_e07e_407f_a2d7_5202cc06c29d.slice/crio-e610cfcb3730f35864289a871a4f0faab6367ece948cc7ec845182fd6fa7bfdf WatchSource:0}: Error finding container e610cfcb3730f35864289a871a4f0faab6367ece948cc7ec845182fd6fa7bfdf: Status 404 returned error can't find the container with id e610cfcb3730f35864289a871a4f0faab6367ece948cc7ec845182fd6fa7bfdf Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.990467 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 17 00:24:50 crc kubenswrapper[4755]: E0317 00:24:50.990704 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:51.490675952 +0000 UTC m=+166.250128235 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:50 crc kubenswrapper[4755]: I0317 00:24:50.991093 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:50 crc kubenswrapper[4755]: E0317 00:24:50.991663 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:51.491638704 +0000 UTC m=+166.251090987 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.092758 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:51 crc kubenswrapper[4755]: E0317 00:24:51.093092 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:51.593076527 +0000 UTC m=+166.352528810 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.113348 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-7fxgg" Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.120882 4755 patch_prober.go:28] interesting pod/router-default-5444994796-7fxgg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 00:24:51 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 17 00:24:51 crc kubenswrapper[4755]: [+]process-running ok Mar 17 00:24:51 crc kubenswrapper[4755]: healthz check failed Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.120934 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7fxgg" podUID="ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.193935 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:51 crc kubenswrapper[4755]: E0317 00:24:51.194480 4755 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:51.694468807 +0000 UTC m=+166.453921090 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.247297 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-574nd"] Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.295420 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:51 crc kubenswrapper[4755]: E0317 00:24:51.295580 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:51.795555467 +0000 UTC m=+166.555007750 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.295623 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:51 crc kubenswrapper[4755]: E0317 00:24:51.296144 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:51.796138008 +0000 UTC m=+166.555590281 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.321358 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w595z" event={"ID":"13d9a24a-129b-40b0-9fa1-1bd7d595f109","Type":"ContainerStarted","Data":"928f146f1051fac366e54f037b28a069e4b74cdfaa9d562bcb295fd176e0a7a0"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.321413 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w595z" event={"ID":"13d9a24a-129b-40b0-9fa1-1bd7d595f109","Type":"ContainerStarted","Data":"799355c2632f84a333b884626482a49fc3f0e0251cfda6373d62487af7f715c6"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.323094 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v" event={"ID":"4474997a-d442-4f8a-b50e-c6ecab9393f5","Type":"ContainerStarted","Data":"072759f39659bbf6548d4b0a133329e7eecc68725e8d5d9c440c118bc24f964a"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.323146 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v" event={"ID":"4474997a-d442-4f8a-b50e-c6ecab9393f5","Type":"ContainerStarted","Data":"3bd8b70d900e8287764e1bfe4f45ff514d8f22eefa6659b74c158d6e56ab3c27"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.323359 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v" Mar 17 00:24:51 crc 
kubenswrapper[4755]: I0317 00:24:51.327211 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-flgvl" event={"ID":"0771adb9-eee6-469f-b734-7d28e8a50d41","Type":"ContainerStarted","Data":"3c8c8b7c444356fd86d50cdb62eacd450bd1079ddc174b2c30711e3a03e438c7"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.327248 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-flgvl" event={"ID":"0771adb9-eee6-469f-b734-7d28e8a50d41","Type":"ContainerStarted","Data":"3fa9ee338dbcd58cc19de9095af309cea9092cabff860dbcafafc9b28a5c9d44"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.328292 4755 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-jcw6v container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.328337 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v" podUID="4474997a-d442-4f8a-b50e-c6ecab9393f5" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.331637 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" event={"ID":"69234b24-fdb7-40bf-828b-104d8d43891c","Type":"ContainerStarted","Data":"1ac0be601e8a2620e0d88cd450034f45caa1a2f02e791d4af6e2adcc459c49a3"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.342742 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvsbw" 
event={"ID":"689a5612-54fa-44b9-8b91-1b03d916a159","Type":"ContainerStarted","Data":"3f17ad7622457692b17a82844858c737a700840815caba89222b5b810cf049e6"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.342796 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvsbw" event={"ID":"689a5612-54fa-44b9-8b91-1b03d916a159","Type":"ContainerStarted","Data":"fb5601b8acca5c63623589b839d6380505673920f8184e845753dfddbdd09502"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.349608 4755 ???:1] "http: TLS handshake error from 192.168.126.11:58920: no serving certificate available for the kubelet" Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.350245 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7" event={"ID":"d48d9cab-c4a2-4727-ad9e-61ae13fe0dff","Type":"ContainerStarted","Data":"586ca4a6bf0ec4c3c0ddd4cd923f480395d1187932185da365ec53045c20af66"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.351539 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4b88z" event={"ID":"18736749-34d6-4ce5-a0ff-e8af0ca22cdc","Type":"ContainerStarted","Data":"cc66ce62eacc891c85920507c32ea25580222bdf6bd95518cbc4a9055dcbd0bb"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.352786 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6" event={"ID":"d755c0ff-4b40-451e-9fb4-4807c6297aaa","Type":"ContainerStarted","Data":"47b379dde076d65979f0bb4759bfe253f3fee7f5318bc761fcc9665a4ba68e2b"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.352810 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6" 
event={"ID":"d755c0ff-4b40-451e-9fb4-4807c6297aaa","Type":"ContainerStarted","Data":"cdda9aaf34b82966bc9ecdc219a175df350206b5d830b09fee85ffe292ad9364"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.359502 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4" event={"ID":"886294ef-7db0-41dd-854e-76ef9db63e9f","Type":"ContainerStarted","Data":"bcef52843a22de4572223f705482b8c8a18a76fe61892fec848f5dc495b30a89"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.362978 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" event={"ID":"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f","Type":"ContainerStarted","Data":"29be7c04ff6ca9d152e30517d5eb6597b711a8c9fba8ba6920a7a44dbc8bc793"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.363023 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" event={"ID":"3d9c83e6-1f6e-44f2-9724-d8f6b5b7351f","Type":"ContainerStarted","Data":"471e7fdb674cfdfcc6b33c0532da3521a1ecd1f94b7ece64d348cf52bca21136"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.365137 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx" event={"ID":"998d29bf-29f2-40ff-abd0-6730667f11f6","Type":"ContainerStarted","Data":"d83c5f5d04db96c9f6713328a474934cda5a1890bb04a57599a277d659edfecc"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.376359 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7v8ts" event={"ID":"6173ecdb-39f4-4772-9dbd-5fa6e2908971","Type":"ContainerStarted","Data":"3198fd14560f63a61ec6004c5fcf9b309e2ac40fbce743e073ed950007701f4a"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.376408 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7v8ts" event={"ID":"6173ecdb-39f4-4772-9dbd-5fa6e2908971","Type":"ContainerStarted","Data":"20c4125fce160bd8cf0fccd948231f31eaeb7948d29629f80ff06581d3ae04f0"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.384986 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.385041 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.386867 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gc6g2" event={"ID":"f0a4ebae-87f9-4329-bcd7-62cc2e3898fd","Type":"ContainerStarted","Data":"77cfca7610a1834ddae41c49b1185acff09dfdfe5d7dbc75bc906609aab75e03"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.391497 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" event={"ID":"50008856-a0a3-4ec3-a48f-5f90891d777e","Type":"ContainerStarted","Data":"a6b0e51e3183daba6fa35cede43212e55802eb61c7dfbd0e017fb3c350a71733"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.392536 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k" Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.392899 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6nx7f" event={"ID":"4cf64e3f-992b-4f75-9762-839e4a23633a","Type":"ContainerStarted","Data":"c838bfd3e4451075a3c9691e55ad4012bd760a12eeb0411f0d86f91723c47cb6"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.393891 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns-operator/dns-operator-744455d44c-p4wgk" event={"ID":"659184e0-0620-4930-8faa-22e586cf403a","Type":"ContainerStarted","Data":"a3a590fe11f623c48809043878ba52ee8d5105170a550c7d6b085e9acb480e92"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.393941 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-p4wgk" event={"ID":"659184e0-0620-4930-8faa-22e586cf403a","Type":"ContainerStarted","Data":"2e39ba77599e6c6c070b5cd6d0e0e3ea1bdf8b82962b659381ed66d1f13be995"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.396134 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ktkt5" event={"ID":"64301027-d188-47a4-a4a3-72b00d2d100a","Type":"ContainerStarted","Data":"2306f88fb6462704cda7d6ae2ee43bd172ef0f9003a83f3fa451d995b0e842b0"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.397264 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq" event={"ID":"f2c60e7f-98a3-459f-bf93-a01feb772b92","Type":"ContainerStarted","Data":"0146115bec36ea23d33fa49211976491e6c5071dfa09c76d7d95be7267901a8a"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.398693 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" event={"ID":"1c0bace3-bc39-44fa-9f39-d8f18c07675d","Type":"ContainerStarted","Data":"bd6563fa6f4517d657831496014280014065710b67daf27d43b7fc97f4b679e1"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.400015 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" event={"ID":"7d67a491-1c7f-4898-bc78-a2a7d75278dc","Type":"ContainerStarted","Data":"619fb26e1dc3d5a0deb3ec28a94ebb9285312a914ad71e648572c1abb5eed11d"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.401344 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.401958 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" event={"ID":"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c","Type":"ContainerStarted","Data":"bdca036085df50bb592873f4f9e4b6b2c6f7539089889ae64ddb1b8f241345b3"} Mar 17 00:24:51 crc kubenswrapper[4755]: E0317 00:24:51.402486 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:51.902467588 +0000 UTC m=+166.661919871 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.404867 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29561760-zz9vv" event={"ID":"92c4ceac-e07e-407f-a2d7-5202cc06c29d","Type":"ContainerStarted","Data":"e610cfcb3730f35864289a871a4f0faab6367ece948cc7ec845182fd6fa7bfdf"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.407540 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2fcmx" event={"ID":"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e","Type":"ContainerStarted","Data":"72bb8e3c66943a62162884bee5ed444e1ec71deaa3b89e937765d426ed257e4f"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.413469 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" event={"ID":"2bc15dc4-de67-4b17-8f95-9d8772166b35","Type":"ContainerStarted","Data":"12d961954a4b6e3058afea46ccaba4c59605f005f193e7050149b457162f1de3"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.415337 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" event={"ID":"1347078a-8c1c-4e33-aee1-8aaad00829f2","Type":"ContainerStarted","Data":"a1037f8309383eb2b0e35285e5ecd0a552cbf8bb114edc768f3b778de6bb5271"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.417246 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dl89g" 
event={"ID":"08e826b2-3275-49e6-b833-5494037aac5b","Type":"ContainerStarted","Data":"c4c14fd6a5e5f119893ad8700dccaf510809fa3d050ca4fdbabd07537b6a0ba5"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.418770 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" event={"ID":"a39a0363-2d49-4e67-ac77-a381064d06a0","Type":"ContainerStarted","Data":"8259d330ad0ac3764072a303d664c20d640615609d09003d87410ae0de68cc40"} Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.420292 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-8gjxd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.420323 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8gjxd" podUID="64e28e17-1dd0-401e-9b26-7eefc1c54f5f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.449632 4755 ???:1] "http: TLS handshake error from 192.168.126.11:58932: no serving certificate available for the kubelet" Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.506489 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:51 crc kubenswrapper[4755]: E0317 00:24:51.510826 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:52.010810568 +0000 UTC m=+166.770262851 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.512785 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.513192 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.545855 4755 ???:1] "http: TLS handshake error from 192.168.126.11:53624: no serving certificate available for the kubelet" Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.607850 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:51 crc kubenswrapper[4755]: E0317 00:24:51.608391 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-17 00:24:52.108349716 +0000 UTC m=+166.867801999 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.656669 4755 ???:1] "http: TLS handshake error from 192.168.126.11:53630: no serving certificate available for the kubelet" Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.710845 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:51 crc kubenswrapper[4755]: E0317 00:24:51.719137 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:52.219121879 +0000 UTC m=+166.978574162 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.749017 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-pgc8x" Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.760146 4755 ???:1] "http: TLS handshake error from 192.168.126.11:53646: no serving certificate available for the kubelet" Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.822200 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:51 crc kubenswrapper[4755]: E0317 00:24:51.822632 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:52.322611552 +0000 UTC m=+167.082063845 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.882510 4755 ???:1] "http: TLS handshake error from 192.168.126.11:53662: no serving certificate available for the kubelet" Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.923413 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:51 crc kubenswrapper[4755]: E0317 00:24:51.924094 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:52.424082665 +0000 UTC m=+167.183534948 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:51 crc kubenswrapper[4755]: I0317 00:24:51.991115 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v" podStartSLOduration=105.991098306 podStartE2EDuration="1m45.991098306s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:51.988151135 +0000 UTC m=+166.747603418" watchObservedRunningTime="2026-03-17 00:24:51.991098306 +0000 UTC m=+166.750550589" Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.025899 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:52 crc kubenswrapper[4755]: E0317 00:24:52.026233 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:52.526218022 +0000 UTC m=+167.285670305 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.072540 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qd8g6" podStartSLOduration=106.072523011 podStartE2EDuration="1m46.072523011s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.039996315 +0000 UTC m=+166.799448598" watchObservedRunningTime="2026-03-17 00:24:52.072523011 +0000 UTC m=+166.831975294" Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.072666 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-flgvl" podStartSLOduration=106.072661386 podStartE2EDuration="1m46.072661386s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.071351681 +0000 UTC m=+166.830803974" watchObservedRunningTime="2026-03-17 00:24:52.072661386 +0000 UTC m=+166.832113669" Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.081869 4755 ???:1] "http: TLS handshake error from 192.168.126.11:53664: no serving certificate available for the kubelet" Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.114255 4755 patch_prober.go:28] interesting pod/router-default-5444994796-7fxgg container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 17 00:24:52 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld
Mar 17 00:24:52 crc kubenswrapper[4755]: [+]process-running ok
Mar 17 00:24:52 crc kubenswrapper[4755]: healthz check failed
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.114304 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7fxgg" podUID="ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.127509 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb"
Mar 17 00:24:52 crc kubenswrapper[4755]: E0317 00:24:52.127865 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:52.627853551 +0000 UTC m=+167.387305834 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.230722 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 17 00:24:52 crc kubenswrapper[4755]: E0317 00:24:52.231310 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:52.731293673 +0000 UTC m=+167.490745956 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.247613 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-28bml" podStartSLOduration=106.247596192 podStartE2EDuration="1m46.247596192s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.199020335 +0000 UTC m=+166.958472618" watchObservedRunningTime="2026-03-17 00:24:52.247596192 +0000 UTC m=+167.007048475"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.263303 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jvsbw" podStartSLOduration=106.263275211 podStartE2EDuration="1m46.263275211s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.246556657 +0000 UTC m=+167.006008940" watchObservedRunningTime="2026-03-17 00:24:52.263275211 +0000 UTC m=+167.022727494"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.286986 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7v8ts" podStartSLOduration=106.286963023 podStartE2EDuration="1m46.286963023s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.284879132 +0000 UTC m=+167.044331415" watchObservedRunningTime="2026-03-17 00:24:52.286963023 +0000 UTC m=+167.046415306"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.333046 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb"
Mar 17 00:24:52 crc kubenswrapper[4755]: E0317 00:24:52.333491 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:52.833472311 +0000 UTC m=+167.592924614 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.339722 4755 patch_prober.go:28] interesting pod/apiserver-76f77b778f-lc64n container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 17 00:24:52 crc kubenswrapper[4755]: [+]log ok
Mar 17 00:24:52 crc kubenswrapper[4755]: [+]etcd ok
Mar 17 00:24:52 crc kubenswrapper[4755]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 17 00:24:52 crc kubenswrapper[4755]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 17 00:24:52 crc kubenswrapper[4755]: [+]poststarthook/max-in-flight-filter ok
Mar 17 00:24:52 crc kubenswrapper[4755]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 17 00:24:52 crc kubenswrapper[4755]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 17 00:24:52 crc kubenswrapper[4755]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 17 00:24:52 crc kubenswrapper[4755]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Mar 17 00:24:52 crc kubenswrapper[4755]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 17 00:24:52 crc kubenswrapper[4755]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 17 00:24:52 crc kubenswrapper[4755]: [+]poststarthook/openshift.io-startinformers ok
Mar 17 00:24:52 crc kubenswrapper[4755]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 17 00:24:52 crc kubenswrapper[4755]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 17 00:24:52 crc kubenswrapper[4755]: livez check failed
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.339796 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-lc64n" podUID="4faedbc4-02bf-41be-9eaa-2974c1a6b8d3" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.437848 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 17 00:24:52 crc kubenswrapper[4755]: E0317 00:24:52.438220 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:52.938204845 +0000 UTC m=+167.697657128 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.440148 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" event={"ID":"a39a0363-2d49-4e67-ac77-a381064d06a0","Type":"ContainerStarted","Data":"e1a773784f2598eb3b66294f228312718ca4d693cefd3e2af67549acaead0f7f"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.440181 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" event={"ID":"a39a0363-2d49-4e67-ac77-a381064d06a0","Type":"ContainerStarted","Data":"f15fddf86488a24b33de237884a6d8093ffc18a25e66a5a10e9b469f2df10beb"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.442409 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29561760-zz9vv" event={"ID":"92c4ceac-e07e-407f-a2d7-5202cc06c29d","Type":"ContainerStarted","Data":"fd0a79ec53683bc6a3105fdcc8c96ab6436047e3c0eb29b421cc78256d3cb84d"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.449784 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2fcmx" event={"ID":"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e","Type":"ContainerStarted","Data":"9d48904bfcf964ae626188153365eabb6fd193f0bbbf6c8c9edfcdc19263ec6c"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.486947 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zm5fc" podStartSLOduration=106.486928388 podStartE2EDuration="1m46.486928388s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.463362229 +0000 UTC m=+167.222814542" watchObservedRunningTime="2026-03-17 00:24:52.486928388 +0000 UTC m=+167.246380661"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.488226 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29561760-zz9vv" podStartSLOduration=107.488220253 podStartE2EDuration="1m47.488220253s" podCreationTimestamp="2026-03-17 00:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.484696442 +0000 UTC m=+167.244148715" watchObservedRunningTime="2026-03-17 00:24:52.488220253 +0000 UTC m=+167.247672526"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.488883 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" event={"ID":"1c0bace3-bc39-44fa-9f39-d8f18c07675d","Type":"ContainerStarted","Data":"e3d59661d245cb4346d41ff40c5eb1a805d8766cca6c3bb24513818ce775bea2"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.513218 4755 ???:1] "http: TLS handshake error from 192.168.126.11:53670: no serving certificate available for the kubelet"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.524240 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" event={"ID":"7d67a491-1c7f-4898-bc78-a2a7d75278dc","Type":"ContainerStarted","Data":"a772e7ca39b55b1bbc52c118365ef011037a6b207610b693e2dde677b4428e2d"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.525589 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tddtz"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.527577 4755 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tddtz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body=
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.527649 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" podUID="7d67a491-1c7f-4898-bc78-a2a7d75278dc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.538265 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-2fcmx" podStartSLOduration=106.53824707 podStartE2EDuration="1m46.53824707s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.537857297 +0000 UTC m=+167.297309580" watchObservedRunningTime="2026-03-17 00:24:52.53824707 +0000 UTC m=+167.297699353"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.541149 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb"
Mar 17 00:24:52 crc kubenswrapper[4755]: E0317 00:24:52.545912 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:53.045892112 +0000 UTC m=+167.805344395 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.570605 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" event={"ID":"69234b24-fdb7-40bf-828b-104d8d43891c","Type":"ContainerStarted","Data":"0d70fc96e7449172570eed06355366c4661f534d2460723884a543f78dab30bd"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.598687 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq" event={"ID":"f2c60e7f-98a3-459f-bf93-a01feb772b92","Type":"ContainerStarted","Data":"ec1a3214d001e3a2102392f77285b38f7971a6dfd9fb6e013889d35adcfc2471"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.599686 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.611951 4755 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-766pq container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body=
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.612015 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq" podUID="f2c60e7f-98a3-459f-bf93-a01feb772b92" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.624700 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7" event={"ID":"d48d9cab-c4a2-4727-ad9e-61ae13fe0dff","Type":"ContainerStarted","Data":"9acfa7475a4e5c136ac36d79cb6668080a800539813d873489c6bcde6be385a6"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.624759 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7" event={"ID":"d48d9cab-c4a2-4727-ad9e-61ae13fe0dff","Type":"ContainerStarted","Data":"e764d5a1e4567c9370aecafc4b3547ecfcbb3dfd51b13644cf295f77a3a1d5b0"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.625500 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.638537 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" podStartSLOduration=106.638520903 podStartE2EDuration="1m46.638520903s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.588783845 +0000 UTC m=+167.348236128" watchObservedRunningTime="2026-03-17 00:24:52.638520903 +0000 UTC m=+167.397973186"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.644737 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4b88z" event={"ID":"18736749-34d6-4ce5-a0ff-e8af0ca22cdc","Type":"ContainerStarted","Data":"56d7917af71761cfa22d2729672069694bd1b03f9137406d7fd9e357290d6c55"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.655007 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 17 00:24:52 crc kubenswrapper[4755]: E0317 00:24:52.656873 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:53.156854042 +0000 UTC m=+167.916306325 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.661681 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gc6g2" event={"ID":"f0a4ebae-87f9-4329-bcd7-62cc2e3898fd","Type":"ContainerStarted","Data":"ecab3974603ad44c9f00371c57f54248a4f711f7cf74adb98725e34a34ea2d7d"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.698768 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" event={"ID":"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c","Type":"ContainerStarted","Data":"70e4c1def5018c3b2f625393e2a282b6b3c37f06d9f5aa26002edce2d6f2d7e1"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.716688 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ktkt5" event={"ID":"64301027-d188-47a4-a4a3-72b00d2d100a","Type":"ContainerStarted","Data":"95144a5b2eb1893050f74b91d81bba3838da9f41098c61ae9cae20c3308d3398"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.739372 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-7vn7g" podStartSLOduration=107.739358205 podStartE2EDuration="1m47.739358205s" podCreationTimestamp="2026-03-17 00:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.68157278 +0000 UTC m=+167.441025063" watchObservedRunningTime="2026-03-17 00:24:52.739358205 +0000 UTC m=+167.498810488"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.740015 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7" podStartSLOduration=106.740009417 podStartE2EDuration="1m46.740009417s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.737995038 +0000 UTC m=+167.497447321" watchObservedRunningTime="2026-03-17 00:24:52.740009417 +0000 UTC m=+167.499461700"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.747698 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx" event={"ID":"998d29bf-29f2-40ff-abd0-6730667f11f6","Type":"ContainerStarted","Data":"432aaa7ca8db744d353cd7fbbd314bed5b15d2cfc8b3dbe547be85a41881ce97"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.759341 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb"
Mar 17 00:24:52 crc kubenswrapper[4755]: E0317 00:24:52.760768 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:53.260754489 +0000 UTC m=+168.020206762 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.786947 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dl89g" event={"ID":"08e826b2-3275-49e6-b833-5494037aac5b","Type":"ContainerStarted","Data":"d1b468cd8c8bfe971365d4677e5b00233a0da4a0c090cbf492d68591f7d70c51"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.788769 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq" podStartSLOduration=106.788758911 podStartE2EDuration="1m46.788758911s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.779836594 +0000 UTC m=+167.539288877" watchObservedRunningTime="2026-03-17 00:24:52.788758911 +0000 UTC m=+167.548211194"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.810806 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" event={"ID":"2bc15dc4-de67-4b17-8f95-9d8772166b35","Type":"ContainerStarted","Data":"cb4d591ae3e93c76b6c92ef7ad290e242f9d814790ee5c1b113f18332d68beef"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.810848 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" event={"ID":"2bc15dc4-de67-4b17-8f95-9d8772166b35","Type":"ContainerStarted","Data":"60eabc3fab9bd560941aec8aba270189d946e14fe08c5ab7f83f36b48fbc9e4a"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.813005 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4" event={"ID":"886294ef-7db0-41dd-854e-76ef9db63e9f","Type":"ContainerStarted","Data":"84c90ce53cc730804b847e76af41d9821bdc4bb041a48ebd3a58398c74221293"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.813728 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.816044 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" podUID="183616f4-5ea5-4c31-a465-edb5b837ca8f" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2" gracePeriod=30
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.816649 4755 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-qghn4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" start-of-body=
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.816692 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4" podUID="886294ef-7db0-41dd-854e-76ef9db63e9f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.816808 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" event={"ID":"50008856-a0a3-4ec3-a48f-5f90891d777e","Type":"ContainerStarted","Data":"2a361bb938a508a241f0c50e42a3467ccee738200a2291e0723a61ed96385e69"}
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.816881 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-grcxd"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.818228 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-8gjxd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.818253 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8gjxd" podUID="64e28e17-1dd0-401e-9b26-7eefc1c54f5f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.822353 4755 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-grcxd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.27:6443/healthz\": dial tcp 10.217.0.27:6443: connect: connection refused" start-of-body=
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.822406 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" podUID="50008856-a0a3-4ec3-a48f-5f90891d777e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.27:6443/healthz\": dial tcp 10.217.0.27:6443: connect: connection refused"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.822905 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fnm9k"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.833394 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx" podStartSLOduration=106.833374742 podStartE2EDuration="1m46.833374742s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.832931927 +0000 UTC m=+167.592384210" watchObservedRunningTime="2026-03-17 00:24:52.833374742 +0000 UTC m=+167.592827025"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.844320 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jcw6v"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.862365 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.863235 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ktkt5" podStartSLOduration=106.863220037 podStartE2EDuration="1m46.863220037s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.861776367 +0000 UTC m=+167.621228640" watchObservedRunningTime="2026-03-17 00:24:52.863220037 +0000 UTC m=+167.622672320"
Mar 17 00:24:52 crc kubenswrapper[4755]: E0317 00:24:52.864413 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:53.364382957 +0000 UTC m=+168.123835240 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.889302 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6nx7f" podStartSLOduration=106.889284572 podStartE2EDuration="1m46.889284572s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.885413429 +0000 UTC m=+167.644865702" watchObservedRunningTime="2026-03-17 00:24:52.889284572 +0000 UTC m=+167.648736855"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.927574 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-gc6g2" podStartSLOduration=106.927553125 podStartE2EDuration="1m46.927553125s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.910453879 +0000 UTC m=+167.669906162" watchObservedRunningTime="2026-03-17 00:24:52.927553125 +0000 UTC m=+167.687005418"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.931177 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" podStartSLOduration=106.93116573 podStartE2EDuration="1m46.93116573s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.924952616 +0000 UTC m=+167.684404909" watchObservedRunningTime="2026-03-17 00:24:52.93116573 +0000 UTC m=+167.690618013"
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.971172 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb"
Mar 17 00:24:52 crc kubenswrapper[4755]: E0317 00:24:52.971671 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:53.471661449 +0000 UTC m=+168.231113722 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 00:24:52 crc kubenswrapper[4755]: I0317 00:24:52.983476 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vt4qh" podStartSLOduration=106.983460325 podStartE2EDuration="1m46.983460325s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:52.980943988 +0000 UTC m=+167.740396271" watchObservedRunningTime="2026-03-17 00:24:52.983460325 +0000 UTC m=+167.742912608"
Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.018695 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4" podStartSLOduration=107.018672934 podStartE2EDuration="1m47.018672934s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:53.015581748 +0000 UTC m=+167.775034031" watchObservedRunningTime="2026-03-17 00:24:53.018672934 +0000 UTC m=+167.778125207"
Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.071845 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 17 00:24:53 crc kubenswrapper[4755]: E0317 00:24:53.072282 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:53.572262403 +0000 UTC m=+168.331714676 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.085419 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" podStartSLOduration=108.085400085 podStartE2EDuration="1m48.085400085s" podCreationTimestamp="2026-03-17 00:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:53.055459476 +0000 UTC m=+167.814911759" watchObservedRunningTime="2026-03-17 00:24:53.085400085 +0000 UTC m=+167.844852368"
Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.121066 4755 patch_prober.go:28] interesting pod/router-default-5444994796-7fxgg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 17 00:24:53 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld
Mar 17 00:24:53 crc kubenswrapper[4755]: [+]process-running ok
Mar 17 00:24:53
crc kubenswrapper[4755]: healthz check failed Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.121454 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7fxgg" podUID="ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.173923 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:53 crc kubenswrapper[4755]: E0317 00:24:53.179141 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:53.679126852 +0000 UTC m=+168.438579135 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.202198 4755 ???:1] "http: TLS handshake error from 192.168.126.11:53680: no serving certificate available for the kubelet" Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.280269 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:53 crc kubenswrapper[4755]: E0317 00:24:53.280377 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:53.780359348 +0000 UTC m=+168.539811631 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.280714 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:53 crc kubenswrapper[4755]: E0317 00:24:53.281013 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:53.781000579 +0000 UTC m=+168.540452862 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.365001 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dl89g" podStartSLOduration=107.364984542 podStartE2EDuration="1m47.364984542s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:53.146977669 +0000 UTC m=+167.906429952" watchObservedRunningTime="2026-03-17 00:24:53.364984542 +0000 UTC m=+168.124436825" Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.366804 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6bvf8"] Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.366985 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" podUID="ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b" containerName="controller-manager" containerID="cri-o://f7b6e196d6bc13445df992bb6d9d43ed640c6196844dc8ceebd8b7f4f7491538" gracePeriod=30 Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.382052 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:53 crc kubenswrapper[4755]: E0317 00:24:53.382275 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:53.882241445 +0000 UTC m=+168.641693728 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.407170 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj"] Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.407403 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" podUID="b78bcc96-a38d-4734-8b94-25cddc46b289" containerName="route-controller-manager" containerID="cri-o://1ca385a8f40d698a02a08560401c13b537c243c8fdb2ec16501fa1e1aaa88ba2" gracePeriod=30 Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.483880 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:53 crc kubenswrapper[4755]: 
E0317 00:24:53.484166 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:53.984155943 +0000 UTC m=+168.743608226 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.584813 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:53 crc kubenswrapper[4755]: E0317 00:24:53.585001 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:54.084955164 +0000 UTC m=+168.844407437 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.585394 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:53 crc kubenswrapper[4755]: E0317 00:24:53.585701 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:54.08569305 +0000 UTC m=+168.845145333 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.687448 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:53 crc kubenswrapper[4755]: E0317 00:24:53.687627 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:54.187599618 +0000 UTC m=+168.947051901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.687675 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:53 crc kubenswrapper[4755]: E0317 00:24:53.688049 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:54.188037543 +0000 UTC m=+168.947489826 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.788420 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:53 crc kubenswrapper[4755]: E0317 00:24:53.788785 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:54.288770401 +0000 UTC m=+169.048222684 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.890233 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:53 crc kubenswrapper[4755]: E0317 00:24:53.890630 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:54.390605698 +0000 UTC m=+169.150057981 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.897692 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4b88z" event={"ID":"18736749-34d6-4ce5-a0ff-e8af0ca22cdc","Type":"ContainerStarted","Data":"eb33a8c19fa96eb6dc749c643ec48c6c20c56e68dd3af927dbc9ea9b321d324e"} Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.903676 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.907358 4755 generic.go:334] "Generic (PLEG): container finished" podID="b78bcc96-a38d-4734-8b94-25cddc46b289" containerID="1ca385a8f40d698a02a08560401c13b537c243c8fdb2ec16501fa1e1aaa88ba2" exitCode=0 Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.907478 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" event={"ID":"b78bcc96-a38d-4734-8b94-25cddc46b289","Type":"ContainerDied","Data":"1ca385a8f40d698a02a08560401c13b537c243c8fdb2ec16501fa1e1aaa88ba2"} Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.910818 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6nx7f" event={"ID":"4cf64e3f-992b-4f75-9762-839e4a23633a","Type":"ContainerStarted","Data":"c8d626ca9bdea8f09b04d53ea4bc01f0790d590759afc07897cbc05dac758c6e"} 
Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.916628 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tpxbx" event={"ID":"998d29bf-29f2-40ff-abd0-6730667f11f6","Type":"ContainerStarted","Data":"daeb8755a96a6a2127074d156e495a7596aebf47685061e90def4598ddfdc0e0"} Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.919805 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4b88z" podStartSLOduration=108.91978472 podStartE2EDuration="1m48.91978472s" podCreationTimestamp="2026-03-17 00:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:53.916212937 +0000 UTC m=+168.675665220" watchObservedRunningTime="2026-03-17 00:24:53.91978472 +0000 UTC m=+168.679237003" Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.922557 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" event={"ID":"1347078a-8c1c-4e33-aee1-8aaad00829f2","Type":"ContainerStarted","Data":"1ce24d63b68daae325f487debced867954d847c7d10e4a348abc60674585a08d"} Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.942820 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ktkt5" event={"ID":"64301027-d188-47a4-a4a3-72b00d2d100a","Type":"ContainerStarted","Data":"11481b5ab2136a9e92dddad10f7ccb68ec85b184407d34cdee34d3bd7690c8ac"} Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.963121 4755 generic.go:334] "Generic (PLEG): container finished" podID="ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b" containerID="f7b6e196d6bc13445df992bb6d9d43ed640c6196844dc8ceebd8b7f4f7491538" exitCode=0 Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.963226 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" event={"ID":"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b","Type":"ContainerDied","Data":"f7b6e196d6bc13445df992bb6d9d43ed640c6196844dc8ceebd8b7f4f7491538"} Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.963285 4755 scope.go:117] "RemoveContainer" containerID="f7b6e196d6bc13445df992bb6d9d43ed640c6196844dc8ceebd8b7f4f7491538" Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.963479 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6bvf8" Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.966853 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.967238 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w595z" event={"ID":"13d9a24a-129b-40b0-9fa1-1bd7d595f109","Type":"ContainerStarted","Data":"47e91271dfb529b51ce971e374ad0ffb35020d6ccb06a316cda9128537c98517"} Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.967856 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-w595z" Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.971488 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-p4wgk" event={"ID":"659184e0-0620-4930-8faa-22e586cf403a","Type":"ContainerStarted","Data":"7830159f2d309211c766735722ad119c2bbfd7ba62467a52811ef4642bb0d736"} Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.987669 4755 generic.go:334] "Generic (PLEG): container finished" podID="1c0bace3-bc39-44fa-9f39-d8f18c07675d" containerID="e3d59661d245cb4346d41ff40c5eb1a805d8766cca6c3bb24513818ce775bea2" exitCode=0 Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.987814 4755 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" event={"ID":"1c0bace3-bc39-44fa-9f39-d8f18c07675d","Type":"ContainerDied","Data":"e3d59661d245cb4346d41ff40c5eb1a805d8766cca6c3bb24513818ce775bea2"} Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.993304 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-proxy-ca-bundles\") pod \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.993506 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.993599 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-config\") pod \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.993652 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-serving-cert\") pod \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.993686 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-client-ca\") pod \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\" (UID: 
\"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.993714 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrrzk\" (UniqueName: \"kubernetes.io/projected/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-kube-api-access-wrrzk\") pod \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\" (UID: \"ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b\") " Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.995947 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-client-ca" (OuterVolumeSpecName: "client-ca") pod "ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b" (UID: "ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.996421 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b" (UID: "ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.996425 4755 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tddtz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.996488 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" podUID="7d67a491-1c7f-4898-bc78-a2a7d75278dc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 17 00:24:53 crc kubenswrapper[4755]: I0317 00:24:53.996791 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-config" (OuterVolumeSpecName: "config") pod "ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b" (UID: "ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:24:54 crc kubenswrapper[4755]: E0317 00:24:54.000324 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:54.500305103 +0000 UTC m=+169.259757466 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.008840 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-766pq" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.009310 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b" (UID: "ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.011077 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-kube-api-access-wrrzk" (OuterVolumeSpecName: "kube-api-access-wrrzk") pod "ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b" (UID: "ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b"). InnerVolumeSpecName "kube-api-access-wrrzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.047565 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-w595z" podStartSLOduration=9.047549036 podStartE2EDuration="9.047549036s" podCreationTimestamp="2026-03-17 00:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:54.023739358 +0000 UTC m=+168.783191661" watchObservedRunningTime="2026-03-17 00:24:54.047549036 +0000 UTC m=+168.807001319" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.089561 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-p4wgk" podStartSLOduration=108.089546208 podStartE2EDuration="1m48.089546208s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:54.086806483 +0000 UTC m=+168.846258766" watchObservedRunningTime="2026-03-17 00:24:54.089546208 +0000 UTC m=+168.848998491" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.095976 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmmg6\" (UniqueName: \"kubernetes.io/projected/b78bcc96-a38d-4734-8b94-25cddc46b289-kube-api-access-pmmg6\") pod \"b78bcc96-a38d-4734-8b94-25cddc46b289\" (UID: \"b78bcc96-a38d-4734-8b94-25cddc46b289\") " Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.096022 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b78bcc96-a38d-4734-8b94-25cddc46b289-serving-cert\") pod \"b78bcc96-a38d-4734-8b94-25cddc46b289\" (UID: \"b78bcc96-a38d-4734-8b94-25cddc46b289\") " Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.096150 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b78bcc96-a38d-4734-8b94-25cddc46b289-config\") pod \"b78bcc96-a38d-4734-8b94-25cddc46b289\" (UID: \"b78bcc96-a38d-4734-8b94-25cddc46b289\") " Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.096209 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b78bcc96-a38d-4734-8b94-25cddc46b289-client-ca\") pod \"b78bcc96-a38d-4734-8b94-25cddc46b289\" (UID: \"b78bcc96-a38d-4734-8b94-25cddc46b289\") " Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.096892 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.097241 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.097258 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.097267 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.097277 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrrzk\" (UniqueName: 
\"kubernetes.io/projected/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-kube-api-access-wrrzk\") on node \"crc\" DevicePath \"\"" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.097286 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 17 00:24:54 crc kubenswrapper[4755]: E0317 00:24:54.098102 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:54.598086711 +0000 UTC m=+169.357538994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.100537 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b78bcc96-a38d-4734-8b94-25cddc46b289-config" (OuterVolumeSpecName: "config") pod "b78bcc96-a38d-4734-8b94-25cddc46b289" (UID: "b78bcc96-a38d-4734-8b94-25cddc46b289"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.102398 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b78bcc96-a38d-4734-8b94-25cddc46b289-client-ca" (OuterVolumeSpecName: "client-ca") pod "b78bcc96-a38d-4734-8b94-25cddc46b289" (UID: "b78bcc96-a38d-4734-8b94-25cddc46b289"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.105634 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b78bcc96-a38d-4734-8b94-25cddc46b289-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b78bcc96-a38d-4734-8b94-25cddc46b289" (UID: "b78bcc96-a38d-4734-8b94-25cddc46b289"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.108893 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b78bcc96-a38d-4734-8b94-25cddc46b289-kube-api-access-pmmg6" (OuterVolumeSpecName: "kube-api-access-pmmg6") pod "b78bcc96-a38d-4734-8b94-25cddc46b289" (UID: "b78bcc96-a38d-4734-8b94-25cddc46b289"). InnerVolumeSpecName "kube-api-access-pmmg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.128140 4755 patch_prober.go:28] interesting pod/router-default-5444994796-7fxgg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 00:24:54 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 17 00:24:54 crc kubenswrapper[4755]: [+]process-running ok Mar 17 00:24:54 crc kubenswrapper[4755]: healthz check failed Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.128209 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7fxgg" podUID="ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.198113 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.198474 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b78bcc96-a38d-4734-8b94-25cddc46b289-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.198494 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b78bcc96-a38d-4734-8b94-25cddc46b289-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.198506 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmmg6\" (UniqueName: \"kubernetes.io/projected/b78bcc96-a38d-4734-8b94-25cddc46b289-kube-api-access-pmmg6\") on node \"crc\" DevicePath \"\"" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.198516 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b78bcc96-a38d-4734-8b94-25cddc46b289-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:24:54 crc kubenswrapper[4755]: E0317 00:24:54.198603 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:54.698572991 +0000 UTC m=+169.458025294 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.232497 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qghn4" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.301156 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:54 crc kubenswrapper[4755]: E0317 00:24:54.301522 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:54.801510794 +0000 UTC m=+169.560963077 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.349163 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6bvf8"] Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.349461 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6bvf8"] Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.402172 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:54 crc kubenswrapper[4755]: E0317 00:24:54.402560 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:54.902542192 +0000 UTC m=+169.661994475 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.415184 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.503553 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:54 crc kubenswrapper[4755]: E0317 00:24:54.503809 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:55.003798759 +0000 UTC m=+169.763251032 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.556361 4755 ???:1] "http: TLS handshake error from 192.168.126.11:53688: no serving certificate available for the kubelet" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.603750 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dwldz"] Mar 17 00:24:54 crc kubenswrapper[4755]: E0317 00:24:54.603961 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b" containerName="controller-manager" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.603972 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b" containerName="controller-manager" Mar 17 00:24:54 crc kubenswrapper[4755]: E0317 00:24:54.603989 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b78bcc96-a38d-4734-8b94-25cddc46b289" containerName="route-controller-manager" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.603995 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b78bcc96-a38d-4734-8b94-25cddc46b289" containerName="route-controller-manager" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.604074 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b78bcc96-a38d-4734-8b94-25cddc46b289" containerName="route-controller-manager" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.604085 4755 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b" containerName="controller-manager" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.604207 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:54 crc kubenswrapper[4755]: E0317 00:24:54.604352 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:55.104336201 +0000 UTC m=+169.863788484 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.604523 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:54 crc kubenswrapper[4755]: E0317 00:24:54.604814 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:55.104802007 +0000 UTC m=+169.864254290 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.604933 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dwldz" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.608299 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.643814 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dwldz"] Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.709482 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:54 crc kubenswrapper[4755]: E0317 00:24:54.709651 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:55.209626175 +0000 UTC m=+169.969078458 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.709715 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.709794 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btqdp\" (UniqueName: \"kubernetes.io/projected/e1339920-3dec-4332-9749-ec66520252cb-kube-api-access-btqdp\") pod \"community-operators-dwldz\" (UID: \"e1339920-3dec-4332-9749-ec66520252cb\") " pod="openshift-marketplace/community-operators-dwldz" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.709884 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1339920-3dec-4332-9749-ec66520252cb-utilities\") pod \"community-operators-dwldz\" (UID: \"e1339920-3dec-4332-9749-ec66520252cb\") " pod="openshift-marketplace/community-operators-dwldz" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.709917 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e1339920-3dec-4332-9749-ec66520252cb-catalog-content\") pod \"community-operators-dwldz\" (UID: \"e1339920-3dec-4332-9749-ec66520252cb\") " pod="openshift-marketplace/community-operators-dwldz" Mar 17 00:24:54 crc kubenswrapper[4755]: E0317 00:24:54.709991 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:55.209983347 +0000 UTC m=+169.969435630 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.739369 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74dbcdff6-jcppl"] Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.742459 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.748913 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7"] Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.749582 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.750921 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.751156 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.751273 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.751753 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.751822 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.754325 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.755986 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.758840 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7"] Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.761393 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74dbcdff6-jcppl"] Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.784333 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m967j"] Mar 17 00:24:54 crc 
kubenswrapper[4755]: I0317 00:24:54.785234 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m967j" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.791031 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.803407 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m967j"] Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.813252 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:54 crc kubenswrapper[4755]: E0317 00:24:54.813385 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:55.313360167 +0000 UTC m=+170.072812450 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.813763 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1339920-3dec-4332-9749-ec66520252cb-utilities\") pod \"community-operators-dwldz\" (UID: \"e1339920-3dec-4332-9749-ec66520252cb\") " pod="openshift-marketplace/community-operators-dwldz" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.813795 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jzqv\" (UniqueName: \"kubernetes.io/projected/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-kube-api-access-4jzqv\") pod \"controller-manager-74dbcdff6-jcppl\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.813830 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-config\") pod \"route-controller-manager-698ff8b74-z89z7\" (UID: \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.813854 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1339920-3dec-4332-9749-ec66520252cb-catalog-content\") pod 
\"community-operators-dwldz\" (UID: \"e1339920-3dec-4332-9749-ec66520252cb\") " pod="openshift-marketplace/community-operators-dwldz" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.813911 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-config\") pod \"controller-manager-74dbcdff6-jcppl\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.813931 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-client-ca\") pod \"controller-manager-74dbcdff6-jcppl\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.814195 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-serving-cert\") pod \"route-controller-manager-698ff8b74-z89z7\" (UID: \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.814218 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.814270 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdqh7\" (UniqueName: \"kubernetes.io/projected/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-kube-api-access-hdqh7\") pod \"route-controller-manager-698ff8b74-z89z7\" (UID: \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.814294 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-client-ca\") pod \"route-controller-manager-698ff8b74-z89z7\" (UID: \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:24:54 crc kubenswrapper[4755]: E0317 00:24:54.814563 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:55.314552998 +0000 UTC m=+170.074005281 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.814704 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btqdp\" (UniqueName: \"kubernetes.io/projected/e1339920-3dec-4332-9749-ec66520252cb-kube-api-access-btqdp\") pod \"community-operators-dwldz\" (UID: \"e1339920-3dec-4332-9749-ec66520252cb\") " pod="openshift-marketplace/community-operators-dwldz" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.814726 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-proxy-ca-bundles\") pod \"controller-manager-74dbcdff6-jcppl\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.814886 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1339920-3dec-4332-9749-ec66520252cb-catalog-content\") pod \"community-operators-dwldz\" (UID: \"e1339920-3dec-4332-9749-ec66520252cb\") " pod="openshift-marketplace/community-operators-dwldz" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.815283 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1339920-3dec-4332-9749-ec66520252cb-utilities\") pod \"community-operators-dwldz\" (UID: 
\"e1339920-3dec-4332-9749-ec66520252cb\") " pod="openshift-marketplace/community-operators-dwldz" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.815332 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-serving-cert\") pod \"controller-manager-74dbcdff6-jcppl\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.825542 4755 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.846932 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btqdp\" (UniqueName: \"kubernetes.io/projected/e1339920-3dec-4332-9749-ec66520252cb-kube-api-access-btqdp\") pod \"community-operators-dwldz\" (UID: \"e1339920-3dec-4332-9749-ec66520252cb\") " pod="openshift-marketplace/community-operators-dwldz" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.916343 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.916568 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-proxy-ca-bundles\") pod \"controller-manager-74dbcdff6-jcppl\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 
00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.916591 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-serving-cert\") pod \"controller-manager-74dbcdff6-jcppl\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.916635 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be55626c-4d34-4b09-83b0-897cd661216a-utilities\") pod \"certified-operators-m967j\" (UID: \"be55626c-4d34-4b09-83b0-897cd661216a\") " pod="openshift-marketplace/certified-operators-m967j" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.916660 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jzqv\" (UniqueName: \"kubernetes.io/projected/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-kube-api-access-4jzqv\") pod \"controller-manager-74dbcdff6-jcppl\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.916682 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-config\") pod \"route-controller-manager-698ff8b74-z89z7\" (UID: \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.916696 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be55626c-4d34-4b09-83b0-897cd661216a-catalog-content\") pod \"certified-operators-m967j\" (UID: 
\"be55626c-4d34-4b09-83b0-897cd661216a\") " pod="openshift-marketplace/certified-operators-m967j" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.916710 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s4wc\" (UniqueName: \"kubernetes.io/projected/be55626c-4d34-4b09-83b0-897cd661216a-kube-api-access-5s4wc\") pod \"certified-operators-m967j\" (UID: \"be55626c-4d34-4b09-83b0-897cd661216a\") " pod="openshift-marketplace/certified-operators-m967j" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.916733 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-config\") pod \"controller-manager-74dbcdff6-jcppl\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.916749 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-client-ca\") pod \"controller-manager-74dbcdff6-jcppl\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.916782 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-serving-cert\") pod \"route-controller-manager-698ff8b74-z89z7\" (UID: \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.916822 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdqh7\" (UniqueName: 
\"kubernetes.io/projected/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-kube-api-access-hdqh7\") pod \"route-controller-manager-698ff8b74-z89z7\" (UID: \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.916851 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-client-ca\") pod \"route-controller-manager-698ff8b74-z89z7\" (UID: \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.917662 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-client-ca\") pod \"route-controller-manager-698ff8b74-z89z7\" (UID: \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:24:54 crc kubenswrapper[4755]: E0317 00:24:54.917734 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:55.417721459 +0000 UTC m=+170.177173742 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.918799 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dwldz" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.919785 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-proxy-ca-bundles\") pod \"controller-manager-74dbcdff6-jcppl\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.920752 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-config\") pod \"controller-manager-74dbcdff6-jcppl\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.920943 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-client-ca\") pod \"controller-manager-74dbcdff6-jcppl\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.921069 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-config\") pod \"route-controller-manager-698ff8b74-z89z7\" (UID: \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.925270 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-serving-cert\") pod \"route-controller-manager-698ff8b74-z89z7\" (UID: \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.930645 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-serving-cert\") pod \"controller-manager-74dbcdff6-jcppl\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.934804 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdqh7\" (UniqueName: \"kubernetes.io/projected/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-kube-api-access-hdqh7\") pod \"route-controller-manager-698ff8b74-z89z7\" (UID: \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\") " pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.938277 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jzqv\" (UniqueName: \"kubernetes.io/projected/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-kube-api-access-4jzqv\") pod \"controller-manager-74dbcdff6-jcppl\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:54 crc 
kubenswrapper[4755]: I0317 00:24:54.981531 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ct6rl"] Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.982463 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ct6rl" Mar 17 00:24:54 crc kubenswrapper[4755]: I0317 00:24:54.991738 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ct6rl"] Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.002846 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" event={"ID":"1347078a-8c1c-4e33-aee1-8aaad00829f2","Type":"ContainerStarted","Data":"87b6bff636b46ba7708698facb3763f32400731f19c6de3f85e954b388ff6f1a"} Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.002890 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" event={"ID":"1347078a-8c1c-4e33-aee1-8aaad00829f2","Type":"ContainerStarted","Data":"271f0e43b967a105763ca118e7bdbfd47a98dc7acf17dbd19fce11a36bcaae47"} Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.006451 4755 generic.go:334] "Generic (PLEG): container finished" podID="cfcd13b3-4ede-42eb-8b04-b2d572f7f64c" containerID="70e4c1def5018c3b2f625393e2a282b6b3c37f06d9f5aa26002edce2d6f2d7e1" exitCode=0 Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.006529 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" event={"ID":"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c","Type":"ContainerDied","Data":"70e4c1def5018c3b2f625393e2a282b6b3c37f06d9f5aa26002edce2d6f2d7e1"} Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.014339 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" 
event={"ID":"b78bcc96-a38d-4734-8b94-25cddc46b289","Type":"ContainerDied","Data":"dfb37be85a662dd1f92fb5c50eae01d031b7d4f32345288cd9193c6a061eed20"} Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.014389 4755 scope.go:117] "RemoveContainer" containerID="1ca385a8f40d698a02a08560401c13b537c243c8fdb2ec16501fa1e1aaa88ba2" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.014519 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.018190 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be55626c-4d34-4b09-83b0-897cd661216a-utilities\") pod \"certified-operators-m967j\" (UID: \"be55626c-4d34-4b09-83b0-897cd661216a\") " pod="openshift-marketplace/certified-operators-m967j" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.018238 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be55626c-4d34-4b09-83b0-897cd661216a-catalog-content\") pod \"certified-operators-m967j\" (UID: \"be55626c-4d34-4b09-83b0-897cd661216a\") " pod="openshift-marketplace/certified-operators-m967j" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.018275 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s4wc\" (UniqueName: \"kubernetes.io/projected/be55626c-4d34-4b09-83b0-897cd661216a-kube-api-access-5s4wc\") pod \"certified-operators-m967j\" (UID: \"be55626c-4d34-4b09-83b0-897cd661216a\") " pod="openshift-marketplace/certified-operators-m967j" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.018353 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:55 crc kubenswrapper[4755]: E0317 00:24:55.021638 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:55.521619946 +0000 UTC m=+170.281072229 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.022179 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be55626c-4d34-4b09-83b0-897cd661216a-utilities\") pod \"certified-operators-m967j\" (UID: \"be55626c-4d34-4b09-83b0-897cd661216a\") " pod="openshift-marketplace/certified-operators-m967j" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.022383 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be55626c-4d34-4b09-83b0-897cd661216a-catalog-content\") pod \"certified-operators-m967j\" (UID: \"be55626c-4d34-4b09-83b0-897cd661216a\") " pod="openshift-marketplace/certified-operators-m967j" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.029902 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" event={"ID":"1c0bace3-bc39-44fa-9f39-d8f18c07675d","Type":"ContainerStarted","Data":"3defc82eaffbae98e661538e9f1c16728911cb56d07bc6f0032d5915fa9b5dc9"} Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.029968 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.037199 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.039836 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj"] Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.042677 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s4wc\" (UniqueName: \"kubernetes.io/projected/be55626c-4d34-4b09-83b0-897cd661216a-kube-api-access-5s4wc\") pod \"certified-operators-m967j\" (UID: \"be55626c-4d34-4b09-83b0-897cd661216a\") " pod="openshift-marketplace/certified-operators-m967j" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.045638 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tjxpj"] Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.061632 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" podStartSLOduration=109.061616039 podStartE2EDuration="1m49.061616039s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:55.059995694 +0000 UTC m=+169.819447997" watchObservedRunningTime="2026-03-17 
00:24:55.061616039 +0000 UTC m=+169.821068322" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.067158 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.086766 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.100010 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m967j" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.115404 4755 patch_prober.go:28] interesting pod/router-default-5444994796-7fxgg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 00:24:55 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 17 00:24:55 crc kubenswrapper[4755]: [+]process-running ok Mar 17 00:24:55 crc kubenswrapper[4755]: healthz check failed Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.115463 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7fxgg" podUID="ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.120549 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:55 crc kubenswrapper[4755]: E0317 00:24:55.120763 4755 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:55.620735998 +0000 UTC m=+170.380188281 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.120974 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h8bd\" (UniqueName: \"kubernetes.io/projected/a23b6da4-12f1-4104-93ba-dfc06d3572aa-kube-api-access-2h8bd\") pod \"community-operators-ct6rl\" (UID: \"a23b6da4-12f1-4104-93ba-dfc06d3572aa\") " pod="openshift-marketplace/community-operators-ct6rl" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.121031 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.121209 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23b6da4-12f1-4104-93ba-dfc06d3572aa-utilities\") pod \"community-operators-ct6rl\" (UID: \"a23b6da4-12f1-4104-93ba-dfc06d3572aa\") " 
pod="openshift-marketplace/community-operators-ct6rl" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.121543 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23b6da4-12f1-4104-93ba-dfc06d3572aa-catalog-content\") pod \"community-operators-ct6rl\" (UID: \"a23b6da4-12f1-4104-93ba-dfc06d3572aa\") " pod="openshift-marketplace/community-operators-ct6rl" Mar 17 00:24:55 crc kubenswrapper[4755]: E0317 00:24:55.121895 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:55.621879168 +0000 UTC m=+170.381331451 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.178150 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v9fxv"] Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.179209 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v9fxv" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.189784 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v9fxv"] Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.222899 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:55 crc kubenswrapper[4755]: E0317 00:24:55.223082 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:55.723056371 +0000 UTC m=+170.482508654 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.223353 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1089c661-4f7b-40e2-8549-983b06c1409a-utilities\") pod \"certified-operators-v9fxv\" (UID: \"1089c661-4f7b-40e2-8549-983b06c1409a\") " pod="openshift-marketplace/certified-operators-v9fxv" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.223398 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpd28\" (UniqueName: \"kubernetes.io/projected/1089c661-4f7b-40e2-8549-983b06c1409a-kube-api-access-bpd28\") pod \"certified-operators-v9fxv\" (UID: \"1089c661-4f7b-40e2-8549-983b06c1409a\") " pod="openshift-marketplace/certified-operators-v9fxv" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.223740 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h8bd\" (UniqueName: \"kubernetes.io/projected/a23b6da4-12f1-4104-93ba-dfc06d3572aa-kube-api-access-2h8bd\") pod \"community-operators-ct6rl\" (UID: \"a23b6da4-12f1-4104-93ba-dfc06d3572aa\") " pod="openshift-marketplace/community-operators-ct6rl" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.223819 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.223955 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23b6da4-12f1-4104-93ba-dfc06d3572aa-utilities\") pod \"community-operators-ct6rl\" (UID: \"a23b6da4-12f1-4104-93ba-dfc06d3572aa\") " pod="openshift-marketplace/community-operators-ct6rl" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.223999 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1089c661-4f7b-40e2-8549-983b06c1409a-catalog-content\") pod \"certified-operators-v9fxv\" (UID: \"1089c661-4f7b-40e2-8549-983b06c1409a\") " pod="openshift-marketplace/certified-operators-v9fxv" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.224038 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23b6da4-12f1-4104-93ba-dfc06d3572aa-catalog-content\") pod \"community-operators-ct6rl\" (UID: \"a23b6da4-12f1-4104-93ba-dfc06d3572aa\") " pod="openshift-marketplace/community-operators-ct6rl" Mar 17 00:24:55 crc kubenswrapper[4755]: E0317 00:24:55.224157 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:55.724146349 +0000 UTC m=+170.483598632 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.224526 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23b6da4-12f1-4104-93ba-dfc06d3572aa-utilities\") pod \"community-operators-ct6rl\" (UID: \"a23b6da4-12f1-4104-93ba-dfc06d3572aa\") " pod="openshift-marketplace/community-operators-ct6rl" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.226717 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23b6da4-12f1-4104-93ba-dfc06d3572aa-catalog-content\") pod \"community-operators-ct6rl\" (UID: \"a23b6da4-12f1-4104-93ba-dfc06d3572aa\") " pod="openshift-marketplace/community-operators-ct6rl" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.242842 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h8bd\" (UniqueName: \"kubernetes.io/projected/a23b6da4-12f1-4104-93ba-dfc06d3572aa-kube-api-access-2h8bd\") pod \"community-operators-ct6rl\" (UID: \"a23b6da4-12f1-4104-93ba-dfc06d3572aa\") " pod="openshift-marketplace/community-operators-ct6rl" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.298307 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ct6rl" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.325474 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:55 crc kubenswrapper[4755]: E0317 00:24:55.325580 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:55.825563491 +0000 UTC m=+170.585015774 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.325826 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1089c661-4f7b-40e2-8549-983b06c1409a-catalog-content\") pod \"certified-operators-v9fxv\" (UID: \"1089c661-4f7b-40e2-8549-983b06c1409a\") " pod="openshift-marketplace/certified-operators-v9fxv" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.325911 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1089c661-4f7b-40e2-8549-983b06c1409a-utilities\") pod \"certified-operators-v9fxv\" 
(UID: \"1089c661-4f7b-40e2-8549-983b06c1409a\") " pod="openshift-marketplace/certified-operators-v9fxv" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.325939 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpd28\" (UniqueName: \"kubernetes.io/projected/1089c661-4f7b-40e2-8549-983b06c1409a-kube-api-access-bpd28\") pod \"certified-operators-v9fxv\" (UID: \"1089c661-4f7b-40e2-8549-983b06c1409a\") " pod="openshift-marketplace/certified-operators-v9fxv" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.325978 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:55 crc kubenswrapper[4755]: E0317 00:24:55.326209 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 00:24:55.826202003 +0000 UTC m=+170.585654286 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hg2fb" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.326647 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1089c661-4f7b-40e2-8549-983b06c1409a-catalog-content\") pod \"certified-operators-v9fxv\" (UID: \"1089c661-4f7b-40e2-8549-983b06c1409a\") " pod="openshift-marketplace/certified-operators-v9fxv" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.326863 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1089c661-4f7b-40e2-8549-983b06c1409a-utilities\") pod \"certified-operators-v9fxv\" (UID: \"1089c661-4f7b-40e2-8549-983b06c1409a\") " pod="openshift-marketplace/certified-operators-v9fxv" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.343224 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpd28\" (UniqueName: \"kubernetes.io/projected/1089c661-4f7b-40e2-8549-983b06c1409a-kube-api-access-bpd28\") pod \"certified-operators-v9fxv\" (UID: \"1089c661-4f7b-40e2-8549-983b06c1409a\") " pod="openshift-marketplace/certified-operators-v9fxv" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.399613 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dwldz"] Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.427087 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:55 crc kubenswrapper[4755]: E0317 00:24:55.427482 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 00:24:55.927464769 +0000 UTC m=+170.686917052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.430518 4755 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-17T00:24:54.825571746Z","Handler":null,"Name":""} Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.434935 4755 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.434973 4755 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.446764 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 
17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.447480 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.459154 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.459247 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.475939 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.522005 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v9fxv" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.531218 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a022b5d-835e-4615-8438-4a837c38132f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6a022b5d-835e-4615-8438-4a837c38132f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.531546 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.531779 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/6a022b5d-835e-4615-8438-4a837c38132f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6a022b5d-835e-4615-8438-4a837c38132f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.534720 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m967j"] Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.535739 4755 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.535777 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:55 crc kubenswrapper[4755]: W0317 00:24:55.539755 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe55626c_4d34_4b09_83b0_897cd661216a.slice/crio-be322b4f8f6db9468e86441bb73a57ec43d85c2302c1e83fe1396337b4f667d7 WatchSource:0}: Error finding container be322b4f8f6db9468e86441bb73a57ec43d85c2302c1e83fe1396337b4f667d7: Status 404 returned error can't find the container with id be322b4f8f6db9468e86441bb73a57ec43d85c2302c1e83fe1396337b4f667d7 Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.540249 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74dbcdff6-jcppl"] Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.560212 
4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hg2fb\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") " pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.609883 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7"] Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.618577 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.633205 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.633477 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a022b5d-835e-4615-8438-4a837c38132f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6a022b5d-835e-4615-8438-4a837c38132f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.633545 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a022b5d-835e-4615-8438-4a837c38132f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6a022b5d-835e-4615-8438-4a837c38132f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 00:24:55 crc kubenswrapper[4755]: 
I0317 00:24:55.633623 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a022b5d-835e-4615-8438-4a837c38132f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6a022b5d-835e-4615-8438-4a837c38132f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.644834 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.651597 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a022b5d-835e-4615-8438-4a837c38132f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6a022b5d-835e-4615-8438-4a837c38132f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.715091 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ct6rl"] Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.753906 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v9fxv"] Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.774722 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 00:24:55 crc kubenswrapper[4755]: I0317 00:24:55.928758 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hg2fb"] Mar 17 00:24:55 crc kubenswrapper[4755]: W0317 00:24:55.984668 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79a4ec38_af00_43b6_bd22_d3ff75e52d71.slice/crio-91095fea73cf220a693002c8a8b9db3510f3f4da47abc5183096c5bf33b91f8a WatchSource:0}: Error finding container 91095fea73cf220a693002c8a8b9db3510f3f4da47abc5183096c5bf33b91f8a: Status 404 returned error can't find the container with id 91095fea73cf220a693002c8a8b9db3510f3f4da47abc5183096c5bf33b91f8a Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.071156 4755 generic.go:334] "Generic (PLEG): container finished" podID="be55626c-4d34-4b09-83b0-897cd661216a" containerID="6265e2633ed9bf3d525ef3ba1fdb94d776a87a7139953df97cab4279f73026ec" exitCode=0 Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.071272 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m967j" event={"ID":"be55626c-4d34-4b09-83b0-897cd661216a","Type":"ContainerDied","Data":"6265e2633ed9bf3d525ef3ba1fdb94d776a87a7139953df97cab4279f73026ec"} Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.071316 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m967j" event={"ID":"be55626c-4d34-4b09-83b0-897cd661216a","Type":"ContainerStarted","Data":"be322b4f8f6db9468e86441bb73a57ec43d85c2302c1e83fe1396337b4f667d7"} Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.087647 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v9fxv" 
event={"ID":"1089c661-4f7b-40e2-8549-983b06c1409a","Type":"ContainerStarted","Data":"b74023147a2a8d0c12bd5643bd1b221a7ac6835632315149b323ec6c65902259"} Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.095546 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.113127 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" event={"ID":"1347078a-8c1c-4e33-aee1-8aaad00829f2","Type":"ContainerStarted","Data":"df04ca1f24091308e4c7eabf2822b01ab5ff2f0c57dc64f82d9b077c3a0ff811"} Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.117978 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" event={"ID":"79a4ec38-af00-43b6-bd22-d3ff75e52d71","Type":"ContainerStarted","Data":"91095fea73cf220a693002c8a8b9db3510f3f4da47abc5183096c5bf33b91f8a"} Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.120823 4755 patch_prober.go:28] interesting pod/router-default-5444994796-7fxgg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 00:24:56 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 17 00:24:56 crc kubenswrapper[4755]: [+]process-running ok Mar 17 00:24:56 crc kubenswrapper[4755]: healthz check failed Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.120866 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7fxgg" podUID="ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.121322 4755 generic.go:334] "Generic (PLEG): container finished" podID="e1339920-3dec-4332-9749-ec66520252cb" 
containerID="2bc8fda5849d44dd5480472677c6baf9fc7db608c976d1c97bef6a5d425b4c20" exitCode=0 Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.121403 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwldz" event={"ID":"e1339920-3dec-4332-9749-ec66520252cb","Type":"ContainerDied","Data":"2bc8fda5849d44dd5480472677c6baf9fc7db608c976d1c97bef6a5d425b4c20"} Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.121432 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwldz" event={"ID":"e1339920-3dec-4332-9749-ec66520252cb","Type":"ContainerStarted","Data":"72a363294e8a77273c5f3716c52af17809feee439b2ec7a15a83c590aa0e935d"} Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.123372 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" event={"ID":"d4b08294-0e03-43e4-992b-2ce2de6f3b0d","Type":"ContainerStarted","Data":"53f21d1debb2489327ecdc9f47e761cc5bc1e1811d650204b17d21716d67f19e"} Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.123399 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" event={"ID":"d4b08294-0e03-43e4-992b-2ce2de6f3b0d","Type":"ContainerStarted","Data":"a243de40e56b36db383b3edd0dbd4777551621eb10f23fe5a70c865dc58f6f28"} Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.125086 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.145093 4755 generic.go:334] "Generic (PLEG): container finished" podID="a23b6da4-12f1-4104-93ba-dfc06d3572aa" containerID="f3f9ca88d6951800e90cf2ff5c01652c87a43730b5aefcb7866b79c95d95a7e7" exitCode=0 Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.145194 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ct6rl" event={"ID":"a23b6da4-12f1-4104-93ba-dfc06d3572aa","Type":"ContainerDied","Data":"f3f9ca88d6951800e90cf2ff5c01652c87a43730b5aefcb7866b79c95d95a7e7"} Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.145225 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ct6rl" event={"ID":"a23b6da4-12f1-4104-93ba-dfc06d3572aa","Type":"ContainerStarted","Data":"fd136e1a7396c88bc324998bfee96e462350cc777ad0a5a1d35b1a03f938b441"} Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.150794 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-d2qpt" podStartSLOduration=11.150779011 podStartE2EDuration="11.150779011s" podCreationTimestamp="2026-03-17 00:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:56.142236568 +0000 UTC m=+170.901688851" watchObservedRunningTime="2026-03-17 00:24:56.150779011 +0000 UTC m=+170.910231294" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.153994 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" event={"ID":"1a5da40d-fd23-4d0b-acdc-b446e6df9b26","Type":"ContainerStarted","Data":"3c00459cd82c503abc641d29d335de0d72c85b6b1fc799f099951c80aa5db620"} Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.154025 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.154033 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" 
event={"ID":"1a5da40d-fd23-4d0b-acdc-b446e6df9b26","Type":"ContainerStarted","Data":"85557773208bb56aaa542d8260c07b511e8b9e31bd7c29b5d76af214f7d58eab"} Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.163849 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.184899 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" podStartSLOduration=3.184881411 podStartE2EDuration="3.184881411s" podCreationTimestamp="2026-03-17 00:24:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:56.183930859 +0000 UTC m=+170.943383142" watchObservedRunningTime="2026-03-17 00:24:56.184881411 +0000 UTC m=+170.944333694" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.272452 4755 scope.go:117] "RemoveContainer" containerID="f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1" Mar 17 00:24:56 crc kubenswrapper[4755]: E0317 00:24:56.272950 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.297601 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" podStartSLOduration=3.297576581 podStartE2EDuration="3.297576581s" podCreationTimestamp="2026-03-17 00:24:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:56.260750267 +0000 UTC m=+171.020202550" watchObservedRunningTime="2026-03-17 00:24:56.297576581 +0000 UTC m=+171.057028864" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.333918 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.334601 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b78bcc96-a38d-4734-8b94-25cddc46b289" path="/var/lib/kubelet/pods/b78bcc96-a38d-4734-8b94-25cddc46b289/volumes" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.335467 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b" path="/var/lib/kubelet/pods/ca30bd23-64c8-4399-bb0d-cbe5a92a2b5b/volumes" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.409744 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.524708 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.530970 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.533808 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-lc64n" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.585068 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pvz7c"] Mar 17 00:24:56 crc kubenswrapper[4755]: E0317 00:24:56.585522 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfcd13b3-4ede-42eb-8b04-b2d572f7f64c" containerName="collect-profiles" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.585592 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfcd13b3-4ede-42eb-8b04-b2d572f7f64c" containerName="collect-profiles" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.585819 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfcd13b3-4ede-42eb-8b04-b2d572f7f64c" containerName="collect-profiles" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.586614 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvz7c" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.589086 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.626323 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvz7c"] Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.663185 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-secret-volume\") pod \"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c\" (UID: \"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c\") " Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.663282 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-config-volume\") pod \"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c\" (UID: \"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c\") " Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.663366 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwflc\" (UniqueName: \"kubernetes.io/projected/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-kube-api-access-qwflc\") pod \"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c\" (UID: \"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c\") " Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.663649 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3857bbbe-1aa6-43d2-94e4-15f23929ac60-catalog-content\") pod \"redhat-marketplace-pvz7c\" (UID: \"3857bbbe-1aa6-43d2-94e4-15f23929ac60\") " pod="openshift-marketplace/redhat-marketplace-pvz7c" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.663764 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3857bbbe-1aa6-43d2-94e4-15f23929ac60-utilities\") pod \"redhat-marketplace-pvz7c\" (UID: \"3857bbbe-1aa6-43d2-94e4-15f23929ac60\") " pod="openshift-marketplace/redhat-marketplace-pvz7c" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.663850 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw768\" (UniqueName: \"kubernetes.io/projected/3857bbbe-1aa6-43d2-94e4-15f23929ac60-kube-api-access-gw768\") pod \"redhat-marketplace-pvz7c\" (UID: \"3857bbbe-1aa6-43d2-94e4-15f23929ac60\") " pod="openshift-marketplace/redhat-marketplace-pvz7c" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.669658 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-config-volume" (OuterVolumeSpecName: "config-volume") pod "cfcd13b3-4ede-42eb-8b04-b2d572f7f64c" (UID: "cfcd13b3-4ede-42eb-8b04-b2d572f7f64c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.677004 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-kube-api-access-qwflc" (OuterVolumeSpecName: "kube-api-access-qwflc") pod "cfcd13b3-4ede-42eb-8b04-b2d572f7f64c" (UID: "cfcd13b3-4ede-42eb-8b04-b2d572f7f64c"). InnerVolumeSpecName "kube-api-access-qwflc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.698336 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cfcd13b3-4ede-42eb-8b04-b2d572f7f64c" (UID: "cfcd13b3-4ede-42eb-8b04-b2d572f7f64c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.767785 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw768\" (UniqueName: \"kubernetes.io/projected/3857bbbe-1aa6-43d2-94e4-15f23929ac60-kube-api-access-gw768\") pod \"redhat-marketplace-pvz7c\" (UID: \"3857bbbe-1aa6-43d2-94e4-15f23929ac60\") " pod="openshift-marketplace/redhat-marketplace-pvz7c" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.767930 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3857bbbe-1aa6-43d2-94e4-15f23929ac60-catalog-content\") pod \"redhat-marketplace-pvz7c\" (UID: \"3857bbbe-1aa6-43d2-94e4-15f23929ac60\") " pod="openshift-marketplace/redhat-marketplace-pvz7c" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.768003 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3857bbbe-1aa6-43d2-94e4-15f23929ac60-utilities\") pod \"redhat-marketplace-pvz7c\" (UID: \"3857bbbe-1aa6-43d2-94e4-15f23929ac60\") " pod="openshift-marketplace/redhat-marketplace-pvz7c" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.768047 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 
00:24:56.768059 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwflc\" (UniqueName: \"kubernetes.io/projected/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-kube-api-access-qwflc\") on node \"crc\" DevicePath \"\"" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.768070 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.768470 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3857bbbe-1aa6-43d2-94e4-15f23929ac60-utilities\") pod \"redhat-marketplace-pvz7c\" (UID: \"3857bbbe-1aa6-43d2-94e4-15f23929ac60\") " pod="openshift-marketplace/redhat-marketplace-pvz7c" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.768603 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3857bbbe-1aa6-43d2-94e4-15f23929ac60-catalog-content\") pod \"redhat-marketplace-pvz7c\" (UID: \"3857bbbe-1aa6-43d2-94e4-15f23929ac60\") " pod="openshift-marketplace/redhat-marketplace-pvz7c" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.787944 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw768\" (UniqueName: \"kubernetes.io/projected/3857bbbe-1aa6-43d2-94e4-15f23929ac60-kube-api-access-gw768\") pod \"redhat-marketplace-pvz7c\" (UID: \"3857bbbe-1aa6-43d2-94e4-15f23929ac60\") " pod="openshift-marketplace/redhat-marketplace-pvz7c" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.912886 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvz7c" Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.979680 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-frcfc"] Mar 17 00:24:56 crc kubenswrapper[4755]: I0317 00:24:56.980810 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frcfc" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.005126 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-frcfc"] Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.072013 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e74213a-9bd5-440c-b207-5218feab7323-utilities\") pod \"redhat-marketplace-frcfc\" (UID: \"0e74213a-9bd5-440c-b207-5218feab7323\") " pod="openshift-marketplace/redhat-marketplace-frcfc" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.072356 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e74213a-9bd5-440c-b207-5218feab7323-catalog-content\") pod \"redhat-marketplace-frcfc\" (UID: \"0e74213a-9bd5-440c-b207-5218feab7323\") " pod="openshift-marketplace/redhat-marketplace-frcfc" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.072393 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx958\" (UniqueName: \"kubernetes.io/projected/0e74213a-9bd5-440c-b207-5218feab7323-kube-api-access-fx958\") pod \"redhat-marketplace-frcfc\" (UID: \"0e74213a-9bd5-440c-b207-5218feab7323\") " pod="openshift-marketplace/redhat-marketplace-frcfc" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.126025 4755 patch_prober.go:28] interesting pod/router-default-5444994796-7fxgg 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 00:24:57 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 17 00:24:57 crc kubenswrapper[4755]: [+]process-running ok Mar 17 00:24:57 crc kubenswrapper[4755]: healthz check failed Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.126069 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7fxgg" podUID="ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.160388 4755 ???:1] "http: TLS handshake error from 192.168.126.11:53704: no serving certificate available for the kubelet" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.169107 4755 generic.go:334] "Generic (PLEG): container finished" podID="1089c661-4f7b-40e2-8549-983b06c1409a" containerID="f5be45293c8f2d671fd5d9c6c6b495b2134b484271d57fa7dee93a0dffd99000" exitCode=0 Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.169219 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v9fxv" event={"ID":"1089c661-4f7b-40e2-8549-983b06c1409a","Type":"ContainerDied","Data":"f5be45293c8f2d671fd5d9c6c6b495b2134b484271d57fa7dee93a0dffd99000"} Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.174387 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" event={"ID":"79a4ec38-af00-43b6-bd22-d3ff75e52d71","Type":"ContainerStarted","Data":"449358e39530725502f9c8201a87f28c32dec55551e1a8375d2689f2fb125a5e"} Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.175520 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:24:57 crc 
kubenswrapper[4755]: I0317 00:24:57.176377 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e74213a-9bd5-440c-b207-5218feab7323-catalog-content\") pod \"redhat-marketplace-frcfc\" (UID: \"0e74213a-9bd5-440c-b207-5218feab7323\") " pod="openshift-marketplace/redhat-marketplace-frcfc" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.176417 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx958\" (UniqueName: \"kubernetes.io/projected/0e74213a-9bd5-440c-b207-5218feab7323-kube-api-access-fx958\") pod \"redhat-marketplace-frcfc\" (UID: \"0e74213a-9bd5-440c-b207-5218feab7323\") " pod="openshift-marketplace/redhat-marketplace-frcfc" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.176492 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e74213a-9bd5-440c-b207-5218feab7323-utilities\") pod \"redhat-marketplace-frcfc\" (UID: \"0e74213a-9bd5-440c-b207-5218feab7323\") " pod="openshift-marketplace/redhat-marketplace-frcfc" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.176901 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e74213a-9bd5-440c-b207-5218feab7323-utilities\") pod \"redhat-marketplace-frcfc\" (UID: \"0e74213a-9bd5-440c-b207-5218feab7323\") " pod="openshift-marketplace/redhat-marketplace-frcfc" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.177195 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e74213a-9bd5-440c-b207-5218feab7323-catalog-content\") pod \"redhat-marketplace-frcfc\" (UID: \"0e74213a-9bd5-440c-b207-5218feab7323\") " pod="openshift-marketplace/redhat-marketplace-frcfc" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.188152 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6a022b5d-835e-4615-8438-4a837c38132f","Type":"ContainerStarted","Data":"7521dd06ba85cac57b77637d83041ded77966dc58aa8f466e412439f5cfbf4b2"} Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.188938 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6a022b5d-835e-4615-8438-4a837c38132f","Type":"ContainerStarted","Data":"528da09f6fd4c117915515c14d28e648f0492355062eccb5107e3df91d276533"} Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.194515 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" podStartSLOduration=111.194497942 podStartE2EDuration="1m51.194497942s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:24:57.194386938 +0000 UTC m=+171.953839231" watchObservedRunningTime="2026-03-17 00:24:57.194497942 +0000 UTC m=+171.953950225" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.207776 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx958\" (UniqueName: \"kubernetes.io/projected/0e74213a-9bd5-440c-b207-5218feab7323-kube-api-access-fx958\") pod \"redhat-marketplace-frcfc\" (UID: \"0e74213a-9bd5-440c-b207-5218feab7323\") " pod="openshift-marketplace/redhat-marketplace-frcfc" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.229277 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" event={"ID":"cfcd13b3-4ede-42eb-8b04-b2d572f7f64c","Type":"ContainerDied","Data":"bdca036085df50bb592873f4f9e4b6b2c6f7539089889ae64ddb1b8f241345b3"} Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.229631 4755 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdca036085df50bb592873f4f9e4b6b2c6f7539089889ae64ddb1b8f241345b3" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.247158 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.282285 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvz7c"] Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.292054 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4wtw" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.311019 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frcfc" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.642234 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-8gjxd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.642567 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8gjxd" podUID="64e28e17-1dd0-401e-9b26-7eefc1c54f5f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.642312 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-8gjxd container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 17 00:24:57 crc 
kubenswrapper[4755]: I0317 00:24:57.642905 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8gjxd" podUID="64e28e17-1dd0-401e-9b26-7eefc1c54f5f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.712693 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-frcfc"] Mar 17 00:24:57 crc kubenswrapper[4755]: W0317 00:24:57.744350 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e74213a_9bd5_440c_b207_5218feab7323.slice/crio-2ce7c7ce1620105542aa193e38515de933f2778cfd24c6a12eab3db9e40270be WatchSource:0}: Error finding container 2ce7c7ce1620105542aa193e38515de933f2778cfd24c6a12eab3db9e40270be: Status 404 returned error can't find the container with id 2ce7c7ce1620105542aa193e38515de933f2778cfd24c6a12eab3db9e40270be Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.968114 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.969243 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.973073 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.974101 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.981139 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.983422 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5ctv9"] Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.984758 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ctv9" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.988017 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.990937 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8138043e-c956-468a-8b5d-4627d3d4a8ea-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8138043e-c956-468a-8b5d-4627d3d4a8ea\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 00:24:57 crc kubenswrapper[4755]: I0317 00:24:57.991245 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8138043e-c956-468a-8b5d-4627d3d4a8ea-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8138043e-c956-468a-8b5d-4627d3d4a8ea\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.003381 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5ctv9"] Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.092941 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8138043e-c956-468a-8b5d-4627d3d4a8ea-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8138043e-c956-468a-8b5d-4627d3d4a8ea\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.093043 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zttl\" (UniqueName: \"kubernetes.io/projected/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-kube-api-access-5zttl\") pod \"redhat-operators-5ctv9\" (UID: \"b20a58b5-4b64-4d7b-b9c2-c6170d75878e\") " pod="openshift-marketplace/redhat-operators-5ctv9" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.093066 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8138043e-c956-468a-8b5d-4627d3d4a8ea-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8138043e-c956-468a-8b5d-4627d3d4a8ea\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.093096 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-utilities\") pod \"redhat-operators-5ctv9\" (UID: \"b20a58b5-4b64-4d7b-b9c2-c6170d75878e\") " pod="openshift-marketplace/redhat-operators-5ctv9" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.093153 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-catalog-content\") pod \"redhat-operators-5ctv9\" (UID: \"b20a58b5-4b64-4d7b-b9c2-c6170d75878e\") " pod="openshift-marketplace/redhat-operators-5ctv9" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.093233 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8138043e-c956-468a-8b5d-4627d3d4a8ea-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8138043e-c956-468a-8b5d-4627d3d4a8ea\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.110373 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-7fxgg" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.113816 4755 patch_prober.go:28] interesting pod/router-default-5444994796-7fxgg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 00:24:58 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 17 00:24:58 crc kubenswrapper[4755]: [+]process-running ok Mar 17 00:24:58 crc kubenswrapper[4755]: healthz check failed Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.113889 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7fxgg" podUID="ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.132967 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8138043e-c956-468a-8b5d-4627d3d4a8ea-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8138043e-c956-468a-8b5d-4627d3d4a8ea\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.194071 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zttl\" (UniqueName: \"kubernetes.io/projected/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-kube-api-access-5zttl\") pod \"redhat-operators-5ctv9\" (UID: \"b20a58b5-4b64-4d7b-b9c2-c6170d75878e\") " pod="openshift-marketplace/redhat-operators-5ctv9" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.194152 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-utilities\") pod \"redhat-operators-5ctv9\" (UID: \"b20a58b5-4b64-4d7b-b9c2-c6170d75878e\") " pod="openshift-marketplace/redhat-operators-5ctv9" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.194695 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-utilities\") pod \"redhat-operators-5ctv9\" (UID: \"b20a58b5-4b64-4d7b-b9c2-c6170d75878e\") " pod="openshift-marketplace/redhat-operators-5ctv9" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.195105 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-catalog-content\") pod \"redhat-operators-5ctv9\" (UID: \"b20a58b5-4b64-4d7b-b9c2-c6170d75878e\") " pod="openshift-marketplace/redhat-operators-5ctv9" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.196964 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-catalog-content\") pod \"redhat-operators-5ctv9\" (UID: \"b20a58b5-4b64-4d7b-b9c2-c6170d75878e\") " pod="openshift-marketplace/redhat-operators-5ctv9" Mar 17 
00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.223390 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zttl\" (UniqueName: \"kubernetes.io/projected/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-kube-api-access-5zttl\") pod \"redhat-operators-5ctv9\" (UID: \"b20a58b5-4b64-4d7b-b9c2-c6170d75878e\") " pod="openshift-marketplace/redhat-operators-5ctv9" Mar 17 00:24:58 crc kubenswrapper[4755]: E0317 00:24:58.289907 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.291867 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.291912 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.292190 4755 patch_prober.go:28] interesting pod/console-f9d7485db-2fcmx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.292247 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-2fcmx" podUID="ddaf8b56-a560-4f69-8aa2-250b12ac4d4e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.293708 4755 generic.go:334] "Generic (PLEG): container finished" podID="0e74213a-9bd5-440c-b207-5218feab7323" 
containerID="17b33ef72c33630f68823cb285a635c7a409fd5e0f5bed2d183429f4dddea32f" exitCode=0 Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.293956 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frcfc" event={"ID":"0e74213a-9bd5-440c-b207-5218feab7323","Type":"ContainerDied","Data":"17b33ef72c33630f68823cb285a635c7a409fd5e0f5bed2d183429f4dddea32f"} Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.294000 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frcfc" event={"ID":"0e74213a-9bd5-440c-b207-5218feab7323","Type":"ContainerStarted","Data":"2ce7c7ce1620105542aa193e38515de933f2778cfd24c6a12eab3db9e40270be"} Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.298359 4755 generic.go:334] "Generic (PLEG): container finished" podID="3857bbbe-1aa6-43d2-94e4-15f23929ac60" containerID="20c122ee34f59df40962f1481cb183f82e73b38406278af0cc045e2a773dac91" exitCode=0 Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.298474 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvz7c" event={"ID":"3857bbbe-1aa6-43d2-94e4-15f23929ac60","Type":"ContainerDied","Data":"20c122ee34f59df40962f1481cb183f82e73b38406278af0cc045e2a773dac91"} Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.298527 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvz7c" event={"ID":"3857bbbe-1aa6-43d2-94e4-15f23929ac60","Type":"ContainerStarted","Data":"1bb668ca2abb9cd31d98831ea0eb6fdca06888ba41056fbfb60fc61109a78ec3"} Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.302428 4755 generic.go:334] "Generic (PLEG): container finished" podID="6a022b5d-835e-4615-8438-4a837c38132f" containerID="7521dd06ba85cac57b77637d83041ded77966dc58aa8f466e412439f5cfbf4b2" exitCode=0 Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.303165 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6a022b5d-835e-4615-8438-4a837c38132f","Type":"ContainerDied","Data":"7521dd06ba85cac57b77637d83041ded77966dc58aa8f466e412439f5cfbf4b2"} Mar 17 00:24:58 crc kubenswrapper[4755]: E0317 00:24:58.303225 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 17 00:24:58 crc kubenswrapper[4755]: E0317 00:24:58.305040 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 17 00:24:58 crc kubenswrapper[4755]: E0317 00:24:58.305072 4755 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" podUID="183616f4-5ea5-4c31-a465-edb5b837ca8f" containerName="kube-multus-additional-cni-plugins" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.319957 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.328251 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5ctv9" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.377924 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zxkzg"] Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.381290 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxkzg" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.384281 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zxkzg"] Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.398074 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3bffbce-4004-4c41-9711-28fa7700e14a-utilities\") pod \"redhat-operators-zxkzg\" (UID: \"e3bffbce-4004-4c41-9711-28fa7700e14a\") " pod="openshift-marketplace/redhat-operators-zxkzg" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.398223 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3bffbce-4004-4c41-9711-28fa7700e14a-catalog-content\") pod \"redhat-operators-zxkzg\" (UID: \"e3bffbce-4004-4c41-9711-28fa7700e14a\") " pod="openshift-marketplace/redhat-operators-zxkzg" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.398274 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnxsp\" (UniqueName: \"kubernetes.io/projected/e3bffbce-4004-4c41-9711-28fa7700e14a-kube-api-access-vnxsp\") pod \"redhat-operators-zxkzg\" (UID: \"e3bffbce-4004-4c41-9711-28fa7700e14a\") " pod="openshift-marketplace/redhat-operators-zxkzg" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.499364 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/e3bffbce-4004-4c41-9711-28fa7700e14a-utilities\") pod \"redhat-operators-zxkzg\" (UID: \"e3bffbce-4004-4c41-9711-28fa7700e14a\") " pod="openshift-marketplace/redhat-operators-zxkzg" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.499759 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3bffbce-4004-4c41-9711-28fa7700e14a-catalog-content\") pod \"redhat-operators-zxkzg\" (UID: \"e3bffbce-4004-4c41-9711-28fa7700e14a\") " pod="openshift-marketplace/redhat-operators-zxkzg" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.499796 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnxsp\" (UniqueName: \"kubernetes.io/projected/e3bffbce-4004-4c41-9711-28fa7700e14a-kube-api-access-vnxsp\") pod \"redhat-operators-zxkzg\" (UID: \"e3bffbce-4004-4c41-9711-28fa7700e14a\") " pod="openshift-marketplace/redhat-operators-zxkzg" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.500948 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3bffbce-4004-4c41-9711-28fa7700e14a-utilities\") pod \"redhat-operators-zxkzg\" (UID: \"e3bffbce-4004-4c41-9711-28fa7700e14a\") " pod="openshift-marketplace/redhat-operators-zxkzg" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.501167 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3bffbce-4004-4c41-9711-28fa7700e14a-catalog-content\") pod \"redhat-operators-zxkzg\" (UID: \"e3bffbce-4004-4c41-9711-28fa7700e14a\") " pod="openshift-marketplace/redhat-operators-zxkzg" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.530721 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnxsp\" (UniqueName: 
\"kubernetes.io/projected/e3bffbce-4004-4c41-9711-28fa7700e14a-kube-api-access-vnxsp\") pod \"redhat-operators-zxkzg\" (UID: \"e3bffbce-4004-4c41-9711-28fa7700e14a\") " pod="openshift-marketplace/redhat-operators-zxkzg" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.612818 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.709521 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxkzg" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.711142 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.725841 4755 ???:1] "http: TLS handshake error from 192.168.126.11:53714: no serving certificate available for the kubelet" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.809724 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a022b5d-835e-4615-8438-4a837c38132f-kube-api-access\") pod \"6a022b5d-835e-4615-8438-4a837c38132f\" (UID: \"6a022b5d-835e-4615-8438-4a837c38132f\") " Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.809829 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a022b5d-835e-4615-8438-4a837c38132f-kubelet-dir\") pod \"6a022b5d-835e-4615-8438-4a837c38132f\" (UID: \"6a022b5d-835e-4615-8438-4a837c38132f\") " Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.810008 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a022b5d-835e-4615-8438-4a837c38132f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6a022b5d-835e-4615-8438-4a837c38132f" (UID: 
"6a022b5d-835e-4615-8438-4a837c38132f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.810573 4755 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a022b5d-835e-4615-8438-4a837c38132f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.824729 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a022b5d-835e-4615-8438-4a837c38132f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6a022b5d-835e-4615-8438-4a837c38132f" (UID: "6a022b5d-835e-4615-8438-4a837c38132f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.877697 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5ctv9"] Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.912832 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a022b5d-835e-4615-8438-4a837c38132f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 00:24:58 crc kubenswrapper[4755]: I0317 00:24:58.949346 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 17 00:24:59 crc kubenswrapper[4755]: I0317 00:24:59.114385 4755 patch_prober.go:28] interesting pod/router-default-5444994796-7fxgg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 00:24:59 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 17 00:24:59 crc kubenswrapper[4755]: [+]process-running ok Mar 17 00:24:59 crc kubenswrapper[4755]: healthz check failed Mar 17 00:24:59 crc 
kubenswrapper[4755]: I0317 00:24:59.114446 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7fxgg" podUID="ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 00:24:59 crc kubenswrapper[4755]: I0317 00:24:59.323543 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6a022b5d-835e-4615-8438-4a837c38132f","Type":"ContainerDied","Data":"528da09f6fd4c117915515c14d28e648f0492355062eccb5107e3df91d276533"} Mar 17 00:24:59 crc kubenswrapper[4755]: I0317 00:24:59.323612 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="528da09f6fd4c117915515c14d28e648f0492355062eccb5107e3df91d276533" Mar 17 00:24:59 crc kubenswrapper[4755]: I0317 00:24:59.323557 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 00:25:00 crc kubenswrapper[4755]: I0317 00:25:00.113147 4755 patch_prober.go:28] interesting pod/router-default-5444994796-7fxgg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 00:25:00 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 17 00:25:00 crc kubenswrapper[4755]: [+]process-running ok Mar 17 00:25:00 crc kubenswrapper[4755]: healthz check failed Mar 17 00:25:00 crc kubenswrapper[4755]: I0317 00:25:00.113202 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7fxgg" podUID="ce7b9c49-d9eb-4b27-aa3a-3bbd2323dfd8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 00:25:01 crc kubenswrapper[4755]: I0317 00:25:01.113236 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-ingress/router-default-5444994796-7fxgg" Mar 17 00:25:01 crc kubenswrapper[4755]: I0317 00:25:01.116092 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-7fxgg" Mar 17 00:25:02 crc kubenswrapper[4755]: I0317 00:25:02.156721 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:25:02 crc kubenswrapper[4755]: I0317 00:25:02.156808 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:25:02 crc kubenswrapper[4755]: I0317 00:25:02.156860 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:25:02 crc kubenswrapper[4755]: I0317 00:25:02.162740 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:25:02 crc kubenswrapper[4755]: I0317 00:25:02.165247 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:25:02 crc kubenswrapper[4755]: I0317 00:25:02.182552 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:25:02 crc kubenswrapper[4755]: I0317 00:25:02.259031 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs\") pod \"network-metrics-daemon-4v74b\" (UID: \"f7291e3d-2994-409e-972a-59394140b3ad\") " pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:25:02 crc kubenswrapper[4755]: I0317 00:25:02.259098 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:25:02 crc kubenswrapper[4755]: I0317 00:25:02.262470 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7291e3d-2994-409e-972a-59394140b3ad-metrics-certs\") pod 
\"network-metrics-daemon-4v74b\" (UID: \"f7291e3d-2994-409e-972a-59394140b3ad\") " pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:25:02 crc kubenswrapper[4755]: I0317 00:25:02.264152 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:25:02 crc kubenswrapper[4755]: I0317 00:25:02.308174 4755 ???:1] "http: TLS handshake error from 192.168.126.11:39092: no serving certificate available for the kubelet" Mar 17 00:25:02 crc kubenswrapper[4755]: I0317 00:25:02.364765 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:25:02 crc kubenswrapper[4755]: I0317 00:25:02.368930 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 00:25:02 crc kubenswrapper[4755]: I0317 00:25:02.375955 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 00:25:02 crc kubenswrapper[4755]: I0317 00:25:02.461803 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4v74b" Mar 17 00:25:03 crc kubenswrapper[4755]: I0317 00:25:03.260513 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-w595z" Mar 17 00:25:03 crc kubenswrapper[4755]: I0317 00:25:03.351714 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-4b88z_18736749-34d6-4ce5-a0ff-e8af0ca22cdc/cluster-samples-operator/0.log" Mar 17 00:25:03 crc kubenswrapper[4755]: I0317 00:25:03.351756 4755 generic.go:334] "Generic (PLEG): container finished" podID="18736749-34d6-4ce5-a0ff-e8af0ca22cdc" containerID="56d7917af71761cfa22d2729672069694bd1b03f9137406d7fd9e357290d6c55" exitCode=2 Mar 17 00:25:03 crc kubenswrapper[4755]: I0317 00:25:03.351782 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4b88z" event={"ID":"18736749-34d6-4ce5-a0ff-e8af0ca22cdc","Type":"ContainerDied","Data":"56d7917af71761cfa22d2729672069694bd1b03f9137406d7fd9e357290d6c55"} Mar 17 00:25:03 crc kubenswrapper[4755]: I0317 00:25:03.352233 4755 scope.go:117] "RemoveContainer" containerID="56d7917af71761cfa22d2729672069694bd1b03f9137406d7fd9e357290d6c55" Mar 17 00:25:05 crc kubenswrapper[4755]: I0317 00:25:05.365203 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ctv9" event={"ID":"b20a58b5-4b64-4d7b-b9c2-c6170d75878e","Type":"ContainerStarted","Data":"f3bef55f66ff690d30b0c2faf490b12cd17b5500a1467ef1c7e83d5c12e4db6d"} Mar 17 00:25:07 crc kubenswrapper[4755]: I0317 00:25:07.400981 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8138043e-c956-468a-8b5d-4627d3d4a8ea","Type":"ContainerStarted","Data":"90b4a3cb0e95ac868cb603ce5736f0e05f13b32edf287fd2e1ffee6ab1175b3a"} Mar 17 00:25:07 crc 
kubenswrapper[4755]: I0317 00:25:07.654385 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-8gjxd" Mar 17 00:25:08 crc kubenswrapper[4755]: I0317 00:25:08.248208 4755 scope.go:117] "RemoveContainer" containerID="f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1" Mar 17 00:25:08 crc kubenswrapper[4755]: E0317 00:25:08.248622 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 00:25:08 crc kubenswrapper[4755]: E0317 00:25:08.286271 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 17 00:25:08 crc kubenswrapper[4755]: E0317 00:25:08.288866 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 17 00:25:08 crc kubenswrapper[4755]: E0317 00:25:08.290054 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 17 00:25:08 crc 
kubenswrapper[4755]: E0317 00:25:08.290119 4755 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" podUID="183616f4-5ea5-4c31-a465-edb5b837ca8f" containerName="kube-multus-additional-cni-plugins" Mar 17 00:25:08 crc kubenswrapper[4755]: I0317 00:25:08.299887 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:25:08 crc kubenswrapper[4755]: I0317 00:25:08.303627 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:25:12 crc kubenswrapper[4755]: I0317 00:25:12.972078 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74dbcdff6-jcppl"] Mar 17 00:25:12 crc kubenswrapper[4755]: I0317 00:25:12.972738 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" podUID="1a5da40d-fd23-4d0b-acdc-b446e6df9b26" containerName="controller-manager" containerID="cri-o://3c00459cd82c503abc641d29d335de0d72c85b6b1fc799f099951c80aa5db620" gracePeriod=30 Mar 17 00:25:13 crc kubenswrapper[4755]: I0317 00:25:13.004630 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7"] Mar 17 00:25:13 crc kubenswrapper[4755]: I0317 00:25:13.004957 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" podUID="d4b08294-0e03-43e4-992b-2ce2de6f3b0d" containerName="route-controller-manager" containerID="cri-o://53f21d1debb2489327ecdc9f47e761cc5bc1e1811d650204b17d21716d67f19e" gracePeriod=30 Mar 17 00:25:14 crc kubenswrapper[4755]: E0317 
00:25:14.346794 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 17 00:25:14 crc kubenswrapper[4755]: E0317 00:25:14.347289 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 00:25:14 crc kubenswrapper[4755]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 17 00:25:14 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jvlsg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29561784-zznp2_openshift-infra(d46784e1-420c-4d3b-aca7-65271a898c44): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 17 00:25:14 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 17 00:25:14 crc kubenswrapper[4755]: E0317 00:25:14.348522 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" 
pod="openshift-infra/auto-csr-approver-29561784-zznp2" podUID="d46784e1-420c-4d3b-aca7-65271a898c44" Mar 17 00:25:14 crc kubenswrapper[4755]: I0317 00:25:14.439755 4755 generic.go:334] "Generic (PLEG): container finished" podID="d4b08294-0e03-43e4-992b-2ce2de6f3b0d" containerID="53f21d1debb2489327ecdc9f47e761cc5bc1e1811d650204b17d21716d67f19e" exitCode=0 Mar 17 00:25:14 crc kubenswrapper[4755]: I0317 00:25:14.439820 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" event={"ID":"d4b08294-0e03-43e4-992b-2ce2de6f3b0d","Type":"ContainerDied","Data":"53f21d1debb2489327ecdc9f47e761cc5bc1e1811d650204b17d21716d67f19e"} Mar 17 00:25:14 crc kubenswrapper[4755]: I0317 00:25:14.441467 4755 generic.go:334] "Generic (PLEG): container finished" podID="1a5da40d-fd23-4d0b-acdc-b446e6df9b26" containerID="3c00459cd82c503abc641d29d335de0d72c85b6b1fc799f099951c80aa5db620" exitCode=0 Mar 17 00:25:14 crc kubenswrapper[4755]: I0317 00:25:14.441597 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" event={"ID":"1a5da40d-fd23-4d0b-acdc-b446e6df9b26","Type":"ContainerDied","Data":"3c00459cd82c503abc641d29d335de0d72c85b6b1fc799f099951c80aa5db620"} Mar 17 00:25:14 crc kubenswrapper[4755]: E0317 00:25:14.442900 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29561784-zznp2" podUID="d46784e1-420c-4d3b-aca7-65271a898c44" Mar 17 00:25:15 crc kubenswrapper[4755]: I0317 00:25:15.068923 4755 patch_prober.go:28] interesting pod/controller-manager-74dbcdff6-jcppl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: 
connection refused" start-of-body= Mar 17 00:25:15 crc kubenswrapper[4755]: I0317 00:25:15.068975 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" podUID="1a5da40d-fd23-4d0b-acdc-b446e6df9b26" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" Mar 17 00:25:15 crc kubenswrapper[4755]: I0317 00:25:15.093177 4755 patch_prober.go:28] interesting pod/route-controller-manager-698ff8b74-z89z7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.48:8443/healthz\": dial tcp 10.217.0.48:8443: connect: connection refused" start-of-body= Mar 17 00:25:15 crc kubenswrapper[4755]: I0317 00:25:15.093232 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" podUID="d4b08294-0e03-43e4-992b-2ce2de6f3b0d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.48:8443/healthz\": dial tcp 10.217.0.48:8443: connect: connection refused" Mar 17 00:25:15 crc kubenswrapper[4755]: I0317 00:25:15.623417 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" Mar 17 00:25:17 crc kubenswrapper[4755]: E0317 00:25:17.290770 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 17 00:25:17 crc kubenswrapper[4755]: E0317 00:25:17.291185 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5s4wc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-m967j_openshift-marketplace(be55626c-4d34-4b09-83b0-897cd661216a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 17 00:25:17 crc kubenswrapper[4755]: E0317 00:25:17.292520 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-m967j" podUID="be55626c-4d34-4b09-83b0-897cd661216a" Mar 17 00:25:18 crc kubenswrapper[4755]: E0317 00:25:18.288837 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 17 00:25:18 crc kubenswrapper[4755]: E0317 00:25:18.291177 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 17 00:25:18 crc kubenswrapper[4755]: E0317 00:25:18.292803 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 17 00:25:18 crc kubenswrapper[4755]: E0317 00:25:18.292886 4755 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" podUID="183616f4-5ea5-4c31-a465-edb5b837ca8f" containerName="kube-multus-additional-cni-plugins" Mar 17 00:25:19 crc kubenswrapper[4755]: E0317 00:25:19.187194 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-m967j" podUID="be55626c-4d34-4b09-83b0-897cd661216a" Mar 17 00:25:19 crc kubenswrapper[4755]: E0317 00:25:19.255147 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 17 00:25:19 crc kubenswrapper[4755]: E0317 00:25:19.255633 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2h8bd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePoli
cy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-ct6rl_openshift-marketplace(a23b6da4-12f1-4104-93ba-dfc06d3572aa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 17 00:25:19 crc kubenswrapper[4755]: E0317 00:25:19.256807 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ct6rl" podUID="a23b6da4-12f1-4104-93ba-dfc06d3572aa" Mar 17 00:25:19 crc kubenswrapper[4755]: E0317 00:25:19.286356 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 17 00:25:19 crc kubenswrapper[4755]: E0317 00:25:19.286510 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bpd28,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-v9fxv_openshift-marketplace(1089c661-4f7b-40e2-8549-983b06c1409a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 17 00:25:19 crc kubenswrapper[4755]: E0317 00:25:19.287772 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-v9fxv" podUID="1089c661-4f7b-40e2-8549-983b06c1409a" Mar 17 00:25:19 crc 
kubenswrapper[4755]: E0317 00:25:19.302606 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 17 00:25:19 crc kubenswrapper[4755]: E0317 00:25:19.302704 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btqdp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-dwldz_openshift-marketplace(e1339920-3dec-4332-9749-ec66520252cb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 17 00:25:19 crc kubenswrapper[4755]: E0317 00:25:19.303834 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-dwldz" podUID="e1339920-3dec-4332-9749-ec66520252cb" Mar 17 00:25:20 crc kubenswrapper[4755]: E0317 00:25:20.798420 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-dwldz" podUID="e1339920-3dec-4332-9749-ec66520252cb" Mar 17 00:25:20 crc kubenswrapper[4755]: E0317 00:25:20.798712 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ct6rl" podUID="a23b6da4-12f1-4104-93ba-dfc06d3572aa" Mar 17 00:25:20 crc kubenswrapper[4755]: E0317 00:25:20.798815 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-v9fxv" podUID="1089c661-4f7b-40e2-8549-983b06c1409a" Mar 17 00:25:20 crc kubenswrapper[4755]: I0317 00:25:20.925290 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:25:20 crc kubenswrapper[4755]: I0317 00:25:20.965198 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz"] Mar 17 00:25:20 crc kubenswrapper[4755]: E0317 00:25:20.965564 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a022b5d-835e-4615-8438-4a837c38132f" containerName="pruner" Mar 17 00:25:20 crc kubenswrapper[4755]: I0317 00:25:20.965588 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a022b5d-835e-4615-8438-4a837c38132f" containerName="pruner" Mar 17 00:25:20 crc kubenswrapper[4755]: E0317 00:25:20.965610 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b08294-0e03-43e4-992b-2ce2de6f3b0d" containerName="route-controller-manager" Mar 17 00:25:20 crc kubenswrapper[4755]: I0317 00:25:20.965620 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b08294-0e03-43e4-992b-2ce2de6f3b0d" containerName="route-controller-manager" Mar 17 00:25:20 crc kubenswrapper[4755]: I0317 00:25:20.965797 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b08294-0e03-43e4-992b-2ce2de6f3b0d" containerName="route-controller-manager" Mar 17 00:25:20 crc kubenswrapper[4755]: I0317 00:25:20.965846 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a022b5d-835e-4615-8438-4a837c38132f" containerName="pruner" Mar 17 00:25:20 crc kubenswrapper[4755]: I0317 00:25:20.966366 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:20 crc kubenswrapper[4755]: I0317 00:25:20.968973 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz"] Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.051063 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-client-ca\") pod \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\" (UID: \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\") " Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.051102 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-serving-cert\") pod \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\" (UID: \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\") " Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.051169 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-config\") pod \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\" (UID: \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\") " Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.051200 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdqh7\" (UniqueName: \"kubernetes.io/projected/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-kube-api-access-hdqh7\") pod \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\" (UID: \"d4b08294-0e03-43e4-992b-2ce2de6f3b0d\") " Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.052628 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-config" (OuterVolumeSpecName: "config") pod "d4b08294-0e03-43e4-992b-2ce2de6f3b0d" (UID: 
"d4b08294-0e03-43e4-992b-2ce2de6f3b0d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.052920 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-client-ca" (OuterVolumeSpecName: "client-ca") pod "d4b08294-0e03-43e4-992b-2ce2de6f3b0d" (UID: "d4b08294-0e03-43e4-992b-2ce2de6f3b0d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.056060 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-kube-api-access-hdqh7" (OuterVolumeSpecName: "kube-api-access-hdqh7") pod "d4b08294-0e03-43e4-992b-2ce2de6f3b0d" (UID: "d4b08294-0e03-43e4-992b-2ce2de6f3b0d"). InnerVolumeSpecName "kube-api-access-hdqh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.056299 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d4b08294-0e03-43e4-992b-2ce2de6f3b0d" (UID: "d4b08294-0e03-43e4-992b-2ce2de6f3b0d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.071130 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.152061 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dfadefec-2153-48d4-bf07-56d7bd41e16c-client-ca\") pod \"route-controller-manager-6dd4f49fc8-qvlqz\" (UID: \"dfadefec-2153-48d4-bf07-56d7bd41e16c\") " pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.152101 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfadefec-2153-48d4-bf07-56d7bd41e16c-config\") pod \"route-controller-manager-6dd4f49fc8-qvlqz\" (UID: \"dfadefec-2153-48d4-bf07-56d7bd41e16c\") " pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.152122 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfadefec-2153-48d4-bf07-56d7bd41e16c-serving-cert\") pod \"route-controller-manager-6dd4f49fc8-qvlqz\" (UID: \"dfadefec-2153-48d4-bf07-56d7bd41e16c\") " pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.152202 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt7c2\" (UniqueName: \"kubernetes.io/projected/dfadefec-2153-48d4-bf07-56d7bd41e16c-kube-api-access-mt7c2\") pod \"route-controller-manager-6dd4f49fc8-qvlqz\" (UID: \"dfadefec-2153-48d4-bf07-56d7bd41e16c\") " pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.152250 4755 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.152261 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.152270 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdqh7\" (UniqueName: \"kubernetes.io/projected/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-kube-api-access-hdqh7\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.152279 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4b08294-0e03-43e4-992b-2ce2de6f3b0d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.248837 4755 scope.go:117] "RemoveContainer" containerID="f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.252475 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jzqv\" (UniqueName: \"kubernetes.io/projected/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-kube-api-access-4jzqv\") pod \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.252543 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-client-ca\") pod \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.252605 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-serving-cert\") pod \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.261256 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-client-ca" (OuterVolumeSpecName: "client-ca") pod "1a5da40d-fd23-4d0b-acdc-b446e6df9b26" (UID: "1a5da40d-fd23-4d0b-acdc-b446e6df9b26"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.261540 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-proxy-ca-bundles\") pod \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.261604 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-config\") pod \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\" (UID: \"1a5da40d-fd23-4d0b-acdc-b446e6df9b26\") " Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.261758 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfadefec-2153-48d4-bf07-56d7bd41e16c-config\") pod \"route-controller-manager-6dd4f49fc8-qvlqz\" (UID: \"dfadefec-2153-48d4-bf07-56d7bd41e16c\") " pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.261782 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/dfadefec-2153-48d4-bf07-56d7bd41e16c-serving-cert\") pod \"route-controller-manager-6dd4f49fc8-qvlqz\" (UID: \"dfadefec-2153-48d4-bf07-56d7bd41e16c\") " pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.261839 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt7c2\" (UniqueName: \"kubernetes.io/projected/dfadefec-2153-48d4-bf07-56d7bd41e16c-kube-api-access-mt7c2\") pod \"route-controller-manager-6dd4f49fc8-qvlqz\" (UID: \"dfadefec-2153-48d4-bf07-56d7bd41e16c\") " pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.261910 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dfadefec-2153-48d4-bf07-56d7bd41e16c-client-ca\") pod \"route-controller-manager-6dd4f49fc8-qvlqz\" (UID: \"dfadefec-2153-48d4-bf07-56d7bd41e16c\") " pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.261952 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.262295 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-config" (OuterVolumeSpecName: "config") pod "1a5da40d-fd23-4d0b-acdc-b446e6df9b26" (UID: "1a5da40d-fd23-4d0b-acdc-b446e6df9b26"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.262860 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dfadefec-2153-48d4-bf07-56d7bd41e16c-client-ca\") pod \"route-controller-manager-6dd4f49fc8-qvlqz\" (UID: \"dfadefec-2153-48d4-bf07-56d7bd41e16c\") " pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.262930 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1a5da40d-fd23-4d0b-acdc-b446e6df9b26" (UID: "1a5da40d-fd23-4d0b-acdc-b446e6df9b26"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.263174 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfadefec-2153-48d4-bf07-56d7bd41e16c-config\") pod \"route-controller-manager-6dd4f49fc8-qvlqz\" (UID: \"dfadefec-2153-48d4-bf07-56d7bd41e16c\") " pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.265742 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1a5da40d-fd23-4d0b-acdc-b446e6df9b26" (UID: "1a5da40d-fd23-4d0b-acdc-b446e6df9b26"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.265820 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-kube-api-access-4jzqv" (OuterVolumeSpecName: "kube-api-access-4jzqv") pod "1a5da40d-fd23-4d0b-acdc-b446e6df9b26" (UID: "1a5da40d-fd23-4d0b-acdc-b446e6df9b26"). InnerVolumeSpecName "kube-api-access-4jzqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.266317 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfadefec-2153-48d4-bf07-56d7bd41e16c-serving-cert\") pod \"route-controller-manager-6dd4f49fc8-qvlqz\" (UID: \"dfadefec-2153-48d4-bf07-56d7bd41e16c\") " pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.282232 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt7c2\" (UniqueName: \"kubernetes.io/projected/dfadefec-2153-48d4-bf07-56d7bd41e16c-kube-api-access-mt7c2\") pod \"route-controller-manager-6dd4f49fc8-qvlqz\" (UID: \"dfadefec-2153-48d4-bf07-56d7bd41e16c\") " pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:21 crc kubenswrapper[4755]: W0317 00:25:21.294030 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-89b50b64f8b25a892b93fa412dd63291c24767e83bb548f84a9d7d76f815783b WatchSource:0}: Error finding container 89b50b64f8b25a892b93fa412dd63291c24767e83bb548f84a9d7d76f815783b: Status 404 returned error can't find the container with id 89b50b64f8b25a892b93fa412dd63291c24767e83bb548f84a9d7d76f815783b Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.343370 4755 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4v74b"] Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.361640 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zxkzg"] Mar 17 00:25:21 crc kubenswrapper[4755]: W0317 00:25:21.362994 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-217cd2dd60821fa1a00d200d9c8a1e6281b3b045fc83099fd165e5c918a358f1 WatchSource:0}: Error finding container 217cd2dd60821fa1a00d200d9c8a1e6281b3b045fc83099fd165e5c918a358f1: Status 404 returned error can't find the container with id 217cd2dd60821fa1a00d200d9c8a1e6281b3b045fc83099fd165e5c918a358f1 Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.363546 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.363566 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.363575 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.363584 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jzqv\" (UniqueName: \"kubernetes.io/projected/1a5da40d-fd23-4d0b-acdc-b446e6df9b26-kube-api-access-4jzqv\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:21 crc kubenswrapper[4755]: W0317 00:25:21.405734 4755 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-5b9642275c01ee042ac4fbec5a36199fcacda174e4144ead0b216b400c300bd6 WatchSource:0}: Error finding container 5b9642275c01ee042ac4fbec5a36199fcacda174e4144ead0b216b400c300bd6: Status 404 returned error can't find the container with id 5b9642275c01ee042ac4fbec5a36199fcacda174e4144ead0b216b400c300bd6 Mar 17 00:25:21 crc kubenswrapper[4755]: W0317 00:25:21.407174 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3bffbce_4004_4c41_9711_28fa7700e14a.slice/crio-62e35dd821f493aa17787d49afdaa279cbbc4f05306aea1edfca40ba3a436440 WatchSource:0}: Error finding container 62e35dd821f493aa17787d49afdaa279cbbc4f05306aea1edfca40ba3a436440: Status 404 returned error can't find the container with id 62e35dd821f493aa17787d49afdaa279cbbc4f05306aea1edfca40ba3a436440 Mar 17 00:25:21 crc kubenswrapper[4755]: W0317 00:25:21.411217 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7291e3d_2994_409e_972a_59394140b3ad.slice/crio-693a9fa8222c1f0fa195ff3c13fe238d3760f656652658b47b9812c98df02975 WatchSource:0}: Error finding container 693a9fa8222c1f0fa195ff3c13fe238d3760f656652658b47b9812c98df02975: Status 404 returned error can't find the container with id 693a9fa8222c1f0fa195ff3c13fe238d3760f656652658b47b9812c98df02975 Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.421044 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.480704 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxkzg" event={"ID":"e3bffbce-4004-4c41-9711-28fa7700e14a","Type":"ContainerStarted","Data":"62e35dd821f493aa17787d49afdaa279cbbc4f05306aea1edfca40ba3a436440"} Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.483800 4755 generic.go:334] "Generic (PLEG): container finished" podID="0e74213a-9bd5-440c-b207-5218feab7323" containerID="c0dc613d158f33a95413d8894c2fc15cf7bc05df1bf6c0c2d2852c6ef28f4956" exitCode=0 Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.483861 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frcfc" event={"ID":"0e74213a-9bd5-440c-b207-5218feab7323","Type":"ContainerDied","Data":"c0dc613d158f33a95413d8894c2fc15cf7bc05df1bf6c0c2d2852c6ef28f4956"} Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.491404 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-4b88z_18736749-34d6-4ce5-a0ff-e8af0ca22cdc/cluster-samples-operator/0.log" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.491622 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4b88z" event={"ID":"18736749-34d6-4ce5-a0ff-e8af0ca22cdc","Type":"ContainerStarted","Data":"b485ec1671d9bb16eacdf1229bc3f8cfddc6d9ea3cba89ac7143d421226e6567"} Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.494056 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.494058 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74dbcdff6-jcppl" event={"ID":"1a5da40d-fd23-4d0b-acdc-b446e6df9b26","Type":"ContainerDied","Data":"85557773208bb56aaa542d8260c07b511e8b9e31bd7c29b5d76af214f7d58eab"} Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.494108 4755 scope.go:117] "RemoveContainer" containerID="3c00459cd82c503abc641d29d335de0d72c85b6b1fc799f099951c80aa5db620" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.515688 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5b9642275c01ee042ac4fbec5a36199fcacda174e4144ead0b216b400c300bd6"} Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.530055 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8138043e-c956-468a-8b5d-4627d3d4a8ea","Type":"ContainerStarted","Data":"2017f5e3c9934bf33cdc987265dc32a56fd92de2b67219a07a4c7a6e4e8e34f4"} Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.534041 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4v74b" event={"ID":"f7291e3d-2994-409e-972a-59394140b3ad","Type":"ContainerStarted","Data":"693a9fa8222c1f0fa195ff3c13fe238d3760f656652658b47b9812c98df02975"} Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.536894 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74dbcdff6-jcppl"] Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.540090 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-74dbcdff6-jcppl"] Mar 17 
00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.549092 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6d62af328c72a6d33c453871137918b876559575e2b5e7b095d664b21b051e1a"} Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.549127 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"89b50b64f8b25a892b93fa412dd63291c24767e83bb548f84a9d7d76f815783b"} Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.549572 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=24.54955402 podStartE2EDuration="24.54955402s" podCreationTimestamp="2026-03-17 00:24:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:25:21.548839262 +0000 UTC m=+196.308291545" watchObservedRunningTime="2026-03-17 00:25:21.54955402 +0000 UTC m=+196.309006303" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.550862 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.552053 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7" event={"ID":"d4b08294-0e03-43e4-992b-2ce2de6f3b0d","Type":"ContainerDied","Data":"a243de40e56b36db383b3edd0dbd4777551621eb10f23fe5a70c865dc58f6f28"} Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.552104 4755 scope.go:117] "RemoveContainer" containerID="53f21d1debb2489327ecdc9f47e761cc5bc1e1811d650204b17d21716d67f19e" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.558723 4755 generic.go:334] "Generic (PLEG): container finished" podID="3857bbbe-1aa6-43d2-94e4-15f23929ac60" containerID="4b880bf96bf5e87a502b33e9f046cf6b3d8aea0bef723df67f87e99ac06a96d2" exitCode=0 Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.558804 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvz7c" event={"ID":"3857bbbe-1aa6-43d2-94e4-15f23929ac60","Type":"ContainerDied","Data":"4b880bf96bf5e87a502b33e9f046cf6b3d8aea0bef723df67f87e99ac06a96d2"} Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.582133 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d943f92592c6829829709eba0fa862a7179af44a1dff278b224cb5ae829cc6d2"} Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.582460 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"217cd2dd60821fa1a00d200d9c8a1e6281b3b045fc83099fd165e5c918a358f1"} Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.582756 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.590983 4755 generic.go:334] "Generic (PLEG): container finished" podID="b20a58b5-4b64-4d7b-b9c2-c6170d75878e" containerID="586b4322426303b975a460c465e5ad0d203496e315f8a0e1e153341f9bf2d6c5" exitCode=0 Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.591017 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ctv9" event={"ID":"b20a58b5-4b64-4d7b-b9c2-c6170d75878e","Type":"ContainerDied","Data":"586b4322426303b975a460c465e5ad0d203496e315f8a0e1e153341f9bf2d6c5"} Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.628375 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7"] Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.654962 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698ff8b74-z89z7"] Mar 17 00:25:21 crc kubenswrapper[4755]: I0317 00:25:21.704160 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz"] Mar 17 00:25:21 crc kubenswrapper[4755]: W0317 00:25:21.766980 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfadefec_2153_48d4_bf07_56d7bd41e16c.slice/crio-326ca162c59ec29c95ae424de699590dabfb9bc495182b72c645bb0c64da663e WatchSource:0}: Error finding container 326ca162c59ec29c95ae424de699590dabfb9bc495182b72c645bb0c64da663e: Status 404 returned error can't find the container with id 326ca162c59ec29c95ae424de699590dabfb9bc495182b72c645bb0c64da663e Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.255841 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a5da40d-fd23-4d0b-acdc-b446e6df9b26" 
path="/var/lib/kubelet/pods/1a5da40d-fd23-4d0b-acdc-b446e6df9b26/volumes" Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.257042 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b08294-0e03-43e4-992b-2ce2de6f3b0d" path="/var/lib/kubelet/pods/d4b08294-0e03-43e4-992b-2ce2de6f3b0d/volumes" Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.600870 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2f20c155abb6e51ff1b0eee68dfa2e9bebe24e2684696c1015b72097a23f7303"} Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.607749 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4v74b" event={"ID":"f7291e3d-2994-409e-972a-59394140b3ad","Type":"ContainerStarted","Data":"a02135a4a870bfd4517b281304e44b29e70f719052eb84eaa9eaf710cebd8cc5"} Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.607790 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4v74b" event={"ID":"f7291e3d-2994-409e-972a-59394140b3ad","Type":"ContainerStarted","Data":"0bf07e4ac5dd60df11ce5803c35978b34e51a7453077eb56220953bff361984a"} Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.613273 4755 generic.go:334] "Generic (PLEG): container finished" podID="e3bffbce-4004-4c41-9711-28fa7700e14a" containerID="571237d5b40d6643bbedf59fa6b8e627fe34a6110439b2b59b48569b70e69cbe" exitCode=0 Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.613812 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxkzg" event={"ID":"e3bffbce-4004-4c41-9711-28fa7700e14a","Type":"ContainerDied","Data":"571237d5b40d6643bbedf59fa6b8e627fe34a6110439b2b59b48569b70e69cbe"} Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.643507 4755 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-frcfc" event={"ID":"0e74213a-9bd5-440c-b207-5218feab7323","Type":"ContainerStarted","Data":"190bdbda2c6fb5e0257bdb265cac5bff9827945807b377e0322f901ced61f4c5"} Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.651269 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.657861 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4f70d40b409b557ad33856454d01508d4fc1c0a77374c39afdd2ad38aca0a53a"} Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.658564 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.659342 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4v74b" podStartSLOduration=136.659326788 podStartE2EDuration="2m16.659326788s" podCreationTimestamp="2026-03-17 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:25:22.658664 +0000 UTC m=+197.418116283" watchObservedRunningTime="2026-03-17 00:25:22.659326788 +0000 UTC m=+197.418779071" Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.666410 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvz7c" event={"ID":"3857bbbe-1aa6-43d2-94e4-15f23929ac60","Type":"ContainerStarted","Data":"7bbca8ca31b4a7ab536f66c60d337ab09b90555414f028a8917e891380eca21d"} Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.671238 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="8138043e-c956-468a-8b5d-4627d3d4a8ea" containerID="2017f5e3c9934bf33cdc987265dc32a56fd92de2b67219a07a4c7a6e4e8e34f4" exitCode=0 Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.671281 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8138043e-c956-468a-8b5d-4627d3d4a8ea","Type":"ContainerDied","Data":"2017f5e3c9934bf33cdc987265dc32a56fd92de2b67219a07a4c7a6e4e8e34f4"} Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.672479 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" event={"ID":"dfadefec-2153-48d4-bf07-56d7bd41e16c","Type":"ContainerStarted","Data":"001f82566432aafbb66eea4ba4e5f0c3c387342a1cb192cc2384b7868a118d62"} Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.672505 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" event={"ID":"dfadefec-2153-48d4-bf07-56d7bd41e16c","Type":"ContainerStarted","Data":"326ca162c59ec29c95ae424de699590dabfb9bc495182b72c645bb0c64da663e"} Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.672840 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.678004 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.711320 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=83.711302981 podStartE2EDuration="1m23.711302981s" podCreationTimestamp="2026-03-17 00:23:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:25:22.709214695 +0000 UTC m=+197.468666978" watchObservedRunningTime="2026-03-17 00:25:22.711302981 +0000 UTC m=+197.470755264" Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.712412 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-frcfc" podStartSLOduration=3.06517015 podStartE2EDuration="26.71240555s" podCreationTimestamp="2026-03-17 00:24:56 +0000 UTC" firstStartedPulling="2026-03-17 00:24:58.294800676 +0000 UTC m=+173.054252969" lastFinishedPulling="2026-03-17 00:25:21.942036086 +0000 UTC m=+196.701488369" observedRunningTime="2026-03-17 00:25:22.678264464 +0000 UTC m=+197.437716747" watchObservedRunningTime="2026-03-17 00:25:22.71240555 +0000 UTC m=+197.471857833" Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.772688 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pvz7c" podStartSLOduration=2.966799026 podStartE2EDuration="26.77266957s" podCreationTimestamp="2026-03-17 00:24:56 +0000 UTC" firstStartedPulling="2026-03-17 00:24:58.302417418 +0000 UTC m=+173.061869701" lastFinishedPulling="2026-03-17 00:25:22.108287962 +0000 UTC m=+196.867740245" observedRunningTime="2026-03-17 00:25:22.77189524 +0000 UTC m=+197.531347523" watchObservedRunningTime="2026-03-17 00:25:22.77266957 +0000 UTC m=+197.532121853" Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.774244 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" podStartSLOduration=9.774237221 podStartE2EDuration="9.774237221s" podCreationTimestamp="2026-03-17 00:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:25:22.732753113 +0000 UTC m=+197.492205386" watchObservedRunningTime="2026-03-17 
00:25:22.774237221 +0000 UTC m=+197.533689504" Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.821056 4755 ???:1] "http: TLS handshake error from 192.168.126.11:44360: no serving certificate available for the kubelet" Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.943579 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-574nd_183616f4-5ea5-4c31-a465-edb5b837ca8f/kube-multus-additional-cni-plugins/0.log" Mar 17 00:25:22 crc kubenswrapper[4755]: I0317 00:25:22.944177 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.093003 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/183616f4-5ea5-4c31-a465-edb5b837ca8f-cni-sysctl-allowlist\") pod \"183616f4-5ea5-4c31-a465-edb5b837ca8f\" (UID: \"183616f4-5ea5-4c31-a465-edb5b837ca8f\") " Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.093083 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h684j\" (UniqueName: \"kubernetes.io/projected/183616f4-5ea5-4c31-a465-edb5b837ca8f-kube-api-access-h684j\") pod \"183616f4-5ea5-4c31-a465-edb5b837ca8f\" (UID: \"183616f4-5ea5-4c31-a465-edb5b837ca8f\") " Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.093110 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/183616f4-5ea5-4c31-a465-edb5b837ca8f-tuning-conf-dir\") pod \"183616f4-5ea5-4c31-a465-edb5b837ca8f\" (UID: \"183616f4-5ea5-4c31-a465-edb5b837ca8f\") " Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.093198 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: 
\"kubernetes.io/empty-dir/183616f4-5ea5-4c31-a465-edb5b837ca8f-ready\") pod \"183616f4-5ea5-4c31-a465-edb5b837ca8f\" (UID: \"183616f4-5ea5-4c31-a465-edb5b837ca8f\") " Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.093458 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/183616f4-5ea5-4c31-a465-edb5b837ca8f-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "183616f4-5ea5-4c31-a465-edb5b837ca8f" (UID: "183616f4-5ea5-4c31-a465-edb5b837ca8f"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.093650 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/183616f4-5ea5-4c31-a465-edb5b837ca8f-ready" (OuterVolumeSpecName: "ready") pod "183616f4-5ea5-4c31-a465-edb5b837ca8f" (UID: "183616f4-5ea5-4c31-a465-edb5b837ca8f"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.093760 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/183616f4-5ea5-4c31-a465-edb5b837ca8f-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "183616f4-5ea5-4c31-a465-edb5b837ca8f" (UID: "183616f4-5ea5-4c31-a465-edb5b837ca8f"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.102975 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/183616f4-5ea5-4c31-a465-edb5b837ca8f-kube-api-access-h684j" (OuterVolumeSpecName: "kube-api-access-h684j") pod "183616f4-5ea5-4c31-a465-edb5b837ca8f" (UID: "183616f4-5ea5-4c31-a465-edb5b837ca8f"). InnerVolumeSpecName "kube-api-access-h684j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.194186 4755 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/183616f4-5ea5-4c31-a465-edb5b837ca8f-ready\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.194222 4755 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/183616f4-5ea5-4c31-a465-edb5b837ca8f-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.194234 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h684j\" (UniqueName: \"kubernetes.io/projected/183616f4-5ea5-4c31-a465-edb5b837ca8f-kube-api-access-h684j\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.194244 4755 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/183616f4-5ea5-4c31-a465-edb5b837ca8f-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.694790 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-574nd_183616f4-5ea5-4c31-a465-edb5b837ca8f/kube-multus-additional-cni-plugins/0.log" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.695188 4755 generic.go:334] "Generic (PLEG): container finished" podID="183616f4-5ea5-4c31-a465-edb5b837ca8f" containerID="5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2" exitCode=137 Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.695560 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.695601 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" event={"ID":"183616f4-5ea5-4c31-a465-edb5b837ca8f","Type":"ContainerDied","Data":"5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2"} Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.696105 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-574nd" event={"ID":"183616f4-5ea5-4c31-a465-edb5b837ca8f","Type":"ContainerDied","Data":"fb0f068cc092d41a711feef64d91c822117bd08edd1488302303a6398a6a78ac"} Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.696132 4755 scope.go:117] "RemoveContainer" containerID="5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.719835 4755 scope.go:117] "RemoveContainer" containerID="5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2" Mar 17 00:25:23 crc kubenswrapper[4755]: E0317 00:25:23.722822 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2\": container with ID starting with 5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2 not found: ID does not exist" containerID="5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.722894 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2"} err="failed to get container status \"5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2\": rpc error: code = NotFound desc = could not find container 
\"5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2\": container with ID starting with 5aa76cf18712f53da1d723b539a77d94f7607fe6a088fcc1f55f037c3baff3b2 not found: ID does not exist" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.725097 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-574nd"] Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.729497 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-574nd"] Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.758459 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-666d9fb6d8-79xg9"] Mar 17 00:25:23 crc kubenswrapper[4755]: E0317 00:25:23.758986 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a5da40d-fd23-4d0b-acdc-b446e6df9b26" containerName="controller-manager" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.759004 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a5da40d-fd23-4d0b-acdc-b446e6df9b26" containerName="controller-manager" Mar 17 00:25:23 crc kubenswrapper[4755]: E0317 00:25:23.759022 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183616f4-5ea5-4c31-a465-edb5b837ca8f" containerName="kube-multus-additional-cni-plugins" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.759029 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="183616f4-5ea5-4c31-a465-edb5b837ca8f" containerName="kube-multus-additional-cni-plugins" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.759233 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a5da40d-fd23-4d0b-acdc-b446e6df9b26" containerName="controller-manager" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.759251 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="183616f4-5ea5-4c31-a465-edb5b837ca8f" containerName="kube-multus-additional-cni-plugins" Mar 
17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.759834 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.763105 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-666d9fb6d8-79xg9"] Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.765178 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.765451 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.766455 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.766702 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.768581 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.782048 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.783978 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.902291 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-client-ca\") pod 
\"controller-manager-666d9fb6d8-79xg9\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.902731 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-serving-cert\") pod \"controller-manager-666d9fb6d8-79xg9\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.902759 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-proxy-ca-bundles\") pod \"controller-manager-666d9fb6d8-79xg9\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.902807 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqfdh\" (UniqueName: \"kubernetes.io/projected/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-kube-api-access-xqfdh\") pod \"controller-manager-666d9fb6d8-79xg9\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.902909 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-config\") pod \"controller-manager-666d9fb6d8-79xg9\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:23 crc kubenswrapper[4755]: I0317 00:25:23.940871 4755 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.007948 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqfdh\" (UniqueName: \"kubernetes.io/projected/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-kube-api-access-xqfdh\") pod \"controller-manager-666d9fb6d8-79xg9\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.008003 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-config\") pod \"controller-manager-666d9fb6d8-79xg9\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.008033 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-client-ca\") pod \"controller-manager-666d9fb6d8-79xg9\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.008098 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-serving-cert\") pod \"controller-manager-666d9fb6d8-79xg9\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.008115 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-proxy-ca-bundles\") pod \"controller-manager-666d9fb6d8-79xg9\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.009216 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-proxy-ca-bundles\") pod \"controller-manager-666d9fb6d8-79xg9\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.009360 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-config\") pod \"controller-manager-666d9fb6d8-79xg9\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.009740 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-client-ca\") pod \"controller-manager-666d9fb6d8-79xg9\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.022498 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-serving-cert\") pod \"controller-manager-666d9fb6d8-79xg9\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.024741 4755 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-xqfdh\" (UniqueName: \"kubernetes.io/projected/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-kube-api-access-xqfdh\") pod \"controller-manager-666d9fb6d8-79xg9\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.087017 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.108660 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8138043e-c956-468a-8b5d-4627d3d4a8ea-kube-api-access\") pod \"8138043e-c956-468a-8b5d-4627d3d4a8ea\" (UID: \"8138043e-c956-468a-8b5d-4627d3d4a8ea\") " Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.108809 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8138043e-c956-468a-8b5d-4627d3d4a8ea-kubelet-dir\") pod \"8138043e-c956-468a-8b5d-4627d3d4a8ea\" (UID: \"8138043e-c956-468a-8b5d-4627d3d4a8ea\") " Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.108888 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8138043e-c956-468a-8b5d-4627d3d4a8ea-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8138043e-c956-468a-8b5d-4627d3d4a8ea" (UID: "8138043e-c956-468a-8b5d-4627d3d4a8ea"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.109057 4755 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8138043e-c956-468a-8b5d-4627d3d4a8ea-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.113296 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8138043e-c956-468a-8b5d-4627d3d4a8ea-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8138043e-c956-468a-8b5d-4627d3d4a8ea" (UID: "8138043e-c956-468a-8b5d-4627d3d4a8ea"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.209698 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8138043e-c956-468a-8b5d-4627d3d4a8ea-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.280881 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="183616f4-5ea5-4c31-a465-edb5b837ca8f" path="/var/lib/kubelet/pods/183616f4-5ea5-4c31-a465-edb5b837ca8f/volumes" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.361396 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-666d9fb6d8-79xg9"] Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.706940 4755 generic.go:334] "Generic (PLEG): container finished" podID="92c4ceac-e07e-407f-a2d7-5202cc06c29d" containerID="fd0a79ec53683bc6a3105fdcc8c96ab6436047e3c0eb29b421cc78256d3cb84d" exitCode=0 Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.707185 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29561760-zz9vv" 
event={"ID":"92c4ceac-e07e-407f-a2d7-5202cc06c29d","Type":"ContainerDied","Data":"fd0a79ec53683bc6a3105fdcc8c96ab6436047e3c0eb29b421cc78256d3cb84d"} Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.713098 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.713249 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8138043e-c956-468a-8b5d-4627d3d4a8ea","Type":"ContainerDied","Data":"90b4a3cb0e95ac868cb603ce5736f0e05f13b32edf287fd2e1ffee6ab1175b3a"} Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.713288 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90b4a3cb0e95ac868cb603ce5736f0e05f13b32edf287fd2e1ffee6ab1175b3a" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.714605 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" event={"ID":"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce","Type":"ContainerStarted","Data":"a16894f0639e30d3ae4ef3b0bcefc1e265d37caf534598cd04619ab89ce3e5fb"} Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.714639 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" event={"ID":"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce","Type":"ContainerStarted","Data":"794c1d16a61be7f0a68418867b2f8e6405c864d1299e6148d73facdf5e9ab0ba"} Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.714657 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.738530 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 
17 00:25:24 crc kubenswrapper[4755]: I0317 00:25:24.742784 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" podStartSLOduration=12.742765212 podStartE2EDuration="12.742765212s" podCreationTimestamp="2026-03-17 00:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:25:24.741677273 +0000 UTC m=+199.501129566" watchObservedRunningTime="2026-03-17 00:25:24.742765212 +0000 UTC m=+199.502217495" Mar 17 00:25:25 crc kubenswrapper[4755]: I0317 00:25:25.987423 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29561760-zz9vv" Mar 17 00:25:26 crc kubenswrapper[4755]: I0317 00:25:26.133293 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/92c4ceac-e07e-407f-a2d7-5202cc06c29d-serviceca\") pod \"92c4ceac-e07e-407f-a2d7-5202cc06c29d\" (UID: \"92c4ceac-e07e-407f-a2d7-5202cc06c29d\") " Mar 17 00:25:26 crc kubenswrapper[4755]: I0317 00:25:26.133408 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8w7m\" (UniqueName: \"kubernetes.io/projected/92c4ceac-e07e-407f-a2d7-5202cc06c29d-kube-api-access-v8w7m\") pod \"92c4ceac-e07e-407f-a2d7-5202cc06c29d\" (UID: \"92c4ceac-e07e-407f-a2d7-5202cc06c29d\") " Mar 17 00:25:26 crc kubenswrapper[4755]: I0317 00:25:26.134228 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92c4ceac-e07e-407f-a2d7-5202cc06c29d-serviceca" (OuterVolumeSpecName: "serviceca") pod "92c4ceac-e07e-407f-a2d7-5202cc06c29d" (UID: "92c4ceac-e07e-407f-a2d7-5202cc06c29d"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:25:26 crc kubenswrapper[4755]: I0317 00:25:26.140320 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92c4ceac-e07e-407f-a2d7-5202cc06c29d-kube-api-access-v8w7m" (OuterVolumeSpecName: "kube-api-access-v8w7m") pod "92c4ceac-e07e-407f-a2d7-5202cc06c29d" (UID: "92c4ceac-e07e-407f-a2d7-5202cc06c29d"). InnerVolumeSpecName "kube-api-access-v8w7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:25:26 crc kubenswrapper[4755]: I0317 00:25:26.234486 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8w7m\" (UniqueName: \"kubernetes.io/projected/92c4ceac-e07e-407f-a2d7-5202cc06c29d-kube-api-access-v8w7m\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:26 crc kubenswrapper[4755]: I0317 00:25:26.234518 4755 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/92c4ceac-e07e-407f-a2d7-5202cc06c29d-serviceca\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:26 crc kubenswrapper[4755]: I0317 00:25:26.729991 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29561760-zz9vv" event={"ID":"92c4ceac-e07e-407f-a2d7-5202cc06c29d","Type":"ContainerDied","Data":"e610cfcb3730f35864289a871a4f0faab6367ece948cc7ec845182fd6fa7bfdf"} Mar 17 00:25:26 crc kubenswrapper[4755]: I0317 00:25:26.730080 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e610cfcb3730f35864289a871a4f0faab6367ece948cc7ec845182fd6fa7bfdf" Mar 17 00:25:26 crc kubenswrapper[4755]: I0317 00:25:26.730161 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29561760-zz9vv" Mar 17 00:25:26 crc kubenswrapper[4755]: I0317 00:25:26.913604 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pvz7c" Mar 17 00:25:26 crc kubenswrapper[4755]: I0317 00:25:26.913732 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pvz7c" Mar 17 00:25:27 crc kubenswrapper[4755]: I0317 00:25:27.273363 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pvz7c" Mar 17 00:25:27 crc kubenswrapper[4755]: I0317 00:25:27.312340 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-frcfc" Mar 17 00:25:27 crc kubenswrapper[4755]: I0317 00:25:27.312485 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-frcfc" Mar 17 00:25:27 crc kubenswrapper[4755]: I0317 00:25:27.356968 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-frcfc" Mar 17 00:25:27 crc kubenswrapper[4755]: I0317 00:25:27.777320 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pvz7c" Mar 17 00:25:27 crc kubenswrapper[4755]: I0317 00:25:27.788983 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-frcfc" Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.219866 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m6qr7" Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.646668 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 17 
00:25:28 crc kubenswrapper[4755]: E0317 00:25:28.647151 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8138043e-c956-468a-8b5d-4627d3d4a8ea" containerName="pruner" Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.647162 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8138043e-c956-468a-8b5d-4627d3d4a8ea" containerName="pruner" Mar 17 00:25:28 crc kubenswrapper[4755]: E0317 00:25:28.647173 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92c4ceac-e07e-407f-a2d7-5202cc06c29d" containerName="image-pruner" Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.647178 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="92c4ceac-e07e-407f-a2d7-5202cc06c29d" containerName="image-pruner" Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.647269 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="92c4ceac-e07e-407f-a2d7-5202cc06c29d" containerName="image-pruner" Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.647279 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8138043e-c956-468a-8b5d-4627d3d4a8ea" containerName="pruner" Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.647706 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.649531 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.649675 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.649903 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.673179 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31a6b715-d2ed-46ed-8375-c52975cfa3a0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"31a6b715-d2ed-46ed-8375-c52975cfa3a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.673237 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31a6b715-d2ed-46ed-8375-c52975cfa3a0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"31a6b715-d2ed-46ed-8375-c52975cfa3a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.774003 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31a6b715-d2ed-46ed-8375-c52975cfa3a0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"31a6b715-d2ed-46ed-8375-c52975cfa3a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.774067 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/31a6b715-d2ed-46ed-8375-c52975cfa3a0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"31a6b715-d2ed-46ed-8375-c52975cfa3a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.774177 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31a6b715-d2ed-46ed-8375-c52975cfa3a0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"31a6b715-d2ed-46ed-8375-c52975cfa3a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.792563 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31a6b715-d2ed-46ed-8375-c52975cfa3a0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"31a6b715-d2ed-46ed-8375-c52975cfa3a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.894017 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-frcfc"] Mar 17 00:25:28 crc kubenswrapper[4755]: I0317 00:25:28.969686 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 00:25:30 crc kubenswrapper[4755]: I0317 00:25:30.752354 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-frcfc" podUID="0e74213a-9bd5-440c-b207-5218feab7323" containerName="registry-server" containerID="cri-o://190bdbda2c6fb5e0257bdb265cac5bff9827945807b377e0322f901ced61f4c5" gracePeriod=2 Mar 17 00:25:31 crc kubenswrapper[4755]: I0317 00:25:31.766762 4755 generic.go:334] "Generic (PLEG): container finished" podID="0e74213a-9bd5-440c-b207-5218feab7323" containerID="190bdbda2c6fb5e0257bdb265cac5bff9827945807b377e0322f901ced61f4c5" exitCode=0 Mar 17 00:25:31 crc kubenswrapper[4755]: I0317 00:25:31.766805 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frcfc" event={"ID":"0e74213a-9bd5-440c-b207-5218feab7323","Type":"ContainerDied","Data":"190bdbda2c6fb5e0257bdb265cac5bff9827945807b377e0322f901ced61f4c5"} Mar 17 00:25:31 crc kubenswrapper[4755]: I0317 00:25:31.905667 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frcfc" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.015821 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e74213a-9bd5-440c-b207-5218feab7323-catalog-content\") pod \"0e74213a-9bd5-440c-b207-5218feab7323\" (UID: \"0e74213a-9bd5-440c-b207-5218feab7323\") " Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.015911 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e74213a-9bd5-440c-b207-5218feab7323-utilities\") pod \"0e74213a-9bd5-440c-b207-5218feab7323\" (UID: \"0e74213a-9bd5-440c-b207-5218feab7323\") " Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.015997 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx958\" (UniqueName: \"kubernetes.io/projected/0e74213a-9bd5-440c-b207-5218feab7323-kube-api-access-fx958\") pod \"0e74213a-9bd5-440c-b207-5218feab7323\" (UID: \"0e74213a-9bd5-440c-b207-5218feab7323\") " Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.016645 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e74213a-9bd5-440c-b207-5218feab7323-utilities" (OuterVolumeSpecName: "utilities") pod "0e74213a-9bd5-440c-b207-5218feab7323" (UID: "0e74213a-9bd5-440c-b207-5218feab7323"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.021891 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e74213a-9bd5-440c-b207-5218feab7323-kube-api-access-fx958" (OuterVolumeSpecName: "kube-api-access-fx958") pod "0e74213a-9bd5-440c-b207-5218feab7323" (UID: "0e74213a-9bd5-440c-b207-5218feab7323"). InnerVolumeSpecName "kube-api-access-fx958". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.055147 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e74213a-9bd5-440c-b207-5218feab7323-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e74213a-9bd5-440c-b207-5218feab7323" (UID: "0e74213a-9bd5-440c-b207-5218feab7323"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.075076 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 17 00:25:32 crc kubenswrapper[4755]: W0317 00:25:32.084225 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod31a6b715_d2ed_46ed_8375_c52975cfa3a0.slice/crio-b17e4fb1c01ddc172485f12552ddeea7490d9fcf6f4e910f1a96251eb0488404 WatchSource:0}: Error finding container b17e4fb1c01ddc172485f12552ddeea7490d9fcf6f4e910f1a96251eb0488404: Status 404 returned error can't find the container with id b17e4fb1c01ddc172485f12552ddeea7490d9fcf6f4e910f1a96251eb0488404 Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.117105 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx958\" (UniqueName: \"kubernetes.io/projected/0e74213a-9bd5-440c-b207-5218feab7323-kube-api-access-fx958\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.117139 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e74213a-9bd5-440c-b207-5218feab7323-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.117149 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e74213a-9bd5-440c-b207-5218feab7323-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 
00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.380452 4755 csr.go:261] certificate signing request csr-cpd4l is approved, waiting to be issued Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.386342 4755 csr.go:257] certificate signing request csr-cpd4l is issued Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.435083 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 17 00:25:32 crc kubenswrapper[4755]: E0317 00:25:32.435288 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e74213a-9bd5-440c-b207-5218feab7323" containerName="registry-server" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.435299 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e74213a-9bd5-440c-b207-5218feab7323" containerName="registry-server" Mar 17 00:25:32 crc kubenswrapper[4755]: E0317 00:25:32.435309 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e74213a-9bd5-440c-b207-5218feab7323" containerName="extract-content" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.435315 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e74213a-9bd5-440c-b207-5218feab7323" containerName="extract-content" Mar 17 00:25:32 crc kubenswrapper[4755]: E0317 00:25:32.435324 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e74213a-9bd5-440c-b207-5218feab7323" containerName="extract-utilities" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.435331 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e74213a-9bd5-440c-b207-5218feab7323" containerName="extract-utilities" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.435422 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e74213a-9bd5-440c-b207-5218feab7323" containerName="registry-server" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.435795 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.442655 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.624351 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.624502 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-var-lock\") pod \"installer-9-crc\" (UID: \"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.624545 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-kube-api-access\") pod \"installer-9-crc\" (UID: \"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.725972 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-var-lock\") pod \"installer-9-crc\" (UID: \"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.726031 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-kube-api-access\") pod \"installer-9-crc\" (UID: \"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.726102 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.726107 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-var-lock\") pod \"installer-9-crc\" (UID: \"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.726191 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.743429 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-kube-api-access\") pod \"installer-9-crc\" (UID: \"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.771354 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.773946 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"31a6b715-d2ed-46ed-8375-c52975cfa3a0","Type":"ContainerStarted","Data":"13aa2fa88657645d235d6014b8026441b71e709516bc6d9c725f45360c62a02a"} Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.773989 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"31a6b715-d2ed-46ed-8375-c52975cfa3a0","Type":"ContainerStarted","Data":"b17e4fb1c01ddc172485f12552ddeea7490d9fcf6f4e910f1a96251eb0488404"} Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.775977 4755 generic.go:334] "Generic (PLEG): container finished" podID="e3bffbce-4004-4c41-9711-28fa7700e14a" containerID="286633d4989d2a5f213f668a60a949139bbe54d2ce642ae3350ffd2145edb2c9" exitCode=0 Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.776031 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxkzg" event={"ID":"e3bffbce-4004-4c41-9711-28fa7700e14a","Type":"ContainerDied","Data":"286633d4989d2a5f213f668a60a949139bbe54d2ce642ae3350ffd2145edb2c9"} Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.777714 4755 generic.go:334] "Generic (PLEG): container finished" podID="d46784e1-420c-4d3b-aca7-65271a898c44" containerID="dfa84351c1713b5382135a808f7dcca826e88836f4d77efd458030120a543c18" exitCode=0 Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.777783 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561784-zznp2" event={"ID":"d46784e1-420c-4d3b-aca7-65271a898c44","Type":"ContainerDied","Data":"dfa84351c1713b5382135a808f7dcca826e88836f4d77efd458030120a543c18"} Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.780149 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frcfc" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.780152 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frcfc" event={"ID":"0e74213a-9bd5-440c-b207-5218feab7323","Type":"ContainerDied","Data":"2ce7c7ce1620105542aa193e38515de933f2778cfd24c6a12eab3db9e40270be"} Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.780197 4755 scope.go:117] "RemoveContainer" containerID="190bdbda2c6fb5e0257bdb265cac5bff9827945807b377e0322f901ced61f4c5" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.781729 4755 generic.go:334] "Generic (PLEG): container finished" podID="b20a58b5-4b64-4d7b-b9c2-c6170d75878e" containerID="45c734a3e9740baa167d15e6ec64b6263a728cfff8ba5a75c391206b257fd2ea" exitCode=0 Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.781758 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ctv9" event={"ID":"b20a58b5-4b64-4d7b-b9c2-c6170d75878e","Type":"ContainerDied","Data":"45c734a3e9740baa167d15e6ec64b6263a728cfff8ba5a75c391206b257fd2ea"} Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.793119 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=4.793103645 podStartE2EDuration="4.793103645s" podCreationTimestamp="2026-03-17 00:25:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:25:32.791347749 +0000 UTC m=+207.550800032" watchObservedRunningTime="2026-03-17 00:25:32.793103645 +0000 UTC m=+207.552555928" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.848739 4755 scope.go:117] "RemoveContainer" containerID="c0dc613d158f33a95413d8894c2fc15cf7bc05df1bf6c0c2d2852c6ef28f4956" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.859951 4755 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-frcfc"] Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.872032 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-frcfc"] Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.887725 4755 scope.go:117] "RemoveContainer" containerID="17b33ef72c33630f68823cb285a635c7a409fd5e0f5bed2d183429f4dddea32f" Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.963895 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-666d9fb6d8-79xg9"] Mar 17 00:25:32 crc kubenswrapper[4755]: I0317 00:25:32.964110 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" podUID="935720d2-7b1a-4d62-a4b2-7396b8c6d1ce" containerName="controller-manager" containerID="cri-o://a16894f0639e30d3ae4ef3b0bcefc1e265d37caf534598cd04619ab89ce3e5fb" gracePeriod=30 Mar 17 00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.058175 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz"] Mar 17 00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.058595 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" podUID="dfadefec-2153-48d4-bf07-56d7bd41e16c" containerName="route-controller-manager" containerID="cri-o://001f82566432aafbb66eea4ba4e5f0c3c387342a1cb192cc2384b7868a118d62" gracePeriod=30 Mar 17 00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.200102 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 17 00:25:33 crc kubenswrapper[4755]: W0317 00:25:33.211376 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod3c703cb5_e96c_4bb0_a6d3_4e4c2a3cfb46.slice/crio-2126d963ef6348b8103b3af3071ca19a53f0cf0307020f885e88751520863cc5 WatchSource:0}: Error finding container 2126d963ef6348b8103b3af3071ca19a53f0cf0307020f885e88751520863cc5: Status 404 returned error can't find the container with id 2126d963ef6348b8103b3af3071ca19a53f0cf0307020f885e88751520863cc5 Mar 17 00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.387243 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-18 03:55:31.113272072 +0000 UTC Mar 17 00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.387292 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6627h29m57.725983202s for next certificate rotation Mar 17 00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.788873 4755 generic.go:334] "Generic (PLEG): container finished" podID="dfadefec-2153-48d4-bf07-56d7bd41e16c" containerID="001f82566432aafbb66eea4ba4e5f0c3c387342a1cb192cc2384b7868a118d62" exitCode=0 Mar 17 00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.788961 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" event={"ID":"dfadefec-2153-48d4-bf07-56d7bd41e16c","Type":"ContainerDied","Data":"001f82566432aafbb66eea4ba4e5f0c3c387342a1cb192cc2384b7868a118d62"} Mar 17 00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.790603 4755 generic.go:334] "Generic (PLEG): container finished" podID="e1339920-3dec-4332-9749-ec66520252cb" containerID="d68eff44d3c25774e188230f8742340ea9247c03af92d9854059330950d0ded5" exitCode=0 Mar 17 00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.790676 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwldz" event={"ID":"e1339920-3dec-4332-9749-ec66520252cb","Type":"ContainerDied","Data":"d68eff44d3c25774e188230f8742340ea9247c03af92d9854059330950d0ded5"} Mar 17 
00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.843577 4755 generic.go:334] "Generic (PLEG): container finished" podID="935720d2-7b1a-4d62-a4b2-7396b8c6d1ce" containerID="a16894f0639e30d3ae4ef3b0bcefc1e265d37caf534598cd04619ab89ce3e5fb" exitCode=0 Mar 17 00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.843672 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" event={"ID":"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce","Type":"ContainerDied","Data":"a16894f0639e30d3ae4ef3b0bcefc1e265d37caf534598cd04619ab89ce3e5fb"} Mar 17 00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.851536 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ctv9" event={"ID":"b20a58b5-4b64-4d7b-b9c2-c6170d75878e","Type":"ContainerStarted","Data":"abcd9f3c2446f0d0776b0f37949b763921ff095e3ad0a3f00cc8a54639ee0293"} Mar 17 00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.862722 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46","Type":"ContainerStarted","Data":"7dbddef2946342f733c6dfc2a879514f48551c9fcc9c66478931b938b8887107"} Mar 17 00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.862787 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46","Type":"ContainerStarted","Data":"2126d963ef6348b8103b3af3071ca19a53f0cf0307020f885e88751520863cc5"} Mar 17 00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.865596 4755 generic.go:334] "Generic (PLEG): container finished" podID="31a6b715-d2ed-46ed-8375-c52975cfa3a0" containerID="13aa2fa88657645d235d6014b8026441b71e709516bc6d9c725f45360c62a02a" exitCode=0 Mar 17 00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.865831 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"31a6b715-d2ed-46ed-8375-c52975cfa3a0","Type":"ContainerDied","Data":"13aa2fa88657645d235d6014b8026441b71e709516bc6d9c725f45360c62a02a"} Mar 17 00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.871779 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5ctv9" podStartSLOduration=24.874766795 podStartE2EDuration="36.871767017s" podCreationTimestamp="2026-03-17 00:24:57 +0000 UTC" firstStartedPulling="2026-03-17 00:25:21.592624604 +0000 UTC m=+196.352076887" lastFinishedPulling="2026-03-17 00:25:33.589624826 +0000 UTC m=+208.349077109" observedRunningTime="2026-03-17 00:25:33.86961134 +0000 UTC m=+208.629063633" watchObservedRunningTime="2026-03-17 00:25:33.871767017 +0000 UTC m=+208.631219300" Mar 17 00:25:33 crc kubenswrapper[4755]: I0317 00:25:33.907417 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.907402561 podStartE2EDuration="1.907402561s" podCreationTimestamp="2026-03-17 00:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:25:33.904345841 +0000 UTC m=+208.663798124" watchObservedRunningTime="2026-03-17 00:25:33.907402561 +0000 UTC m=+208.666854844" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.009951 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.013420 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.112880 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561784-zznp2" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.148336 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-config\") pod \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.148567 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfadefec-2153-48d4-bf07-56d7bd41e16c-serving-cert\") pod \"dfadefec-2153-48d4-bf07-56d7bd41e16c\" (UID: \"dfadefec-2153-48d4-bf07-56d7bd41e16c\") " Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.148702 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqfdh\" (UniqueName: \"kubernetes.io/projected/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-kube-api-access-xqfdh\") pod \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.148843 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-proxy-ca-bundles\") pod \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.148921 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvlsg\" (UniqueName: \"kubernetes.io/projected/d46784e1-420c-4d3b-aca7-65271a898c44-kube-api-access-jvlsg\") pod \"d46784e1-420c-4d3b-aca7-65271a898c44\" (UID: \"d46784e1-420c-4d3b-aca7-65271a898c44\") " Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.149011 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dfadefec-2153-48d4-bf07-56d7bd41e16c-client-ca\") pod \"dfadefec-2153-48d4-bf07-56d7bd41e16c\" (UID: \"dfadefec-2153-48d4-bf07-56d7bd41e16c\") " Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.149118 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-client-ca\") pod \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.149227 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt7c2\" (UniqueName: \"kubernetes.io/projected/dfadefec-2153-48d4-bf07-56d7bd41e16c-kube-api-access-mt7c2\") pod \"dfadefec-2153-48d4-bf07-56d7bd41e16c\" (UID: \"dfadefec-2153-48d4-bf07-56d7bd41e16c\") " Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.149322 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-serving-cert\") pod \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\" (UID: \"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce\") " Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.149424 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfadefec-2153-48d4-bf07-56d7bd41e16c-config\") pod \"dfadefec-2153-48d4-bf07-56d7bd41e16c\" (UID: \"dfadefec-2153-48d4-bf07-56d7bd41e16c\") " Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.149328 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-config" (OuterVolumeSpecName: "config") pod "935720d2-7b1a-4d62-a4b2-7396b8c6d1ce" (UID: "935720d2-7b1a-4d62-a4b2-7396b8c6d1ce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.150671 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "935720d2-7b1a-4d62-a4b2-7396b8c6d1ce" (UID: "935720d2-7b1a-4d62-a4b2-7396b8c6d1ce"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.152180 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-client-ca" (OuterVolumeSpecName: "client-ca") pod "935720d2-7b1a-4d62-a4b2-7396b8c6d1ce" (UID: "935720d2-7b1a-4d62-a4b2-7396b8c6d1ce"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.152662 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfadefec-2153-48d4-bf07-56d7bd41e16c-config" (OuterVolumeSpecName: "config") pod "dfadefec-2153-48d4-bf07-56d7bd41e16c" (UID: "dfadefec-2153-48d4-bf07-56d7bd41e16c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.153058 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfadefec-2153-48d4-bf07-56d7bd41e16c-client-ca" (OuterVolumeSpecName: "client-ca") pod "dfadefec-2153-48d4-bf07-56d7bd41e16c" (UID: "dfadefec-2153-48d4-bf07-56d7bd41e16c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.154693 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfadefec-2153-48d4-bf07-56d7bd41e16c-kube-api-access-mt7c2" (OuterVolumeSpecName: "kube-api-access-mt7c2") pod "dfadefec-2153-48d4-bf07-56d7bd41e16c" (UID: "dfadefec-2153-48d4-bf07-56d7bd41e16c"). InnerVolumeSpecName "kube-api-access-mt7c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.154836 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfadefec-2153-48d4-bf07-56d7bd41e16c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dfadefec-2153-48d4-bf07-56d7bd41e16c" (UID: "dfadefec-2153-48d4-bf07-56d7bd41e16c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.155050 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "935720d2-7b1a-4d62-a4b2-7396b8c6d1ce" (UID: "935720d2-7b1a-4d62-a4b2-7396b8c6d1ce"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.156284 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d46784e1-420c-4d3b-aca7-65271a898c44-kube-api-access-jvlsg" (OuterVolumeSpecName: "kube-api-access-jvlsg") pod "d46784e1-420c-4d3b-aca7-65271a898c44" (UID: "d46784e1-420c-4d3b-aca7-65271a898c44"). InnerVolumeSpecName "kube-api-access-jvlsg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.160617 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-kube-api-access-xqfdh" (OuterVolumeSpecName: "kube-api-access-xqfdh") pod "935720d2-7b1a-4d62-a4b2-7396b8c6d1ce" (UID: "935720d2-7b1a-4d62-a4b2-7396b8c6d1ce"). InnerVolumeSpecName "kube-api-access-xqfdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.252540 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.252564 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvlsg\" (UniqueName: \"kubernetes.io/projected/d46784e1-420c-4d3b-aca7-65271a898c44-kube-api-access-jvlsg\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.252576 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dfadefec-2153-48d4-bf07-56d7bd41e16c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.252585 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.252595 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt7c2\" (UniqueName: \"kubernetes.io/projected/dfadefec-2153-48d4-bf07-56d7bd41e16c-kube-api-access-mt7c2\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.252603 4755 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.252611 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfadefec-2153-48d4-bf07-56d7bd41e16c-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.252619 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.252626 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfadefec-2153-48d4-bf07-56d7bd41e16c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.252634 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqfdh\" (UniqueName: \"kubernetes.io/projected/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce-kube-api-access-xqfdh\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.255961 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e74213a-9bd5-440c-b207-5218feab7323" path="/var/lib/kubelet/pods/0e74213a-9bd5-440c-b207-5218feab7323/volumes" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.388240 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-23 12:46:35.721823604 +0000 UTC Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.388277 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6036h21m1.3335492s for next certificate rotation Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.747042 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25"] Mar 17 00:25:34 crc kubenswrapper[4755]: E0317 00:25:34.747739 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="935720d2-7b1a-4d62-a4b2-7396b8c6d1ce" containerName="controller-manager" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.747756 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="935720d2-7b1a-4d62-a4b2-7396b8c6d1ce" containerName="controller-manager" Mar 17 00:25:34 crc kubenswrapper[4755]: E0317 00:25:34.747773 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfadefec-2153-48d4-bf07-56d7bd41e16c" containerName="route-controller-manager" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.747779 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfadefec-2153-48d4-bf07-56d7bd41e16c" containerName="route-controller-manager" Mar 17 00:25:34 crc kubenswrapper[4755]: E0317 00:25:34.747789 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46784e1-420c-4d3b-aca7-65271a898c44" containerName="oc" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.747796 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46784e1-420c-4d3b-aca7-65271a898c44" containerName="oc" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.747879 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="935720d2-7b1a-4d62-a4b2-7396b8c6d1ce" containerName="controller-manager" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.747896 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfadefec-2153-48d4-bf07-56d7bd41e16c" containerName="route-controller-manager" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.747903 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46784e1-420c-4d3b-aca7-65271a898c44" containerName="oc" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.748260 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.757942 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrff9\" (UniqueName: \"kubernetes.io/projected/f54c1112-6e1f-4c17-b207-4935524c6add-kube-api-access-qrff9\") pod \"route-controller-manager-659697f79c-5zj25\" (UID: \"f54c1112-6e1f-4c17-b207-4935524c6add\") " pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.757989 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f54c1112-6e1f-4c17-b207-4935524c6add-serving-cert\") pod \"route-controller-manager-659697f79c-5zj25\" (UID: \"f54c1112-6e1f-4c17-b207-4935524c6add\") " pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.758053 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f54c1112-6e1f-4c17-b207-4935524c6add-config\") pod \"route-controller-manager-659697f79c-5zj25\" (UID: \"f54c1112-6e1f-4c17-b207-4935524c6add\") " pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.758071 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f54c1112-6e1f-4c17-b207-4935524c6add-client-ca\") pod \"route-controller-manager-659697f79c-5zj25\" (UID: \"f54c1112-6e1f-4c17-b207-4935524c6add\") " pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.763108 4755 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25"] Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.859145 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f54c1112-6e1f-4c17-b207-4935524c6add-config\") pod \"route-controller-manager-659697f79c-5zj25\" (UID: \"f54c1112-6e1f-4c17-b207-4935524c6add\") " pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.859189 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f54c1112-6e1f-4c17-b207-4935524c6add-client-ca\") pod \"route-controller-manager-659697f79c-5zj25\" (UID: \"f54c1112-6e1f-4c17-b207-4935524c6add\") " pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.859259 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrff9\" (UniqueName: \"kubernetes.io/projected/f54c1112-6e1f-4c17-b207-4935524c6add-kube-api-access-qrff9\") pod \"route-controller-manager-659697f79c-5zj25\" (UID: \"f54c1112-6e1f-4c17-b207-4935524c6add\") " pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.859290 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f54c1112-6e1f-4c17-b207-4935524c6add-serving-cert\") pod \"route-controller-manager-659697f79c-5zj25\" (UID: \"f54c1112-6e1f-4c17-b207-4935524c6add\") " pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.860499 4755 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f54c1112-6e1f-4c17-b207-4935524c6add-client-ca\") pod \"route-controller-manager-659697f79c-5zj25\" (UID: \"f54c1112-6e1f-4c17-b207-4935524c6add\") " pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.861060 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f54c1112-6e1f-4c17-b207-4935524c6add-config\") pod \"route-controller-manager-659697f79c-5zj25\" (UID: \"f54c1112-6e1f-4c17-b207-4935524c6add\") " pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.863026 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f54c1112-6e1f-4c17-b207-4935524c6add-serving-cert\") pod \"route-controller-manager-659697f79c-5zj25\" (UID: \"f54c1112-6e1f-4c17-b207-4935524c6add\") " pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.873315 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" event={"ID":"935720d2-7b1a-4d62-a4b2-7396b8c6d1ce","Type":"ContainerDied","Data":"794c1d16a61be7f0a68418867b2f8e6405c864d1299e6148d73facdf5e9ab0ba"} Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.873387 4755 scope.go:117] "RemoveContainer" containerID="a16894f0639e30d3ae4ef3b0bcefc1e265d37caf534598cd04619ab89ce3e5fb" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.873329 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-666d9fb6d8-79xg9" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.875272 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561784-zznp2" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.875810 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561784-zznp2" event={"ID":"d46784e1-420c-4d3b-aca7-65271a898c44","Type":"ContainerDied","Data":"c6c61e06012714d07cdb0579e6a2840ee42505b94ad8c4c991f7760e3568c6da"} Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.875916 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6c61e06012714d07cdb0579e6a2840ee42505b94ad8c4c991f7760e3568c6da" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.878189 4755 generic.go:334] "Generic (PLEG): container finished" podID="be55626c-4d34-4b09-83b0-897cd661216a" containerID="29fa6ae93f4ebd3fcda31482144f1a66daa5e8c56cecfa98a55c29bede086ee3" exitCode=0 Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.878322 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m967j" event={"ID":"be55626c-4d34-4b09-83b0-897cd661216a","Type":"ContainerDied","Data":"29fa6ae93f4ebd3fcda31482144f1a66daa5e8c56cecfa98a55c29bede086ee3"} Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.879799 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.879915 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz" event={"ID":"dfadefec-2153-48d4-bf07-56d7bd41e16c","Type":"ContainerDied","Data":"326ca162c59ec29c95ae424de699590dabfb9bc495182b72c645bb0c64da663e"} Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.889240 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrff9\" (UniqueName: \"kubernetes.io/projected/f54c1112-6e1f-4c17-b207-4935524c6add-kube-api-access-qrff9\") pod \"route-controller-manager-659697f79c-5zj25\" (UID: \"f54c1112-6e1f-4c17-b207-4935524c6add\") " pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.891839 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwldz" event={"ID":"e1339920-3dec-4332-9749-ec66520252cb","Type":"ContainerStarted","Data":"d74ec6700a94558bf2e670abb4438b6780a67d785e1d784d296c8c5519223827"} Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.894693 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxkzg" event={"ID":"e3bffbce-4004-4c41-9711-28fa7700e14a","Type":"ContainerStarted","Data":"24437be73493e3fc138de62c617803b2a5b449fae6ad388bb50df9e86919f350"} Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.920416 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dwldz" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.923400 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dwldz" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.926026 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dwldz" podStartSLOduration=2.364466601 podStartE2EDuration="40.926005657s" podCreationTimestamp="2026-03-17 00:24:54 +0000 UTC" firstStartedPulling="2026-03-17 00:24:56.124274361 +0000 UTC m=+170.883726654" lastFinishedPulling="2026-03-17 00:25:34.685813427 +0000 UTC m=+209.445265710" observedRunningTime="2026-03-17 00:25:34.925149414 +0000 UTC m=+209.684601697" watchObservedRunningTime="2026-03-17 00:25:34.926005657 +0000 UTC m=+209.685457930" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.932780 4755 scope.go:117] "RemoveContainer" containerID="001f82566432aafbb66eea4ba4e5f0c3c387342a1cb192cc2384b7868a118d62" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.950619 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zxkzg" podStartSLOduration=25.925380253 podStartE2EDuration="36.950606852s" podCreationTimestamp="2026-03-17 00:24:58 +0000 UTC" firstStartedPulling="2026-03-17 00:25:22.635686287 +0000 UTC m=+197.395138570" lastFinishedPulling="2026-03-17 00:25:33.660912886 +0000 UTC m=+208.420365169" observedRunningTime="2026-03-17 00:25:34.949731049 +0000 UTC m=+209.709183332" watchObservedRunningTime="2026-03-17 00:25:34.950606852 +0000 UTC m=+209.710059135" Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.964610 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-666d9fb6d8-79xg9"] Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.968065 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-666d9fb6d8-79xg9"] Mar 17 00:25:34 crc kubenswrapper[4755]: I0317 00:25:34.979388 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz"] Mar 17 00:25:34 crc kubenswrapper[4755]: 
I0317 00:25:34.981024 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dd4f49fc8-qvlqz"] Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.089289 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.154920 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.263139 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31a6b715-d2ed-46ed-8375-c52975cfa3a0-kubelet-dir\") pod \"31a6b715-d2ed-46ed-8375-c52975cfa3a0\" (UID: \"31a6b715-d2ed-46ed-8375-c52975cfa3a0\") " Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.263216 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/31a6b715-d2ed-46ed-8375-c52975cfa3a0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "31a6b715-d2ed-46ed-8375-c52975cfa3a0" (UID: "31a6b715-d2ed-46ed-8375-c52975cfa3a0"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.263364 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31a6b715-d2ed-46ed-8375-c52975cfa3a0-kube-api-access\") pod \"31a6b715-d2ed-46ed-8375-c52975cfa3a0\" (UID: \"31a6b715-d2ed-46ed-8375-c52975cfa3a0\") " Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.263729 4755 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31a6b715-d2ed-46ed-8375-c52975cfa3a0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.267703 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a6b715-d2ed-46ed-8375-c52975cfa3a0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "31a6b715-d2ed-46ed-8375-c52975cfa3a0" (UID: "31a6b715-d2ed-46ed-8375-c52975cfa3a0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.364419 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31a6b715-d2ed-46ed-8375-c52975cfa3a0-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.564208 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25"] Mar 17 00:25:35 crc kubenswrapper[4755]: W0317 00:25:35.564891 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf54c1112_6e1f_4c17_b207_4935524c6add.slice/crio-bc780f4e8ccae6591963fee700946dc07533303c4d70e122d58b765e404e69ce WatchSource:0}: Error finding container bc780f4e8ccae6591963fee700946dc07533303c4d70e122d58b765e404e69ce: Status 404 returned error can't find the container with id bc780f4e8ccae6591963fee700946dc07533303c4d70e122d58b765e404e69ce Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.902173 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"31a6b715-d2ed-46ed-8375-c52975cfa3a0","Type":"ContainerDied","Data":"b17e4fb1c01ddc172485f12552ddeea7490d9fcf6f4e910f1a96251eb0488404"} Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.902532 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b17e4fb1c01ddc172485f12552ddeea7490d9fcf6f4e910f1a96251eb0488404" Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.902195 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.904035 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m967j" event={"ID":"be55626c-4d34-4b09-83b0-897cd661216a","Type":"ContainerStarted","Data":"a5f2bda6413b4e3d40328259045c4909479a516453f9b23a955b9457489754bb"} Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.907859 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" event={"ID":"f54c1112-6e1f-4c17-b207-4935524c6add","Type":"ContainerStarted","Data":"7e4a3737765fc1c64dbd5752c4386be9375fde745252ee7b0861a07d2f7283e8"} Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.907904 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" event={"ID":"f54c1112-6e1f-4c17-b207-4935524c6add","Type":"ContainerStarted","Data":"bc780f4e8ccae6591963fee700946dc07533303c4d70e122d58b765e404e69ce"} Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.907927 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.911027 4755 generic.go:334] "Generic (PLEG): container finished" podID="1089c661-4f7b-40e2-8549-983b06c1409a" containerID="bfa2e3af864a19584a8afbad690e0663e20805d4228ea3a5fe139655e08dab33" exitCode=0 Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.911540 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v9fxv" event={"ID":"1089c661-4f7b-40e2-8549-983b06c1409a","Type":"ContainerDied","Data":"bfa2e3af864a19584a8afbad690e0663e20805d4228ea3a5fe139655e08dab33"} Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.924711 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m967j" podStartSLOduration=2.27035619 podStartE2EDuration="41.92469854s" podCreationTimestamp="2026-03-17 00:24:54 +0000 UTC" firstStartedPulling="2026-03-17 00:24:56.073283901 +0000 UTC m=+170.832736184" lastFinishedPulling="2026-03-17 00:25:35.727626241 +0000 UTC m=+210.487078534" observedRunningTime="2026-03-17 00:25:35.924181607 +0000 UTC m=+210.683633910" watchObservedRunningTime="2026-03-17 00:25:35.92469854 +0000 UTC m=+210.684150823" Mar 17 00:25:35 crc kubenswrapper[4755]: I0317 00:25:35.961929 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-dwldz" podUID="e1339920-3dec-4332-9749-ec66520252cb" containerName="registry-server" probeResult="failure" output=< Mar 17 00:25:35 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 00:25:35 crc kubenswrapper[4755]: > Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.127164 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.141461 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" podStartSLOduration=3.141427235 podStartE2EDuration="3.141427235s" podCreationTimestamp="2026-03-17 00:25:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:25:35.976590771 +0000 UTC m=+210.736043074" watchObservedRunningTime="2026-03-17 00:25:36.141427235 +0000 UTC m=+210.900879518" Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.258313 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="935720d2-7b1a-4d62-a4b2-7396b8c6d1ce" 
path="/var/lib/kubelet/pods/935720d2-7b1a-4d62-a4b2-7396b8c6d1ce/volumes" Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.258982 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfadefec-2153-48d4-bf07-56d7bd41e16c" path="/var/lib/kubelet/pods/dfadefec-2153-48d4-bf07-56d7bd41e16c/volumes" Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.751139 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d784bd974-96gbc"] Mar 17 00:25:36 crc kubenswrapper[4755]: E0317 00:25:36.752042 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a6b715-d2ed-46ed-8375-c52975cfa3a0" containerName="pruner" Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.752072 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a6b715-d2ed-46ed-8375-c52975cfa3a0" containerName="pruner" Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.752395 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a6b715-d2ed-46ed-8375-c52975cfa3a0" containerName="pruner" Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.753244 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.762948 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.763275 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.763425 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.763585 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.763743 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.763889 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.778664 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.784404 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d784bd974-96gbc"]
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.883391 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-config\") pod \"controller-manager-5d784bd974-96gbc\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.883450 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fpbh\" (UniqueName: \"kubernetes.io/projected/b43ce753-6964-468d-b045-09b2e0135dcf-kube-api-access-9fpbh\") pod \"controller-manager-5d784bd974-96gbc\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.883516 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b43ce753-6964-468d-b045-09b2e0135dcf-serving-cert\") pod \"controller-manager-5d784bd974-96gbc\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.883534 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-client-ca\") pod \"controller-manager-5d784bd974-96gbc\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.883558 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-proxy-ca-bundles\") pod \"controller-manager-5d784bd974-96gbc\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.925222 4755 generic.go:334] "Generic (PLEG): container finished" podID="a23b6da4-12f1-4104-93ba-dfc06d3572aa" containerID="457891be419f7618b15e1d635130b5964dfcbcce27d9d65359c37196ac24e612" exitCode=0
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.925299 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ct6rl" event={"ID":"a23b6da4-12f1-4104-93ba-dfc06d3572aa","Type":"ContainerDied","Data":"457891be419f7618b15e1d635130b5964dfcbcce27d9d65359c37196ac24e612"}
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.985009 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b43ce753-6964-468d-b045-09b2e0135dcf-serving-cert\") pod \"controller-manager-5d784bd974-96gbc\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.985055 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-client-ca\") pod \"controller-manager-5d784bd974-96gbc\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.985451 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-proxy-ca-bundles\") pod \"controller-manager-5d784bd974-96gbc\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.985534 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-config\") pod \"controller-manager-5d784bd974-96gbc\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.985557 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fpbh\" (UniqueName: \"kubernetes.io/projected/b43ce753-6964-468d-b045-09b2e0135dcf-kube-api-access-9fpbh\") pod \"controller-manager-5d784bd974-96gbc\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.986075 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-client-ca\") pod \"controller-manager-5d784bd974-96gbc\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.987026 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-config\") pod \"controller-manager-5d784bd974-96gbc\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:36 crc kubenswrapper[4755]: I0317 00:25:36.987067 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-proxy-ca-bundles\") pod \"controller-manager-5d784bd974-96gbc\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:37 crc kubenswrapper[4755]: I0317 00:25:37.001226 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fpbh\" (UniqueName: \"kubernetes.io/projected/b43ce753-6964-468d-b045-09b2e0135dcf-kube-api-access-9fpbh\") pod \"controller-manager-5d784bd974-96gbc\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:37 crc kubenswrapper[4755]: I0317 00:25:37.002475 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b43ce753-6964-468d-b045-09b2e0135dcf-serving-cert\") pod \"controller-manager-5d784bd974-96gbc\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:37 crc kubenswrapper[4755]: I0317 00:25:37.071808 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:37 crc kubenswrapper[4755]: I0317 00:25:37.279172 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d784bd974-96gbc"]
Mar 17 00:25:37 crc kubenswrapper[4755]: I0317 00:25:37.932295 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v9fxv" event={"ID":"1089c661-4f7b-40e2-8549-983b06c1409a","Type":"ContainerStarted","Data":"d624f5c3a0a9831efd7f1bc1eb0f48b3d0bb710c23ca723b6550633eff8d7b4c"}
Mar 17 00:25:37 crc kubenswrapper[4755]: I0317 00:25:37.934756 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc" event={"ID":"b43ce753-6964-468d-b045-09b2e0135dcf","Type":"ContainerStarted","Data":"a8c59da7fab9f7c758a963d63490194d52f9256fd2d860bd5316cfdd4a368258"}
Mar 17 00:25:37 crc kubenswrapper[4755]: I0317 00:25:37.934810 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc" event={"ID":"b43ce753-6964-468d-b045-09b2e0135dcf","Type":"ContainerStarted","Data":"4d8be0f4f5428c666acfbd118a283482c280baff9dd82fa296c914287bfc8b2c"}
Mar 17 00:25:37 crc kubenswrapper[4755]: I0317 00:25:37.951942 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v9fxv" podStartSLOduration=2.258647566 podStartE2EDuration="42.95191986s" podCreationTimestamp="2026-03-17 00:24:55 +0000 UTC" firstStartedPulling="2026-03-17 00:24:56.08869211 +0000 UTC m=+170.848144393" lastFinishedPulling="2026-03-17 00:25:36.781964404 +0000 UTC m=+211.541416687" observedRunningTime="2026-03-17 00:25:37.949502267 +0000 UTC m=+212.708954560" watchObservedRunningTime="2026-03-17 00:25:37.95191986 +0000 UTC m=+212.711372143"
Mar 17 00:25:38 crc kubenswrapper[4755]: I0317 00:25:38.328626 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5ctv9"
Mar 17 00:25:38 crc kubenswrapper[4755]: I0317 00:25:38.328670 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5ctv9"
Mar 17 00:25:38 crc kubenswrapper[4755]: I0317 00:25:38.710696 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zxkzg"
Mar 17 00:25:38 crc kubenswrapper[4755]: I0317 00:25:38.710760 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zxkzg"
Mar 17 00:25:38 crc kubenswrapper[4755]: I0317 00:25:38.942611 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ct6rl" event={"ID":"a23b6da4-12f1-4104-93ba-dfc06d3572aa","Type":"ContainerStarted","Data":"41debaa6d6c1c1a435b5a947f0d708f6b615e84ad775af4f57b52818577ed5fc"}
Mar 17 00:25:38 crc kubenswrapper[4755]: I0317 00:25:38.942996 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:38 crc kubenswrapper[4755]: I0317 00:25:38.948924 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc"
Mar 17 00:25:38 crc kubenswrapper[4755]: I0317 00:25:38.963219 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc" podStartSLOduration=6.963200564 podStartE2EDuration="6.963200564s" podCreationTimestamp="2026-03-17 00:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:25:38.961382626 +0000 UTC m=+213.720834909" watchObservedRunningTime="2026-03-17 00:25:38.963200564 +0000 UTC m=+213.722652857"
Mar 17 00:25:38 crc kubenswrapper[4755]: I0317 00:25:38.982745 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ct6rl" podStartSLOduration=3.4138214749999998 podStartE2EDuration="44.982726786s" podCreationTimestamp="2026-03-17 00:24:54 +0000 UTC" firstStartedPulling="2026-03-17 00:24:56.147989275 +0000 UTC m=+170.907441558" lastFinishedPulling="2026-03-17 00:25:37.716894586 +0000 UTC m=+212.476346869" observedRunningTime="2026-03-17 00:25:38.981675858 +0000 UTC m=+213.741128151" watchObservedRunningTime="2026-03-17 00:25:38.982726786 +0000 UTC m=+213.742179069"
Mar 17 00:25:39 crc kubenswrapper[4755]: I0317 00:25:39.365296 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5ctv9" podUID="b20a58b5-4b64-4d7b-b9c2-c6170d75878e" containerName="registry-server" probeResult="failure" output=<
Mar 17 00:25:39 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s
Mar 17 00:25:39 crc kubenswrapper[4755]: >
Mar 17 00:25:39 crc kubenswrapper[4755]: I0317 00:25:39.660813 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 00:25:39 crc kubenswrapper[4755]: I0317 00:25:39.749568 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zxkzg" podUID="e3bffbce-4004-4c41-9711-28fa7700e14a" containerName="registry-server" probeResult="failure" output=<
Mar 17 00:25:39 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s
Mar 17 00:25:39 crc kubenswrapper[4755]: >
Mar 17 00:25:44 crc kubenswrapper[4755]: I0317 00:25:44.994135 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dwldz"
Mar 17 00:25:45 crc kubenswrapper[4755]: I0317 00:25:45.070019 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dwldz"
Mar 17 00:25:45 crc kubenswrapper[4755]: I0317 00:25:45.102263 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m967j"
Mar 17 00:25:45 crc kubenswrapper[4755]: I0317 00:25:45.103162 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m967j"
Mar 17 00:25:45 crc kubenswrapper[4755]: I0317 00:25:45.182366 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m967j"
Mar 17 00:25:45 crc kubenswrapper[4755]: I0317 00:25:45.299521 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ct6rl"
Mar 17 00:25:45 crc kubenswrapper[4755]: I0317 00:25:45.299607 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ct6rl"
Mar 17 00:25:45 crc kubenswrapper[4755]: I0317 00:25:45.352283 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ct6rl"
Mar 17 00:25:45 crc kubenswrapper[4755]: I0317 00:25:45.523511 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v9fxv"
Mar 17 00:25:45 crc kubenswrapper[4755]: I0317 00:25:45.523571 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v9fxv"
Mar 17 00:25:45 crc kubenswrapper[4755]: I0317 00:25:45.580875 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v9fxv"
Mar 17 00:25:46 crc kubenswrapper[4755]: I0317 00:25:46.041702 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v9fxv"
Mar 17 00:25:46 crc kubenswrapper[4755]: I0317 00:25:46.057471 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m967j"
Mar 17 00:25:46 crc kubenswrapper[4755]: I0317 00:25:46.058776 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ct6rl"
Mar 17 00:25:47 crc kubenswrapper[4755]: I0317 00:25:47.038679 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v9fxv"]
Mar 17 00:25:47 crc kubenswrapper[4755]: I0317 00:25:47.637181 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ct6rl"]
Mar 17 00:25:47 crc kubenswrapper[4755]: I0317 00:25:47.998038 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v9fxv" podUID="1089c661-4f7b-40e2-8549-983b06c1409a" containerName="registry-server" containerID="cri-o://d624f5c3a0a9831efd7f1bc1eb0f48b3d0bb710c23ca723b6550633eff8d7b4c" gracePeriod=2
Mar 17 00:25:48 crc kubenswrapper[4755]: I0317 00:25:48.388545 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5ctv9"
Mar 17 00:25:48 crc kubenswrapper[4755]: I0317 00:25:48.443320 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5ctv9"
Mar 17 00:25:48 crc kubenswrapper[4755]: I0317 00:25:48.519254 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v9fxv"
Mar 17 00:25:48 crc kubenswrapper[4755]: I0317 00:25:48.555697 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpd28\" (UniqueName: \"kubernetes.io/projected/1089c661-4f7b-40e2-8549-983b06c1409a-kube-api-access-bpd28\") pod \"1089c661-4f7b-40e2-8549-983b06c1409a\" (UID: \"1089c661-4f7b-40e2-8549-983b06c1409a\") "
Mar 17 00:25:48 crc kubenswrapper[4755]: I0317 00:25:48.555805 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1089c661-4f7b-40e2-8549-983b06c1409a-utilities\") pod \"1089c661-4f7b-40e2-8549-983b06c1409a\" (UID: \"1089c661-4f7b-40e2-8549-983b06c1409a\") "
Mar 17 00:25:48 crc kubenswrapper[4755]: I0317 00:25:48.555886 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1089c661-4f7b-40e2-8549-983b06c1409a-catalog-content\") pod \"1089c661-4f7b-40e2-8549-983b06c1409a\" (UID: \"1089c661-4f7b-40e2-8549-983b06c1409a\") "
Mar 17 00:25:48 crc kubenswrapper[4755]: I0317 00:25:48.557209 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1089c661-4f7b-40e2-8549-983b06c1409a-utilities" (OuterVolumeSpecName: "utilities") pod "1089c661-4f7b-40e2-8549-983b06c1409a" (UID: "1089c661-4f7b-40e2-8549-983b06c1409a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 00:25:48 crc kubenswrapper[4755]: I0317 00:25:48.563711 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1089c661-4f7b-40e2-8549-983b06c1409a-kube-api-access-bpd28" (OuterVolumeSpecName: "kube-api-access-bpd28") pod "1089c661-4f7b-40e2-8549-983b06c1409a" (UID: "1089c661-4f7b-40e2-8549-983b06c1409a"). InnerVolumeSpecName "kube-api-access-bpd28". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:25:48 crc kubenswrapper[4755]: I0317 00:25:48.636664 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1089c661-4f7b-40e2-8549-983b06c1409a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1089c661-4f7b-40e2-8549-983b06c1409a" (UID: "1089c661-4f7b-40e2-8549-983b06c1409a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 00:25:48 crc kubenswrapper[4755]: I0317 00:25:48.657209 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1089c661-4f7b-40e2-8549-983b06c1409a-utilities\") on node \"crc\" DevicePath \"\""
Mar 17 00:25:48 crc kubenswrapper[4755]: I0317 00:25:48.657259 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1089c661-4f7b-40e2-8549-983b06c1409a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 17 00:25:48 crc kubenswrapper[4755]: I0317 00:25:48.657279 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpd28\" (UniqueName: \"kubernetes.io/projected/1089c661-4f7b-40e2-8549-983b06c1409a-kube-api-access-bpd28\") on node \"crc\" DevicePath \"\""
Mar 17 00:25:48 crc kubenswrapper[4755]: I0317 00:25:48.770307 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zxkzg"
Mar 17 00:25:48 crc kubenswrapper[4755]: I0317 00:25:48.822698 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zxkzg"
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.009947 4755 generic.go:334] "Generic (PLEG): container finished" podID="1089c661-4f7b-40e2-8549-983b06c1409a" containerID="d624f5c3a0a9831efd7f1bc1eb0f48b3d0bb710c23ca723b6550633eff8d7b4c" exitCode=0
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.010014 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v9fxv" event={"ID":"1089c661-4f7b-40e2-8549-983b06c1409a","Type":"ContainerDied","Data":"d624f5c3a0a9831efd7f1bc1eb0f48b3d0bb710c23ca723b6550633eff8d7b4c"}
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.010082 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v9fxv" event={"ID":"1089c661-4f7b-40e2-8549-983b06c1409a","Type":"ContainerDied","Data":"b74023147a2a8d0c12bd5643bd1b221a7ac6835632315149b323ec6c65902259"}
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.010121 4755 scope.go:117] "RemoveContainer" containerID="d624f5c3a0a9831efd7f1bc1eb0f48b3d0bb710c23ca723b6550633eff8d7b4c"
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.010121 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v9fxv"
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.014800 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ct6rl" podUID="a23b6da4-12f1-4104-93ba-dfc06d3572aa" containerName="registry-server" containerID="cri-o://41debaa6d6c1c1a435b5a947f0d708f6b615e84ad775af4f57b52818577ed5fc" gracePeriod=2
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.035350 4755 scope.go:117] "RemoveContainer" containerID="bfa2e3af864a19584a8afbad690e0663e20805d4228ea3a5fe139655e08dab33"
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.068540 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v9fxv"]
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.073272 4755 scope.go:117] "RemoveContainer" containerID="f5be45293c8f2d671fd5d9c6c6b495b2134b484271d57fa7dee93a0dffd99000"
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.079482 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v9fxv"]
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.105017 4755 scope.go:117] "RemoveContainer" containerID="d624f5c3a0a9831efd7f1bc1eb0f48b3d0bb710c23ca723b6550633eff8d7b4c"
Mar 17 00:25:49 crc kubenswrapper[4755]: E0317 00:25:49.105507 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d624f5c3a0a9831efd7f1bc1eb0f48b3d0bb710c23ca723b6550633eff8d7b4c\": container with ID starting with d624f5c3a0a9831efd7f1bc1eb0f48b3d0bb710c23ca723b6550633eff8d7b4c not found: ID does not exist" containerID="d624f5c3a0a9831efd7f1bc1eb0f48b3d0bb710c23ca723b6550633eff8d7b4c"
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.105548 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d624f5c3a0a9831efd7f1bc1eb0f48b3d0bb710c23ca723b6550633eff8d7b4c"} err="failed to get container status \"d624f5c3a0a9831efd7f1bc1eb0f48b3d0bb710c23ca723b6550633eff8d7b4c\": rpc error: code = NotFound desc = could not find container \"d624f5c3a0a9831efd7f1bc1eb0f48b3d0bb710c23ca723b6550633eff8d7b4c\": container with ID starting with d624f5c3a0a9831efd7f1bc1eb0f48b3d0bb710c23ca723b6550633eff8d7b4c not found: ID does not exist"
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.105572 4755 scope.go:117] "RemoveContainer" containerID="bfa2e3af864a19584a8afbad690e0663e20805d4228ea3a5fe139655e08dab33"
Mar 17 00:25:49 crc kubenswrapper[4755]: E0317 00:25:49.106106 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfa2e3af864a19584a8afbad690e0663e20805d4228ea3a5fe139655e08dab33\": container with ID starting with bfa2e3af864a19584a8afbad690e0663e20805d4228ea3a5fe139655e08dab33 not found: ID does not exist" containerID="bfa2e3af864a19584a8afbad690e0663e20805d4228ea3a5fe139655e08dab33"
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.106155 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa2e3af864a19584a8afbad690e0663e20805d4228ea3a5fe139655e08dab33"} err="failed to get container status \"bfa2e3af864a19584a8afbad690e0663e20805d4228ea3a5fe139655e08dab33\": rpc error: code = NotFound desc = could not find container \"bfa2e3af864a19584a8afbad690e0663e20805d4228ea3a5fe139655e08dab33\": container with ID starting with bfa2e3af864a19584a8afbad690e0663e20805d4228ea3a5fe139655e08dab33 not found: ID does not exist"
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.106182 4755 scope.go:117] "RemoveContainer" containerID="f5be45293c8f2d671fd5d9c6c6b495b2134b484271d57fa7dee93a0dffd99000"
Mar 17 00:25:49 crc kubenswrapper[4755]: E0317 00:25:49.106822 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5be45293c8f2d671fd5d9c6c6b495b2134b484271d57fa7dee93a0dffd99000\": container with ID starting with f5be45293c8f2d671fd5d9c6c6b495b2134b484271d57fa7dee93a0dffd99000 not found: ID does not exist" containerID="f5be45293c8f2d671fd5d9c6c6b495b2134b484271d57fa7dee93a0dffd99000"
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.106856 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5be45293c8f2d671fd5d9c6c6b495b2134b484271d57fa7dee93a0dffd99000"} err="failed to get container status \"f5be45293c8f2d671fd5d9c6c6b495b2134b484271d57fa7dee93a0dffd99000\": rpc error: code = NotFound desc = could not find container \"f5be45293c8f2d671fd5d9c6c6b495b2134b484271d57fa7dee93a0dffd99000\": container with ID starting with f5be45293c8f2d671fd5d9c6c6b495b2134b484271d57fa7dee93a0dffd99000 not found: ID does not exist"
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.530688 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ct6rl"
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.569991 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h8bd\" (UniqueName: \"kubernetes.io/projected/a23b6da4-12f1-4104-93ba-dfc06d3572aa-kube-api-access-2h8bd\") pod \"a23b6da4-12f1-4104-93ba-dfc06d3572aa\" (UID: \"a23b6da4-12f1-4104-93ba-dfc06d3572aa\") "
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.570062 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23b6da4-12f1-4104-93ba-dfc06d3572aa-catalog-content\") pod \"a23b6da4-12f1-4104-93ba-dfc06d3572aa\" (UID: \"a23b6da4-12f1-4104-93ba-dfc06d3572aa\") "
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.570151 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23b6da4-12f1-4104-93ba-dfc06d3572aa-utilities\") pod \"a23b6da4-12f1-4104-93ba-dfc06d3572aa\" (UID: \"a23b6da4-12f1-4104-93ba-dfc06d3572aa\") "
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.571183 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a23b6da4-12f1-4104-93ba-dfc06d3572aa-utilities" (OuterVolumeSpecName: "utilities") pod "a23b6da4-12f1-4104-93ba-dfc06d3572aa" (UID: "a23b6da4-12f1-4104-93ba-dfc06d3572aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.576794 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a23b6da4-12f1-4104-93ba-dfc06d3572aa-kube-api-access-2h8bd" (OuterVolumeSpecName: "kube-api-access-2h8bd") pod "a23b6da4-12f1-4104-93ba-dfc06d3572aa" (UID: "a23b6da4-12f1-4104-93ba-dfc06d3572aa"). InnerVolumeSpecName "kube-api-access-2h8bd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.633670 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a23b6da4-12f1-4104-93ba-dfc06d3572aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a23b6da4-12f1-4104-93ba-dfc06d3572aa" (UID: "a23b6da4-12f1-4104-93ba-dfc06d3572aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.671771 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h8bd\" (UniqueName: \"kubernetes.io/projected/a23b6da4-12f1-4104-93ba-dfc06d3572aa-kube-api-access-2h8bd\") on node \"crc\" DevicePath \"\""
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.671799 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23b6da4-12f1-4104-93ba-dfc06d3572aa-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 17 00:25:49 crc kubenswrapper[4755]: I0317 00:25:49.671808 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23b6da4-12f1-4104-93ba-dfc06d3572aa-utilities\") on node \"crc\" DevicePath \"\""
Mar 17 00:25:50 crc kubenswrapper[4755]: I0317 00:25:50.021149 4755 generic.go:334] "Generic (PLEG): container finished" podID="a23b6da4-12f1-4104-93ba-dfc06d3572aa" containerID="41debaa6d6c1c1a435b5a947f0d708f6b615e84ad775af4f57b52818577ed5fc" exitCode=0
Mar 17 00:25:50 crc kubenswrapper[4755]: I0317 00:25:50.021242 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ct6rl"
Mar 17 00:25:50 crc kubenswrapper[4755]: I0317 00:25:50.021248 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ct6rl" event={"ID":"a23b6da4-12f1-4104-93ba-dfc06d3572aa","Type":"ContainerDied","Data":"41debaa6d6c1c1a435b5a947f0d708f6b615e84ad775af4f57b52818577ed5fc"}
Mar 17 00:25:50 crc kubenswrapper[4755]: I0317 00:25:50.021423 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ct6rl" event={"ID":"a23b6da4-12f1-4104-93ba-dfc06d3572aa","Type":"ContainerDied","Data":"fd136e1a7396c88bc324998bfee96e462350cc777ad0a5a1d35b1a03f938b441"}
Mar 17 00:25:50 crc kubenswrapper[4755]: I0317 00:25:50.021521 4755 scope.go:117] "RemoveContainer" containerID="41debaa6d6c1c1a435b5a947f0d708f6b615e84ad775af4f57b52818577ed5fc"
Mar 17 00:25:50 crc kubenswrapper[4755]: I0317 00:25:50.049856 4755 scope.go:117] "RemoveContainer" containerID="457891be419f7618b15e1d635130b5964dfcbcce27d9d65359c37196ac24e612"
Mar 17 00:25:50 crc kubenswrapper[4755]: I0317 00:25:50.075575 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ct6rl"]
Mar 17 00:25:50 crc kubenswrapper[4755]: I0317 00:25:50.083041 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ct6rl"]
Mar 17 00:25:50 crc kubenswrapper[4755]: I0317 00:25:50.098299 4755 scope.go:117] "RemoveContainer" containerID="f3f9ca88d6951800e90cf2ff5c01652c87a43730b5aefcb7866b79c95d95a7e7"
Mar 17 00:25:50 crc kubenswrapper[4755]: I0317 00:25:50.117581 4755 scope.go:117] "RemoveContainer" containerID="41debaa6d6c1c1a435b5a947f0d708f6b615e84ad775af4f57b52818577ed5fc"
Mar 17 00:25:50 crc kubenswrapper[4755]: E0317 00:25:50.118116 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41debaa6d6c1c1a435b5a947f0d708f6b615e84ad775af4f57b52818577ed5fc\": container with ID starting with 41debaa6d6c1c1a435b5a947f0d708f6b615e84ad775af4f57b52818577ed5fc not found: ID does not exist" containerID="41debaa6d6c1c1a435b5a947f0d708f6b615e84ad775af4f57b52818577ed5fc"
Mar 17 00:25:50 crc kubenswrapper[4755]: I0317 00:25:50.118190 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41debaa6d6c1c1a435b5a947f0d708f6b615e84ad775af4f57b52818577ed5fc"} err="failed to get container status \"41debaa6d6c1c1a435b5a947f0d708f6b615e84ad775af4f57b52818577ed5fc\": rpc error: code = NotFound desc = could not find container \"41debaa6d6c1c1a435b5a947f0d708f6b615e84ad775af4f57b52818577ed5fc\": container with ID starting with 41debaa6d6c1c1a435b5a947f0d708f6b615e84ad775af4f57b52818577ed5fc not found: ID does not exist"
Mar 17 00:25:50 crc kubenswrapper[4755]: I0317 00:25:50.118247 4755 scope.go:117] "RemoveContainer" containerID="457891be419f7618b15e1d635130b5964dfcbcce27d9d65359c37196ac24e612"
Mar 17 00:25:50 crc kubenswrapper[4755]: E0317 00:25:50.118806 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"457891be419f7618b15e1d635130b5964dfcbcce27d9d65359c37196ac24e612\": container with ID starting with 457891be419f7618b15e1d635130b5964dfcbcce27d9d65359c37196ac24e612 not found: ID does not exist" containerID="457891be419f7618b15e1d635130b5964dfcbcce27d9d65359c37196ac24e612"
Mar 17 00:25:50 crc kubenswrapper[4755]: I0317 00:25:50.118959 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"457891be419f7618b15e1d635130b5964dfcbcce27d9d65359c37196ac24e612"} err="failed to get container status \"457891be419f7618b15e1d635130b5964dfcbcce27d9d65359c37196ac24e612\": rpc error: code = NotFound desc = could not find container \"457891be419f7618b15e1d635130b5964dfcbcce27d9d65359c37196ac24e612\": container with ID starting with 457891be419f7618b15e1d635130b5964dfcbcce27d9d65359c37196ac24e612 not found: ID does not exist"
Mar 17 00:25:50 crc kubenswrapper[4755]: I0317 00:25:50.119000 4755 scope.go:117] "RemoveContainer" containerID="f3f9ca88d6951800e90cf2ff5c01652c87a43730b5aefcb7866b79c95d95a7e7"
Mar 17 00:25:50 crc kubenswrapper[4755]: E0317 00:25:50.119345 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3f9ca88d6951800e90cf2ff5c01652c87a43730b5aefcb7866b79c95d95a7e7\": container with ID starting with f3f9ca88d6951800e90cf2ff5c01652c87a43730b5aefcb7866b79c95d95a7e7 not found: ID does not exist" containerID="f3f9ca88d6951800e90cf2ff5c01652c87a43730b5aefcb7866b79c95d95a7e7"
Mar 17 00:25:50 crc kubenswrapper[4755]: I0317 00:25:50.119420 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3f9ca88d6951800e90cf2ff5c01652c87a43730b5aefcb7866b79c95d95a7e7"} err="failed to get container status \"f3f9ca88d6951800e90cf2ff5c01652c87a43730b5aefcb7866b79c95d95a7e7\": rpc error: code = NotFound desc = could not find container \"f3f9ca88d6951800e90cf2ff5c01652c87a43730b5aefcb7866b79c95d95a7e7\": container with ID starting with f3f9ca88d6951800e90cf2ff5c01652c87a43730b5aefcb7866b79c95d95a7e7 not found: ID does not exist"
Mar 17 00:25:50 crc kubenswrapper[4755]: I0317 00:25:50.260537 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1089c661-4f7b-40e2-8549-983b06c1409a" path="/var/lib/kubelet/pods/1089c661-4f7b-40e2-8549-983b06c1409a/volumes"
Mar 17 00:25:50 crc kubenswrapper[4755]: I0317 00:25:50.261857 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a23b6da4-12f1-4104-93ba-dfc06d3572aa" path="/var/lib/kubelet/pods/a23b6da4-12f1-4104-93ba-dfc06d3572aa/volumes"
Mar 17 00:25:51 crc kubenswrapper[4755]: I0317 00:25:51.439992 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zxkzg"]
Mar 17 00:25:51 crc kubenswrapper[4755]: I0317 00:25:51.441174 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zxkzg" podUID="e3bffbce-4004-4c41-9711-28fa7700e14a" containerName="registry-server" containerID="cri-o://24437be73493e3fc138de62c617803b2a5b449fae6ad388bb50df9e86919f350" gracePeriod=2
Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.001643 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxkzg"
Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.041963 4755 generic.go:334] "Generic (PLEG): container finished" podID="e3bffbce-4004-4c41-9711-28fa7700e14a" containerID="24437be73493e3fc138de62c617803b2a5b449fae6ad388bb50df9e86919f350" exitCode=0
Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.042021 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxkzg"
Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.042031 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxkzg" event={"ID":"e3bffbce-4004-4c41-9711-28fa7700e14a","Type":"ContainerDied","Data":"24437be73493e3fc138de62c617803b2a5b449fae6ad388bb50df9e86919f350"}
Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.042499 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxkzg" event={"ID":"e3bffbce-4004-4c41-9711-28fa7700e14a","Type":"ContainerDied","Data":"62e35dd821f493aa17787d49afdaa279cbbc4f05306aea1edfca40ba3a436440"}
Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.042524 4755 scope.go:117] "RemoveContainer" containerID="24437be73493e3fc138de62c617803b2a5b449fae6ad388bb50df9e86919f350"
Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.061398 4755 scope.go:117] "RemoveContainer" containerID="286633d4989d2a5f213f668a60a949139bbe54d2ce642ae3350ffd2145edb2c9"
Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.093111 4755 scope.go:117] "RemoveContainer" containerID="571237d5b40d6643bbedf59fa6b8e627fe34a6110439b2b59b48569b70e69cbe"
Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.114477 4755 scope.go:117] "RemoveContainer" containerID="24437be73493e3fc138de62c617803b2a5b449fae6ad388bb50df9e86919f350"
Mar 17 00:25:52 crc kubenswrapper[4755]: E0317 00:25:52.114904 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24437be73493e3fc138de62c617803b2a5b449fae6ad388bb50df9e86919f350\": container with ID starting with 24437be73493e3fc138de62c617803b2a5b449fae6ad388bb50df9e86919f350 not found: ID does not exist" containerID="24437be73493e3fc138de62c617803b2a5b449fae6ad388bb50df9e86919f350"
Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.114970 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24437be73493e3fc138de62c617803b2a5b449fae6ad388bb50df9e86919f350"} err="failed to get container status \"24437be73493e3fc138de62c617803b2a5b449fae6ad388bb50df9e86919f350\": rpc error: code = NotFound desc = could not find container \"24437be73493e3fc138de62c617803b2a5b449fae6ad388bb50df9e86919f350\": container with ID starting with 24437be73493e3fc138de62c617803b2a5b449fae6ad388bb50df9e86919f350 not found: ID does not exist"
Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.115011 4755 scope.go:117] "RemoveContainer" containerID="286633d4989d2a5f213f668a60a949139bbe54d2ce642ae3350ffd2145edb2c9"
Mar 17 00:25:52 crc kubenswrapper[4755]: E0317 00:25:52.115312 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"286633d4989d2a5f213f668a60a949139bbe54d2ce642ae3350ffd2145edb2c9\": container with ID starting with
286633d4989d2a5f213f668a60a949139bbe54d2ce642ae3350ffd2145edb2c9 not found: ID does not exist" containerID="286633d4989d2a5f213f668a60a949139bbe54d2ce642ae3350ffd2145edb2c9" Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.115345 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"286633d4989d2a5f213f668a60a949139bbe54d2ce642ae3350ffd2145edb2c9"} err="failed to get container status \"286633d4989d2a5f213f668a60a949139bbe54d2ce642ae3350ffd2145edb2c9\": rpc error: code = NotFound desc = could not find container \"286633d4989d2a5f213f668a60a949139bbe54d2ce642ae3350ffd2145edb2c9\": container with ID starting with 286633d4989d2a5f213f668a60a949139bbe54d2ce642ae3350ffd2145edb2c9 not found: ID does not exist" Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.115369 4755 scope.go:117] "RemoveContainer" containerID="571237d5b40d6643bbedf59fa6b8e627fe34a6110439b2b59b48569b70e69cbe" Mar 17 00:25:52 crc kubenswrapper[4755]: E0317 00:25:52.115596 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"571237d5b40d6643bbedf59fa6b8e627fe34a6110439b2b59b48569b70e69cbe\": container with ID starting with 571237d5b40d6643bbedf59fa6b8e627fe34a6110439b2b59b48569b70e69cbe not found: ID does not exist" containerID="571237d5b40d6643bbedf59fa6b8e627fe34a6110439b2b59b48569b70e69cbe" Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.115623 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"571237d5b40d6643bbedf59fa6b8e627fe34a6110439b2b59b48569b70e69cbe"} err="failed to get container status \"571237d5b40d6643bbedf59fa6b8e627fe34a6110439b2b59b48569b70e69cbe\": rpc error: code = NotFound desc = could not find container \"571237d5b40d6643bbedf59fa6b8e627fe34a6110439b2b59b48569b70e69cbe\": container with ID starting with 571237d5b40d6643bbedf59fa6b8e627fe34a6110439b2b59b48569b70e69cbe not found: ID does not 
exist" Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.203165 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3bffbce-4004-4c41-9711-28fa7700e14a-catalog-content\") pod \"e3bffbce-4004-4c41-9711-28fa7700e14a\" (UID: \"e3bffbce-4004-4c41-9711-28fa7700e14a\") " Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.203292 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3bffbce-4004-4c41-9711-28fa7700e14a-utilities\") pod \"e3bffbce-4004-4c41-9711-28fa7700e14a\" (UID: \"e3bffbce-4004-4c41-9711-28fa7700e14a\") " Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.203353 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnxsp\" (UniqueName: \"kubernetes.io/projected/e3bffbce-4004-4c41-9711-28fa7700e14a-kube-api-access-vnxsp\") pod \"e3bffbce-4004-4c41-9711-28fa7700e14a\" (UID: \"e3bffbce-4004-4c41-9711-28fa7700e14a\") " Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.204657 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3bffbce-4004-4c41-9711-28fa7700e14a-utilities" (OuterVolumeSpecName: "utilities") pod "e3bffbce-4004-4c41-9711-28fa7700e14a" (UID: "e3bffbce-4004-4c41-9711-28fa7700e14a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.211995 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3bffbce-4004-4c41-9711-28fa7700e14a-kube-api-access-vnxsp" (OuterVolumeSpecName: "kube-api-access-vnxsp") pod "e3bffbce-4004-4c41-9711-28fa7700e14a" (UID: "e3bffbce-4004-4c41-9711-28fa7700e14a"). InnerVolumeSpecName "kube-api-access-vnxsp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.306042 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3bffbce-4004-4c41-9711-28fa7700e14a-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.308259 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnxsp\" (UniqueName: \"kubernetes.io/projected/e3bffbce-4004-4c41-9711-28fa7700e14a-kube-api-access-vnxsp\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.373282 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.417824 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3bffbce-4004-4c41-9711-28fa7700e14a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3bffbce-4004-4c41-9711-28fa7700e14a" (UID: "e3bffbce-4004-4c41-9711-28fa7700e14a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.511283 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3bffbce-4004-4c41-9711-28fa7700e14a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.694401 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zxkzg"] Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.698983 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zxkzg"] Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.975512 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d784bd974-96gbc"] Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.975977 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc" podUID="b43ce753-6964-468d-b045-09b2e0135dcf" containerName="controller-manager" containerID="cri-o://a8c59da7fab9f7c758a963d63490194d52f9256fd2d860bd5316cfdd4a368258" gracePeriod=30 Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.996947 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25"] Mar 17 00:25:52 crc kubenswrapper[4755]: I0317 00:25:52.997512 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" podUID="f54c1112-6e1f-4c17-b207-4935524c6add" containerName="route-controller-manager" containerID="cri-o://7e4a3737765fc1c64dbd5752c4386be9375fde745252ee7b0861a07d2f7283e8" gracePeriod=30 Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.494483 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.503671 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.629393 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fpbh\" (UniqueName: \"kubernetes.io/projected/b43ce753-6964-468d-b045-09b2e0135dcf-kube-api-access-9fpbh\") pod \"b43ce753-6964-468d-b045-09b2e0135dcf\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.629712 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f54c1112-6e1f-4c17-b207-4935524c6add-serving-cert\") pod \"f54c1112-6e1f-4c17-b207-4935524c6add\" (UID: \"f54c1112-6e1f-4c17-b207-4935524c6add\") " Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.629777 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f54c1112-6e1f-4c17-b207-4935524c6add-client-ca\") pod \"f54c1112-6e1f-4c17-b207-4935524c6add\" (UID: \"f54c1112-6e1f-4c17-b207-4935524c6add\") " Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.629803 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-config\") pod \"b43ce753-6964-468d-b045-09b2e0135dcf\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.629830 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-proxy-ca-bundles\") pod 
\"b43ce753-6964-468d-b045-09b2e0135dcf\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.629883 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrff9\" (UniqueName: \"kubernetes.io/projected/f54c1112-6e1f-4c17-b207-4935524c6add-kube-api-access-qrff9\") pod \"f54c1112-6e1f-4c17-b207-4935524c6add\" (UID: \"f54c1112-6e1f-4c17-b207-4935524c6add\") " Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.629906 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b43ce753-6964-468d-b045-09b2e0135dcf-serving-cert\") pod \"b43ce753-6964-468d-b045-09b2e0135dcf\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.629967 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-client-ca\") pod \"b43ce753-6964-468d-b045-09b2e0135dcf\" (UID: \"b43ce753-6964-468d-b045-09b2e0135dcf\") " Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.629995 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f54c1112-6e1f-4c17-b207-4935524c6add-config\") pod \"f54c1112-6e1f-4c17-b207-4935524c6add\" (UID: \"f54c1112-6e1f-4c17-b207-4935524c6add\") " Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.630907 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f54c1112-6e1f-4c17-b207-4935524c6add-config" (OuterVolumeSpecName: "config") pod "f54c1112-6e1f-4c17-b207-4935524c6add" (UID: "f54c1112-6e1f-4c17-b207-4935524c6add"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.631506 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b43ce753-6964-468d-b045-09b2e0135dcf" (UID: "b43ce753-6964-468d-b045-09b2e0135dcf"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.631521 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-client-ca" (OuterVolumeSpecName: "client-ca") pod "b43ce753-6964-468d-b045-09b2e0135dcf" (UID: "b43ce753-6964-468d-b045-09b2e0135dcf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.632198 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f54c1112-6e1f-4c17-b207-4935524c6add-client-ca" (OuterVolumeSpecName: "client-ca") pod "f54c1112-6e1f-4c17-b207-4935524c6add" (UID: "f54c1112-6e1f-4c17-b207-4935524c6add"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.632811 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-config" (OuterVolumeSpecName: "config") pod "b43ce753-6964-468d-b045-09b2e0135dcf" (UID: "b43ce753-6964-468d-b045-09b2e0135dcf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.634924 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b43ce753-6964-468d-b045-09b2e0135dcf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b43ce753-6964-468d-b045-09b2e0135dcf" (UID: "b43ce753-6964-468d-b045-09b2e0135dcf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.635206 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b43ce753-6964-468d-b045-09b2e0135dcf-kube-api-access-9fpbh" (OuterVolumeSpecName: "kube-api-access-9fpbh") pod "b43ce753-6964-468d-b045-09b2e0135dcf" (UID: "b43ce753-6964-468d-b045-09b2e0135dcf"). InnerVolumeSpecName "kube-api-access-9fpbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.635994 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54c1112-6e1f-4c17-b207-4935524c6add-kube-api-access-qrff9" (OuterVolumeSpecName: "kube-api-access-qrff9") pod "f54c1112-6e1f-4c17-b207-4935524c6add" (UID: "f54c1112-6e1f-4c17-b207-4935524c6add"). InnerVolumeSpecName "kube-api-access-qrff9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.641352 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54c1112-6e1f-4c17-b207-4935524c6add-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f54c1112-6e1f-4c17-b207-4935524c6add" (UID: "f54c1112-6e1f-4c17-b207-4935524c6add"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.731849 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f54c1112-6e1f-4c17-b207-4935524c6add-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.731898 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.731916 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.731931 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrff9\" (UniqueName: \"kubernetes.io/projected/f54c1112-6e1f-4c17-b207-4935524c6add-kube-api-access-qrff9\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.731943 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b43ce753-6964-468d-b045-09b2e0135dcf-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.731958 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b43ce753-6964-468d-b045-09b2e0135dcf-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.731973 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f54c1112-6e1f-4c17-b207-4935524c6add-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.731989 4755 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-9fpbh\" (UniqueName: \"kubernetes.io/projected/b43ce753-6964-468d-b045-09b2e0135dcf-kube-api-access-9fpbh\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:53 crc kubenswrapper[4755]: I0317 00:25:53.732004 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f54c1112-6e1f-4c17-b207-4935524c6add-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.057904 4755 generic.go:334] "Generic (PLEG): container finished" podID="b43ce753-6964-468d-b045-09b2e0135dcf" containerID="a8c59da7fab9f7c758a963d63490194d52f9256fd2d860bd5316cfdd4a368258" exitCode=0 Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.058076 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.058667 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc" event={"ID":"b43ce753-6964-468d-b045-09b2e0135dcf","Type":"ContainerDied","Data":"a8c59da7fab9f7c758a963d63490194d52f9256fd2d860bd5316cfdd4a368258"} Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.059060 4755 scope.go:117] "RemoveContainer" containerID="a8c59da7fab9f7c758a963d63490194d52f9256fd2d860bd5316cfdd4a368258" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.061031 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d784bd974-96gbc" event={"ID":"b43ce753-6964-468d-b045-09b2e0135dcf","Type":"ContainerDied","Data":"4d8be0f4f5428c666acfbd118a283482c280baff9dd82fa296c914287bfc8b2c"} Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.062787 4755 generic.go:334] "Generic (PLEG): container finished" podID="f54c1112-6e1f-4c17-b207-4935524c6add" 
containerID="7e4a3737765fc1c64dbd5752c4386be9375fde745252ee7b0861a07d2f7283e8" exitCode=0 Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.062841 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" event={"ID":"f54c1112-6e1f-4c17-b207-4935524c6add","Type":"ContainerDied","Data":"7e4a3737765fc1c64dbd5752c4386be9375fde745252ee7b0861a07d2f7283e8"} Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.062874 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" event={"ID":"f54c1112-6e1f-4c17-b207-4935524c6add","Type":"ContainerDied","Data":"bc780f4e8ccae6591963fee700946dc07533303c4d70e122d58b765e404e69ce"} Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.062966 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.080045 4755 scope.go:117] "RemoveContainer" containerID="a8c59da7fab9f7c758a963d63490194d52f9256fd2d860bd5316cfdd4a368258" Mar 17 00:25:54 crc kubenswrapper[4755]: E0317 00:25:54.081750 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8c59da7fab9f7c758a963d63490194d52f9256fd2d860bd5316cfdd4a368258\": container with ID starting with a8c59da7fab9f7c758a963d63490194d52f9256fd2d860bd5316cfdd4a368258 not found: ID does not exist" containerID="a8c59da7fab9f7c758a963d63490194d52f9256fd2d860bd5316cfdd4a368258" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.081802 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8c59da7fab9f7c758a963d63490194d52f9256fd2d860bd5316cfdd4a368258"} err="failed to get container status \"a8c59da7fab9f7c758a963d63490194d52f9256fd2d860bd5316cfdd4a368258\": 
rpc error: code = NotFound desc = could not find container \"a8c59da7fab9f7c758a963d63490194d52f9256fd2d860bd5316cfdd4a368258\": container with ID starting with a8c59da7fab9f7c758a963d63490194d52f9256fd2d860bd5316cfdd4a368258 not found: ID does not exist" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.081834 4755 scope.go:117] "RemoveContainer" containerID="7e4a3737765fc1c64dbd5752c4386be9375fde745252ee7b0861a07d2f7283e8" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.100747 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d784bd974-96gbc"] Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.108281 4755 scope.go:117] "RemoveContainer" containerID="7e4a3737765fc1c64dbd5752c4386be9375fde745252ee7b0861a07d2f7283e8" Mar 17 00:25:54 crc kubenswrapper[4755]: E0317 00:25:54.109023 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e4a3737765fc1c64dbd5752c4386be9375fde745252ee7b0861a07d2f7283e8\": container with ID starting with 7e4a3737765fc1c64dbd5752c4386be9375fde745252ee7b0861a07d2f7283e8 not found: ID does not exist" containerID="7e4a3737765fc1c64dbd5752c4386be9375fde745252ee7b0861a07d2f7283e8" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.109097 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e4a3737765fc1c64dbd5752c4386be9375fde745252ee7b0861a07d2f7283e8"} err="failed to get container status \"7e4a3737765fc1c64dbd5752c4386be9375fde745252ee7b0861a07d2f7283e8\": rpc error: code = NotFound desc = could not find container \"7e4a3737765fc1c64dbd5752c4386be9375fde745252ee7b0861a07d2f7283e8\": container with ID starting with 7e4a3737765fc1c64dbd5752c4386be9375fde745252ee7b0861a07d2f7283e8 not found: ID does not exist" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.109863 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-5d784bd974-96gbc"] Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.118933 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25"] Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.124588 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659697f79c-5zj25"] Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.259300 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b43ce753-6964-468d-b045-09b2e0135dcf" path="/var/lib/kubelet/pods/b43ce753-6964-468d-b045-09b2e0135dcf/volumes" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.260280 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3bffbce-4004-4c41-9711-28fa7700e14a" path="/var/lib/kubelet/pods/e3bffbce-4004-4c41-9711-28fa7700e14a/volumes" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.262698 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f54c1112-6e1f-4c17-b207-4935524c6add" path="/var/lib/kubelet/pods/f54c1112-6e1f-4c17-b207-4935524c6add/volumes" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.764051 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7858c5986-lc28s"] Mar 17 00:25:54 crc kubenswrapper[4755]: E0317 00:25:54.764364 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f54c1112-6e1f-4c17-b207-4935524c6add" containerName="route-controller-manager" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.764384 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54c1112-6e1f-4c17-b207-4935524c6add" containerName="route-controller-manager" Mar 17 00:25:54 crc kubenswrapper[4755]: E0317 00:25:54.764400 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b43ce753-6964-468d-b045-09b2e0135dcf" containerName="controller-manager" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.764409 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43ce753-6964-468d-b045-09b2e0135dcf" containerName="controller-manager" Mar 17 00:25:54 crc kubenswrapper[4755]: E0317 00:25:54.764419 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23b6da4-12f1-4104-93ba-dfc06d3572aa" containerName="registry-server" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.764427 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23b6da4-12f1-4104-93ba-dfc06d3572aa" containerName="registry-server" Mar 17 00:25:54 crc kubenswrapper[4755]: E0317 00:25:54.764465 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3bffbce-4004-4c41-9711-28fa7700e14a" containerName="extract-content" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.764478 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3bffbce-4004-4c41-9711-28fa7700e14a" containerName="extract-content" Mar 17 00:25:54 crc kubenswrapper[4755]: E0317 00:25:54.764496 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1089c661-4f7b-40e2-8549-983b06c1409a" containerName="extract-content" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.764507 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1089c661-4f7b-40e2-8549-983b06c1409a" containerName="extract-content" Mar 17 00:25:54 crc kubenswrapper[4755]: E0317 00:25:54.764522 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23b6da4-12f1-4104-93ba-dfc06d3572aa" containerName="extract-utilities" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.764530 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23b6da4-12f1-4104-93ba-dfc06d3572aa" containerName="extract-utilities" Mar 17 00:25:54 crc kubenswrapper[4755]: E0317 00:25:54.764539 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1089c661-4f7b-40e2-8549-983b06c1409a" containerName="registry-server" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.764548 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1089c661-4f7b-40e2-8549-983b06c1409a" containerName="registry-server" Mar 17 00:25:54 crc kubenswrapper[4755]: E0317 00:25:54.764561 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1089c661-4f7b-40e2-8549-983b06c1409a" containerName="extract-utilities" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.764569 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1089c661-4f7b-40e2-8549-983b06c1409a" containerName="extract-utilities" Mar 17 00:25:54 crc kubenswrapper[4755]: E0317 00:25:54.764580 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3bffbce-4004-4c41-9711-28fa7700e14a" containerName="extract-utilities" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.764588 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3bffbce-4004-4c41-9711-28fa7700e14a" containerName="extract-utilities" Mar 17 00:25:54 crc kubenswrapper[4755]: E0317 00:25:54.764599 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3bffbce-4004-4c41-9711-28fa7700e14a" containerName="registry-server" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.764606 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3bffbce-4004-4c41-9711-28fa7700e14a" containerName="registry-server" Mar 17 00:25:54 crc kubenswrapper[4755]: E0317 00:25:54.764618 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23b6da4-12f1-4104-93ba-dfc06d3572aa" containerName="extract-content" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.764626 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23b6da4-12f1-4104-93ba-dfc06d3572aa" containerName="extract-content" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.764762 4755 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1089c661-4f7b-40e2-8549-983b06c1409a" containerName="registry-server" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.764783 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f54c1112-6e1f-4c17-b207-4935524c6add" containerName="route-controller-manager" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.764795 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a23b6da4-12f1-4104-93ba-dfc06d3572aa" containerName="registry-server" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.764805 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b43ce753-6964-468d-b045-09b2e0135dcf" containerName="controller-manager" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.764822 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3bffbce-4004-4c41-9711-28fa7700e14a" containerName="registry-server" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.765270 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.767092 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm"] Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.767225 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.767883 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.770106 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.770349 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.770932 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.771502 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.771635 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.772601 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.772671 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.772950 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.773065 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.773111 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 17 
00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.773125 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.780875 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7858c5986-lc28s"] Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.783365 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.793414 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm"] Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.845865 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfnzh\" (UniqueName: \"kubernetes.io/projected/fe174760-ba87-465e-8a99-77bd8fab4181-kube-api-access-nfnzh\") pod \"controller-manager-7858c5986-lc28s\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.845910 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-proxy-ca-bundles\") pod \"controller-manager-7858c5986-lc28s\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.845942 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-config\") pod \"controller-manager-7858c5986-lc28s\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " 
pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.845994 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe174760-ba87-465e-8a99-77bd8fab4181-serving-cert\") pod \"controller-manager-7858c5986-lc28s\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.846047 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-client-ca\") pod \"controller-manager-7858c5986-lc28s\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.947585 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-client-ca\") pod \"controller-manager-7858c5986-lc28s\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.947675 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfnzh\" (UniqueName: \"kubernetes.io/projected/fe174760-ba87-465e-8a99-77bd8fab4181-kube-api-access-nfnzh\") pod \"controller-manager-7858c5986-lc28s\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.947713 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-proxy-ca-bundles\") pod \"controller-manager-7858c5986-lc28s\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.947764 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b85993df-62fb-4b25-9290-d9e7820a87ae-serving-cert\") pod \"route-controller-manager-7c8f6c96-s4rvm\" (UID: \"b85993df-62fb-4b25-9290-d9e7820a87ae\") " pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.947807 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-config\") pod \"controller-manager-7858c5986-lc28s\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.947861 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwxsj\" (UniqueName: \"kubernetes.io/projected/b85993df-62fb-4b25-9290-d9e7820a87ae-kube-api-access-fwxsj\") pod \"route-controller-manager-7c8f6c96-s4rvm\" (UID: \"b85993df-62fb-4b25-9290-d9e7820a87ae\") " pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.947906 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85993df-62fb-4b25-9290-d9e7820a87ae-config\") pod \"route-controller-manager-7c8f6c96-s4rvm\" (UID: \"b85993df-62fb-4b25-9290-d9e7820a87ae\") " 
pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.947968 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe174760-ba87-465e-8a99-77bd8fab4181-serving-cert\") pod \"controller-manager-7858c5986-lc28s\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.948032 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b85993df-62fb-4b25-9290-d9e7820a87ae-client-ca\") pod \"route-controller-manager-7c8f6c96-s4rvm\" (UID: \"b85993df-62fb-4b25-9290-d9e7820a87ae\") " pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.949267 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-client-ca\") pod \"controller-manager-7858c5986-lc28s\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.949677 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-config\") pod \"controller-manager-7858c5986-lc28s\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.950120 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-proxy-ca-bundles\") pod 
\"controller-manager-7858c5986-lc28s\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.953303 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe174760-ba87-465e-8a99-77bd8fab4181-serving-cert\") pod \"controller-manager-7858c5986-lc28s\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:54 crc kubenswrapper[4755]: I0317 00:25:54.984119 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfnzh\" (UniqueName: \"kubernetes.io/projected/fe174760-ba87-465e-8a99-77bd8fab4181-kube-api-access-nfnzh\") pod \"controller-manager-7858c5986-lc28s\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:55 crc kubenswrapper[4755]: I0317 00:25:55.049242 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b85993df-62fb-4b25-9290-d9e7820a87ae-serving-cert\") pod \"route-controller-manager-7c8f6c96-s4rvm\" (UID: \"b85993df-62fb-4b25-9290-d9e7820a87ae\") " pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:25:55 crc kubenswrapper[4755]: I0317 00:25:55.049330 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwxsj\" (UniqueName: \"kubernetes.io/projected/b85993df-62fb-4b25-9290-d9e7820a87ae-kube-api-access-fwxsj\") pod \"route-controller-manager-7c8f6c96-s4rvm\" (UID: \"b85993df-62fb-4b25-9290-d9e7820a87ae\") " pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:25:55 crc kubenswrapper[4755]: I0317 00:25:55.049368 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85993df-62fb-4b25-9290-d9e7820a87ae-config\") pod \"route-controller-manager-7c8f6c96-s4rvm\" (UID: \"b85993df-62fb-4b25-9290-d9e7820a87ae\") " pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:25:55 crc kubenswrapper[4755]: I0317 00:25:55.049466 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b85993df-62fb-4b25-9290-d9e7820a87ae-client-ca\") pod \"route-controller-manager-7c8f6c96-s4rvm\" (UID: \"b85993df-62fb-4b25-9290-d9e7820a87ae\") " pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:25:55 crc kubenswrapper[4755]: I0317 00:25:55.050538 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b85993df-62fb-4b25-9290-d9e7820a87ae-client-ca\") pod \"route-controller-manager-7c8f6c96-s4rvm\" (UID: \"b85993df-62fb-4b25-9290-d9e7820a87ae\") " pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:25:55 crc kubenswrapper[4755]: I0317 00:25:55.051030 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85993df-62fb-4b25-9290-d9e7820a87ae-config\") pod \"route-controller-manager-7c8f6c96-s4rvm\" (UID: \"b85993df-62fb-4b25-9290-d9e7820a87ae\") " pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:25:55 crc kubenswrapper[4755]: I0317 00:25:55.053710 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b85993df-62fb-4b25-9290-d9e7820a87ae-serving-cert\") pod \"route-controller-manager-7c8f6c96-s4rvm\" (UID: \"b85993df-62fb-4b25-9290-d9e7820a87ae\") " pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:25:55 crc 
kubenswrapper[4755]: I0317 00:25:55.080809 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwxsj\" (UniqueName: \"kubernetes.io/projected/b85993df-62fb-4b25-9290-d9e7820a87ae-kube-api-access-fwxsj\") pod \"route-controller-manager-7c8f6c96-s4rvm\" (UID: \"b85993df-62fb-4b25-9290-d9e7820a87ae\") " pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:25:55 crc kubenswrapper[4755]: I0317 00:25:55.082389 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:55 crc kubenswrapper[4755]: I0317 00:25:55.088757 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:25:55 crc kubenswrapper[4755]: I0317 00:25:55.284783 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7858c5986-lc28s"] Mar 17 00:25:55 crc kubenswrapper[4755]: I0317 00:25:55.563250 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm"] Mar 17 00:25:55 crc kubenswrapper[4755]: W0317 00:25:55.568098 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb85993df_62fb_4b25_9290_d9e7820a87ae.slice/crio-15de41ffe20e99300b25af16d742988a2be980685fa5f2b640c352316d0221fe WatchSource:0}: Error finding container 15de41ffe20e99300b25af16d742988a2be980685fa5f2b640c352316d0221fe: Status 404 returned error can't find the container with id 15de41ffe20e99300b25af16d742988a2be980685fa5f2b640c352316d0221fe Mar 17 00:25:56 crc kubenswrapper[4755]: I0317 00:25:56.084038 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" 
event={"ID":"b85993df-62fb-4b25-9290-d9e7820a87ae","Type":"ContainerStarted","Data":"75041fbcb5056499025bd63707c7362af7702b597d67244d6e49a1b80a33c0bc"} Mar 17 00:25:56 crc kubenswrapper[4755]: I0317 00:25:56.084117 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:25:56 crc kubenswrapper[4755]: I0317 00:25:56.084131 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" event={"ID":"b85993df-62fb-4b25-9290-d9e7820a87ae","Type":"ContainerStarted","Data":"15de41ffe20e99300b25af16d742988a2be980685fa5f2b640c352316d0221fe"} Mar 17 00:25:56 crc kubenswrapper[4755]: I0317 00:25:56.085412 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" event={"ID":"fe174760-ba87-465e-8a99-77bd8fab4181","Type":"ContainerStarted","Data":"a31c80f4abe14d68c38307d9a6bfea307fd445e6d6cca6ff279a64ad70d7ed6d"} Mar 17 00:25:56 crc kubenswrapper[4755]: I0317 00:25:56.085497 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" event={"ID":"fe174760-ba87-465e-8a99-77bd8fab4181","Type":"ContainerStarted","Data":"4816c46c4dbde48ee4e1d66f33757645dd21083e1e83f20748814ff88887988c"} Mar 17 00:25:56 crc kubenswrapper[4755]: I0317 00:25:56.085710 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:56 crc kubenswrapper[4755]: I0317 00:25:56.095914 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:25:56 crc kubenswrapper[4755]: I0317 00:25:56.119949 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" podStartSLOduration=3.11992162 podStartE2EDuration="3.11992162s" podCreationTimestamp="2026-03-17 00:25:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:25:56.116227632 +0000 UTC m=+230.875679955" watchObservedRunningTime="2026-03-17 00:25:56.11992162 +0000 UTC m=+230.879373943" Mar 17 00:25:56 crc kubenswrapper[4755]: I0317 00:25:56.141249 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" podStartSLOduration=4.141222978 podStartE2EDuration="4.141222978s" podCreationTimestamp="2026-03-17 00:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:25:56.137322166 +0000 UTC m=+230.896774459" watchObservedRunningTime="2026-03-17 00:25:56.141222978 +0000 UTC m=+230.900675271" Mar 17 00:25:56 crc kubenswrapper[4755]: I0317 00:25:56.192373 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:25:56 crc kubenswrapper[4755]: I0317 00:25:56.796800 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-grcxd"] Mar 17 00:26:00 crc kubenswrapper[4755]: I0317 00:26:00.149931 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561786-84lbh"] Mar 17 00:26:00 crc kubenswrapper[4755]: I0317 00:26:00.151578 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561786-84lbh" Mar 17 00:26:00 crc kubenswrapper[4755]: I0317 00:26:00.153421 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 00:26:00 crc kubenswrapper[4755]: I0317 00:26:00.154603 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 00:26:00 crc kubenswrapper[4755]: I0317 00:26:00.154837 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 00:26:00 crc kubenswrapper[4755]: I0317 00:26:00.162890 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561786-84lbh"] Mar 17 00:26:00 crc kubenswrapper[4755]: I0317 00:26:00.325820 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wkg4\" (UniqueName: \"kubernetes.io/projected/a73f0adb-2ef2-4e25-91ba-f29aa35939bf-kube-api-access-6wkg4\") pod \"auto-csr-approver-29561786-84lbh\" (UID: \"a73f0adb-2ef2-4e25-91ba-f29aa35939bf\") " pod="openshift-infra/auto-csr-approver-29561786-84lbh" Mar 17 00:26:00 crc kubenswrapper[4755]: I0317 00:26:00.427515 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wkg4\" (UniqueName: \"kubernetes.io/projected/a73f0adb-2ef2-4e25-91ba-f29aa35939bf-kube-api-access-6wkg4\") pod \"auto-csr-approver-29561786-84lbh\" (UID: \"a73f0adb-2ef2-4e25-91ba-f29aa35939bf\") " pod="openshift-infra/auto-csr-approver-29561786-84lbh" Mar 17 00:26:00 crc kubenswrapper[4755]: I0317 00:26:00.455366 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wkg4\" (UniqueName: \"kubernetes.io/projected/a73f0adb-2ef2-4e25-91ba-f29aa35939bf-kube-api-access-6wkg4\") pod \"auto-csr-approver-29561786-84lbh\" (UID: \"a73f0adb-2ef2-4e25-91ba-f29aa35939bf\") " 
pod="openshift-infra/auto-csr-approver-29561786-84lbh" Mar 17 00:26:00 crc kubenswrapper[4755]: I0317 00:26:00.493244 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561786-84lbh" Mar 17 00:26:00 crc kubenswrapper[4755]: I0317 00:26:00.998299 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561786-84lbh"] Mar 17 00:26:01 crc kubenswrapper[4755]: W0317 00:26:01.009762 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda73f0adb_2ef2_4e25_91ba_f29aa35939bf.slice/crio-4e6dbd7e32c5c97dcd3b7f86aa5b96817939852c84ef116b621a91da5f001877 WatchSource:0}: Error finding container 4e6dbd7e32c5c97dcd3b7f86aa5b96817939852c84ef116b621a91da5f001877: Status 404 returned error can't find the container with id 4e6dbd7e32c5c97dcd3b7f86aa5b96817939852c84ef116b621a91da5f001877 Mar 17 00:26:01 crc kubenswrapper[4755]: I0317 00:26:01.117306 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561786-84lbh" event={"ID":"a73f0adb-2ef2-4e25-91ba-f29aa35939bf","Type":"ContainerStarted","Data":"4e6dbd7e32c5c97dcd3b7f86aa5b96817939852c84ef116b621a91da5f001877"} Mar 17 00:26:03 crc kubenswrapper[4755]: I0317 00:26:03.129844 4755 generic.go:334] "Generic (PLEG): container finished" podID="a73f0adb-2ef2-4e25-91ba-f29aa35939bf" containerID="354d4e1b72a1530e0c66af0cb2a4ee896017daf25f188f1f009ef30c226bda25" exitCode=0 Mar 17 00:26:03 crc kubenswrapper[4755]: I0317 00:26:03.130102 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561786-84lbh" event={"ID":"a73f0adb-2ef2-4e25-91ba-f29aa35939bf","Type":"ContainerDied","Data":"354d4e1b72a1530e0c66af0cb2a4ee896017daf25f188f1f009ef30c226bda25"} Mar 17 00:26:04 crc kubenswrapper[4755]: I0317 00:26:04.525147 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561786-84lbh" Mar 17 00:26:04 crc kubenswrapper[4755]: I0317 00:26:04.689270 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wkg4\" (UniqueName: \"kubernetes.io/projected/a73f0adb-2ef2-4e25-91ba-f29aa35939bf-kube-api-access-6wkg4\") pod \"a73f0adb-2ef2-4e25-91ba-f29aa35939bf\" (UID: \"a73f0adb-2ef2-4e25-91ba-f29aa35939bf\") " Mar 17 00:26:04 crc kubenswrapper[4755]: I0317 00:26:04.695932 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a73f0adb-2ef2-4e25-91ba-f29aa35939bf-kube-api-access-6wkg4" (OuterVolumeSpecName: "kube-api-access-6wkg4") pod "a73f0adb-2ef2-4e25-91ba-f29aa35939bf" (UID: "a73f0adb-2ef2-4e25-91ba-f29aa35939bf"). InnerVolumeSpecName "kube-api-access-6wkg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:26:04 crc kubenswrapper[4755]: I0317 00:26:04.790702 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wkg4\" (UniqueName: \"kubernetes.io/projected/a73f0adb-2ef2-4e25-91ba-f29aa35939bf-kube-api-access-6wkg4\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:05 crc kubenswrapper[4755]: I0317 00:26:05.147212 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561786-84lbh" event={"ID":"a73f0adb-2ef2-4e25-91ba-f29aa35939bf","Type":"ContainerDied","Data":"4e6dbd7e32c5c97dcd3b7f86aa5b96817939852c84ef116b621a91da5f001877"} Mar 17 00:26:05 crc kubenswrapper[4755]: I0317 00:26:05.147253 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e6dbd7e32c5c97dcd3b7f86aa5b96817939852c84ef116b621a91da5f001877" Mar 17 00:26:05 crc kubenswrapper[4755]: I0317 00:26:05.147281 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561786-84lbh" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.108346 4755 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 17 00:26:11 crc kubenswrapper[4755]: E0317 00:26:11.109143 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a73f0adb-2ef2-4e25-91ba-f29aa35939bf" containerName="oc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.109158 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73f0adb-2ef2-4e25-91ba-f29aa35939bf" containerName="oc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.109333 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a73f0adb-2ef2-4e25-91ba-f29aa35939bf" containerName="oc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.109712 4755 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.110003 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d" gracePeriod=15 Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.110128 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938" gracePeriod=15 Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.110142 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.110150 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4f70d40b409b557ad33856454d01508d4fc1c0a77374c39afdd2ad38aca0a53a" gracePeriod=15 Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.110072 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3" gracePeriod=15 Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.110113 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8" gracePeriod=15 Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111151 4755 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 17 00:26:11 crc kubenswrapper[4755]: E0317 00:26:11.111501 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111520 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 17 00:26:11 crc kubenswrapper[4755]: E0317 00:26:11.111543 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111553 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: E0317 00:26:11.111563 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111573 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: E0317 00:26:11.111585 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111594 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 17 00:26:11 crc kubenswrapper[4755]: E0317 00:26:11.111606 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111616 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 17 00:26:11 crc kubenswrapper[4755]: E0317 00:26:11.111629 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111639 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: E0317 00:26:11.111655 4755 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111665 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 17 00:26:11 crc kubenswrapper[4755]: E0317 00:26:11.111677 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111690 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: E0317 00:26:11.111701 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111710 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: E0317 00:26:11.111727 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111736 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: E0317 00:26:11.111752 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111762 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" 
Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111914 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111929 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111940 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111953 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111968 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111977 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111988 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.111997 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.112211 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.112461 4755 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.149785 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.282099 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.282151 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.282261 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.282292 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.282361 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.282468 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.282510 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.282536 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.384002 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 
00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.384066 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.384084 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.384121 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.384148 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.384186 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.384214 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.384206 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.384240 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.384268 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.384307 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.384308 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.384316 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.384267 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.384361 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.384813 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: I0317 00:26:11.444592 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:26:11 crc kubenswrapper[4755]: W0317 00:26:11.473239 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-5761267dd9adb962b14ba08894f395a1c848a057e7ee1b744b6278d3feb5e1b3 WatchSource:0}: Error finding container 5761267dd9adb962b14ba08894f395a1c848a057e7ee1b744b6278d3feb5e1b3: Status 404 returned error can't find the container with id 5761267dd9adb962b14ba08894f395a1c848a057e7ee1b744b6278d3feb5e1b3 Mar 17 00:26:11 crc kubenswrapper[4755]: E0317 00:26:11.479178 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.32:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d79466386865d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:26:11.478275677 +0000 UTC m=+246.237727960,LastTimestamp:2026-03-17 00:26:11.478275677 +0000 UTC m=+246.237727960,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:26:12 crc kubenswrapper[4755]: I0317 00:26:12.199141 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" containerID="7dbddef2946342f733c6dfc2a879514f48551c9fcc9c66478931b938b8887107" exitCode=0 Mar 17 00:26:12 crc kubenswrapper[4755]: I0317 00:26:12.199216 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46","Type":"ContainerDied","Data":"7dbddef2946342f733c6dfc2a879514f48551c9fcc9c66478931b938b8887107"} Mar 17 00:26:12 crc kubenswrapper[4755]: I0317 00:26:12.200085 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:12 crc kubenswrapper[4755]: I0317 00:26:12.200262 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:12 crc kubenswrapper[4755]: I0317 00:26:12.200525 4755 status_manager.go:851] "Failed to get status for pod" podUID="3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:12 crc kubenswrapper[4755]: I0317 00:26:12.201705 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"954dd0604dee8a306459868cde4956c58a8be7b07dc761671f6bba25848500ac"} Mar 17 00:26:12 crc 
kubenswrapper[4755]: I0317 00:26:12.201745 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5761267dd9adb962b14ba08894f395a1c848a057e7ee1b744b6278d3feb5e1b3"} Mar 17 00:26:12 crc kubenswrapper[4755]: I0317 00:26:12.202872 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:12 crc kubenswrapper[4755]: I0317 00:26:12.203096 4755 status_manager.go:851] "Failed to get status for pod" podUID="3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:12 crc kubenswrapper[4755]: I0317 00:26:12.203336 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:12 crc kubenswrapper[4755]: I0317 00:26:12.205161 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 17 00:26:12 crc kubenswrapper[4755]: I0317 00:26:12.206834 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 17 00:26:12 crc 
kubenswrapper[4755]: I0317 00:26:12.208088 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4f70d40b409b557ad33856454d01508d4fc1c0a77374c39afdd2ad38aca0a53a" exitCode=0 Mar 17 00:26:12 crc kubenswrapper[4755]: I0317 00:26:12.208115 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3" exitCode=0 Mar 17 00:26:12 crc kubenswrapper[4755]: I0317 00:26:12.208125 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8" exitCode=0 Mar 17 00:26:12 crc kubenswrapper[4755]: I0317 00:26:12.208136 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938" exitCode=2 Mar 17 00:26:12 crc kubenswrapper[4755]: I0317 00:26:12.208172 4755 scope.go:117] "RemoveContainer" containerID="f6e1924f72466c5a715b647b65c604a73f439db758de92b9300bc0ba3248dcc1" Mar 17 00:26:12 crc kubenswrapper[4755]: E0317 00:26:12.350632 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.32:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d79466386865d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:26:11.478275677 +0000 UTC m=+246.237727960,LastTimestamp:2026-03-17 00:26:11.478275677 +0000 UTC m=+246.237727960,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.229309 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.578975 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.579722 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.580526 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.580696 4755 status_manager.go:851] "Failed to get status for pod" podUID="3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.580880 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.624971 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.625061 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.625088 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.625244 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.625266 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.625271 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.714227 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.715588 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.716389 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.717338 4755 status_manager.go:851] "Failed to get status for pod" podUID="3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.726218 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-kube-api-access\") pod \"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46\" (UID: \"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46\") " Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.726328 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-var-lock\") pod \"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46\" (UID: \"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46\") " Mar 17 00:26:13 crc kubenswrapper[4755]: 
I0317 00:26:13.726380 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-kubelet-dir\") pod \"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46\" (UID: \"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46\") " Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.726425 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" (UID: "3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.726496 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-var-lock" (OuterVolumeSpecName: "var-lock") pod "3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" (UID: "3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.726823 4755 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.726856 4755 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.726873 4755 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-var-lock\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.726889 4755 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.726904 4755 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.734904 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" (UID: "3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:26:13 crc kubenswrapper[4755]: I0317 00:26:13.827994 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.261946 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.267766 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46","Type":"ContainerDied","Data":"2126d963ef6348b8103b3af3071ca19a53f0cf0307020f885e88751520863cc5"} Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.267835 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2126d963ef6348b8103b3af3071ca19a53f0cf0307020f885e88751520863cc5" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.271238 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.273139 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.275143 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d" exitCode=0 Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.275204 4755 scope.go:117] "RemoveContainer" containerID="4f70d40b409b557ad33856454d01508d4fc1c0a77374c39afdd2ad38aca0a53a" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.275368 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.276210 4755 status_manager.go:851] "Failed to get status for pod" podUID="3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.276674 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.277636 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.287542 4755 status_manager.go:851] "Failed to get status for pod" podUID="3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.287965 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.288729 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.298179 4755 scope.go:117] "RemoveContainer" containerID="dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.306138 4755 status_manager.go:851] "Failed to get status for pod" podUID="3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.306652 4755 status_manager.go:851] 
"Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.307120 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.327930 4755 scope.go:117] "RemoveContainer" containerID="c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.347872 4755 scope.go:117] "RemoveContainer" containerID="00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.363871 4755 scope.go:117] "RemoveContainer" containerID="740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.385933 4755 scope.go:117] "RemoveContainer" containerID="581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.404125 4755 scope.go:117] "RemoveContainer" containerID="4f70d40b409b557ad33856454d01508d4fc1c0a77374c39afdd2ad38aca0a53a" Mar 17 00:26:14 crc kubenswrapper[4755]: E0317 00:26:14.404571 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f70d40b409b557ad33856454d01508d4fc1c0a77374c39afdd2ad38aca0a53a\": container with ID starting with 4f70d40b409b557ad33856454d01508d4fc1c0a77374c39afdd2ad38aca0a53a not found: ID does not exist" 
containerID="4f70d40b409b557ad33856454d01508d4fc1c0a77374c39afdd2ad38aca0a53a" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.404620 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f70d40b409b557ad33856454d01508d4fc1c0a77374c39afdd2ad38aca0a53a"} err="failed to get container status \"4f70d40b409b557ad33856454d01508d4fc1c0a77374c39afdd2ad38aca0a53a\": rpc error: code = NotFound desc = could not find container \"4f70d40b409b557ad33856454d01508d4fc1c0a77374c39afdd2ad38aca0a53a\": container with ID starting with 4f70d40b409b557ad33856454d01508d4fc1c0a77374c39afdd2ad38aca0a53a not found: ID does not exist" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.404654 4755 scope.go:117] "RemoveContainer" containerID="dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3" Mar 17 00:26:14 crc kubenswrapper[4755]: E0317 00:26:14.405020 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\": container with ID starting with dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3 not found: ID does not exist" containerID="dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.405092 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3"} err="failed to get container status \"dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\": rpc error: code = NotFound desc = could not find container \"dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3\": container with ID starting with dc25d5af48ed998aac62415a5b4c67c9a398c432f9c533450b2161daf671f4c3 not found: ID does not exist" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.405826 4755 scope.go:117] 
"RemoveContainer" containerID="c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8" Mar 17 00:26:14 crc kubenswrapper[4755]: E0317 00:26:14.406230 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\": container with ID starting with c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8 not found: ID does not exist" containerID="c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.406267 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8"} err="failed to get container status \"c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\": rpc error: code = NotFound desc = could not find container \"c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8\": container with ID starting with c5e26ddcb1cf887b00eda05830bb8c9b147827972aa94ea103443dc3b3acaac8 not found: ID does not exist" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.406291 4755 scope.go:117] "RemoveContainer" containerID="00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938" Mar 17 00:26:14 crc kubenswrapper[4755]: E0317 00:26:14.406564 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\": container with ID starting with 00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938 not found: ID does not exist" containerID="00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.406592 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938"} err="failed to get container status \"00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\": rpc error: code = NotFound desc = could not find container \"00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938\": container with ID starting with 00d70a07f23db5f9158cc8e7e272cfdf5f9980202c2c6b1255bda5826a0aa938 not found: ID does not exist" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.406613 4755 scope.go:117] "RemoveContainer" containerID="740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d" Mar 17 00:26:14 crc kubenswrapper[4755]: E0317 00:26:14.406989 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\": container with ID starting with 740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d not found: ID does not exist" containerID="740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.407037 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d"} err="failed to get container status \"740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\": rpc error: code = NotFound desc = could not find container \"740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d\": container with ID starting with 740add017b6d70d25ff586014f5a322d07619022ad203c2e4a36e09396feeb2d not found: ID does not exist" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.407114 4755 scope.go:117] "RemoveContainer" containerID="581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765" Mar 17 00:26:14 crc kubenswrapper[4755]: E0317 00:26:14.407632 4755 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\": container with ID starting with 581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765 not found: ID does not exist" containerID="581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765" Mar 17 00:26:14 crc kubenswrapper[4755]: I0317 00:26:14.407666 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765"} err="failed to get container status \"581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\": rpc error: code = NotFound desc = could not find container \"581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765\": container with ID starting with 581c950e6074960d5982acfd847b0e81526bbb51931c9a86abfb18e379210765 not found: ID does not exist" Mar 17 00:26:16 crc kubenswrapper[4755]: E0317 00:26:16.144914 4755 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:16 crc kubenswrapper[4755]: E0317 00:26:16.146627 4755 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:16 crc kubenswrapper[4755]: E0317 00:26:16.147392 4755 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:16 crc kubenswrapper[4755]: E0317 00:26:16.148082 4755 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:16 crc kubenswrapper[4755]: E0317 00:26:16.148622 4755 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:16 crc kubenswrapper[4755]: I0317 00:26:16.148682 4755 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 17 00:26:16 crc kubenswrapper[4755]: E0317 00:26:16.149127 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="200ms" Mar 17 00:26:16 crc kubenswrapper[4755]: I0317 00:26:16.256494 4755 status_manager.go:851] "Failed to get status for pod" podUID="3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:16 crc kubenswrapper[4755]: I0317 00:26:16.257860 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:16 crc kubenswrapper[4755]: I0317 00:26:16.259811 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:16 crc kubenswrapper[4755]: E0317 00:26:16.350625 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="400ms" Mar 17 00:26:16 crc kubenswrapper[4755]: E0317 00:26:16.751227 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="800ms" Mar 17 00:26:17 crc kubenswrapper[4755]: E0317 00:26:17.551762 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="1.6s" Mar 17 00:26:19 crc kubenswrapper[4755]: E0317 00:26:19.153424 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="3.2s" Mar 17 00:26:21 crc kubenswrapper[4755]: I0317 00:26:21.828253 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" podUID="50008856-a0a3-4ec3-a48f-5f90891d777e" containerName="oauth-openshift" containerID="cri-o://2a361bb938a508a241f0c50e42a3467ccee738200a2291e0723a61ed96385e69" gracePeriod=15 Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.247878 4755 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.249194 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.249407 4755 status_manager.go:851] "Failed to get status for pod" podUID="3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.328207 4755 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94eb6cf8-35a8-49fc-acc6-92cab54f2710" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.328260 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94eb6cf8-35a8-49fc-acc6-92cab54f2710" Mar 17 00:26:22 crc kubenswrapper[4755]: E0317 00:26:22.328650 4755 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.329117 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.334633 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.335381 4755 status_manager.go:851] "Failed to get status for pod" podUID="3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.335847 4755 status_manager.go:851] "Failed to get status for pod" podUID="50008856-a0a3-4ec3-a48f-5f90891d777e" pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-grcxd\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.336301 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.336463 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" event={"ID":"50008856-a0a3-4ec3-a48f-5f90891d777e","Type":"ContainerDied","Data":"2a361bb938a508a241f0c50e42a3467ccee738200a2291e0723a61ed96385e69"} Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.336514 4755 scope.go:117] "RemoveContainer" containerID="2a361bb938a508a241f0c50e42a3467ccee738200a2291e0723a61ed96385e69" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.336456 4755 generic.go:334] "Generic (PLEG): container finished" podID="50008856-a0a3-4ec3-a48f-5f90891d777e" 
containerID="2a361bb938a508a241f0c50e42a3467ccee738200a2291e0723a61ed96385e69" exitCode=0 Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.336611 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" event={"ID":"50008856-a0a3-4ec3-a48f-5f90891d777e","Type":"ContainerDied","Data":"a6b0e51e3183daba6fa35cede43212e55802eb61c7dfbd0e017fb3c350a71733"} Mar 17 00:26:22 crc kubenswrapper[4755]: E0317 00:26:22.351567 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.32:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d79466386865d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 00:26:11.478275677 +0000 UTC m=+246.237727960,LastTimestamp:2026-03-17 00:26:11.478275677 +0000 UTC m=+246.237727960,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 00:26:22 crc kubenswrapper[4755]: E0317 00:26:22.355272 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="6.4s" Mar 17 00:26:22 crc kubenswrapper[4755]: 
I0317 00:26:22.371812 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-cliconfig\") pod \"50008856-a0a3-4ec3-a48f-5f90891d777e\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.371876 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-router-certs\") pod \"50008856-a0a3-4ec3-a48f-5f90891d777e\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.371915 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-audit-policies\") pod \"50008856-a0a3-4ec3-a48f-5f90891d777e\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.371949 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-serving-cert\") pod \"50008856-a0a3-4ec3-a48f-5f90891d777e\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.371982 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-service-ca\") pod \"50008856-a0a3-4ec3-a48f-5f90891d777e\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.372013 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-session\") pod \"50008856-a0a3-4ec3-a48f-5f90891d777e\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.372077 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-ocp-branding-template\") pod \"50008856-a0a3-4ec3-a48f-5f90891d777e\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.372194 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnb4f\" (UniqueName: \"kubernetes.io/projected/50008856-a0a3-4ec3-a48f-5f90891d777e-kube-api-access-rnb4f\") pod \"50008856-a0a3-4ec3-a48f-5f90891d777e\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.372226 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-trusted-ca-bundle\") pod \"50008856-a0a3-4ec3-a48f-5f90891d777e\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.372255 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-idp-0-file-data\") pod \"50008856-a0a3-4ec3-a48f-5f90891d777e\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.372293 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-login\") pod \"50008856-a0a3-4ec3-a48f-5f90891d777e\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.372341 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-provider-selection\") pod \"50008856-a0a3-4ec3-a48f-5f90891d777e\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.372368 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50008856-a0a3-4ec3-a48f-5f90891d777e-audit-dir\") pod \"50008856-a0a3-4ec3-a48f-5f90891d777e\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.372404 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-error\") pod \"50008856-a0a3-4ec3-a48f-5f90891d777e\" (UID: \"50008856-a0a3-4ec3-a48f-5f90891d777e\") " Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.372812 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50008856-a0a3-4ec3-a48f-5f90891d777e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "50008856-a0a3-4ec3-a48f-5f90891d777e" (UID: "50008856-a0a3-4ec3-a48f-5f90891d777e"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.373944 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "50008856-a0a3-4ec3-a48f-5f90891d777e" (UID: "50008856-a0a3-4ec3-a48f-5f90891d777e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.374745 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "50008856-a0a3-4ec3-a48f-5f90891d777e" (UID: "50008856-a0a3-4ec3-a48f-5f90891d777e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.374774 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "50008856-a0a3-4ec3-a48f-5f90891d777e" (UID: "50008856-a0a3-4ec3-a48f-5f90891d777e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.374792 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "50008856-a0a3-4ec3-a48f-5f90891d777e" (UID: "50008856-a0a3-4ec3-a48f-5f90891d777e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.380932 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "50008856-a0a3-4ec3-a48f-5f90891d777e" (UID: "50008856-a0a3-4ec3-a48f-5f90891d777e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.381588 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "50008856-a0a3-4ec3-a48f-5f90891d777e" (UID: "50008856-a0a3-4ec3-a48f-5f90891d777e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.381911 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "50008856-a0a3-4ec3-a48f-5f90891d777e" (UID: "50008856-a0a3-4ec3-a48f-5f90891d777e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.382806 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50008856-a0a3-4ec3-a48f-5f90891d777e-kube-api-access-rnb4f" (OuterVolumeSpecName: "kube-api-access-rnb4f") pod "50008856-a0a3-4ec3-a48f-5f90891d777e" (UID: "50008856-a0a3-4ec3-a48f-5f90891d777e"). InnerVolumeSpecName "kube-api-access-rnb4f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.382997 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "50008856-a0a3-4ec3-a48f-5f90891d777e" (UID: "50008856-a0a3-4ec3-a48f-5f90891d777e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.383390 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "50008856-a0a3-4ec3-a48f-5f90891d777e" (UID: "50008856-a0a3-4ec3-a48f-5f90891d777e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.383801 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "50008856-a0a3-4ec3-a48f-5f90891d777e" (UID: "50008856-a0a3-4ec3-a48f-5f90891d777e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.384068 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "50008856-a0a3-4ec3-a48f-5f90891d777e" (UID: "50008856-a0a3-4ec3-a48f-5f90891d777e"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.384456 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "50008856-a0a3-4ec3-a48f-5f90891d777e" (UID: "50008856-a0a3-4ec3-a48f-5f90891d777e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.403983 4755 scope.go:117] "RemoveContainer" containerID="2a361bb938a508a241f0c50e42a3467ccee738200a2291e0723a61ed96385e69" Mar 17 00:26:22 crc kubenswrapper[4755]: E0317 00:26:22.405951 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a361bb938a508a241f0c50e42a3467ccee738200a2291e0723a61ed96385e69\": container with ID starting with 2a361bb938a508a241f0c50e42a3467ccee738200a2291e0723a61ed96385e69 not found: ID does not exist" containerID="2a361bb938a508a241f0c50e42a3467ccee738200a2291e0723a61ed96385e69" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.406018 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a361bb938a508a241f0c50e42a3467ccee738200a2291e0723a61ed96385e69"} err="failed to get container status \"2a361bb938a508a241f0c50e42a3467ccee738200a2291e0723a61ed96385e69\": rpc error: code = NotFound desc = could not find container \"2a361bb938a508a241f0c50e42a3467ccee738200a2291e0723a61ed96385e69\": container with ID starting with 2a361bb938a508a241f0c50e42a3467ccee738200a2291e0723a61ed96385e69 not found: ID does not exist" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.473517 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.473809 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.473823 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.473837 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.473849 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnb4f\" (UniqueName: \"kubernetes.io/projected/50008856-a0a3-4ec3-a48f-5f90891d777e-kube-api-access-rnb4f\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.473860 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.473872 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.473884 4755 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.473898 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.473912 4755 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50008856-a0a3-4ec3-a48f-5f90891d777e-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.473925 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.473938 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.473950 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/50008856-a0a3-4ec3-a48f-5f90891d777e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:22 crc kubenswrapper[4755]: I0317 00:26:22.473961 4755 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/50008856-a0a3-4ec3-a48f-5f90891d777e-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 17 00:26:23 crc 
kubenswrapper[4755]: I0317 00:26:23.345057 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" Mar 17 00:26:23 crc kubenswrapper[4755]: I0317 00:26:23.346583 4755 status_manager.go:851] "Failed to get status for pod" podUID="3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:23 crc kubenswrapper[4755]: I0317 00:26:23.347068 4755 status_manager.go:851] "Failed to get status for pod" podUID="50008856-a0a3-4ec3-a48f-5f90891d777e" pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-grcxd\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:23 crc kubenswrapper[4755]: I0317 00:26:23.347565 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:23 crc kubenswrapper[4755]: I0317 00:26:23.347928 4755 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="db8a2033e55a6f91c88d72dbb086fbf3a7af99ac2ff68a5412d931a3ed8b18cd" exitCode=0 Mar 17 00:26:23 crc kubenswrapper[4755]: I0317 00:26:23.347999 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"db8a2033e55a6f91c88d72dbb086fbf3a7af99ac2ff68a5412d931a3ed8b18cd"} Mar 17 00:26:23 crc 
kubenswrapper[4755]: I0317 00:26:23.348063 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"997a202c80a011132c1f81bc2aa70a725cc6b60bae1dbb8884ff92df6bcaaa76"} Mar 17 00:26:23 crc kubenswrapper[4755]: I0317 00:26:23.348660 4755 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94eb6cf8-35a8-49fc-acc6-92cab54f2710" Mar 17 00:26:23 crc kubenswrapper[4755]: I0317 00:26:23.348705 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94eb6cf8-35a8-49fc-acc6-92cab54f2710" Mar 17 00:26:23 crc kubenswrapper[4755]: E0317 00:26:23.349166 4755 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:23 crc kubenswrapper[4755]: I0317 00:26:23.349160 4755 status_manager.go:851] "Failed to get status for pod" podUID="3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:23 crc kubenswrapper[4755]: I0317 00:26:23.349947 4755 status_manager.go:851] "Failed to get status for pod" podUID="50008856-a0a3-4ec3-a48f-5f90891d777e" pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-grcxd\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:23 crc kubenswrapper[4755]: I0317 00:26:23.350483 4755 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:23 crc kubenswrapper[4755]: I0317 00:26:23.378636 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:23 crc kubenswrapper[4755]: I0317 00:26:23.379422 4755 status_manager.go:851] "Failed to get status for pod" podUID="3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:23 crc kubenswrapper[4755]: I0317 00:26:23.380095 4755 status_manager.go:851] "Failed to get status for pod" podUID="50008856-a0a3-4ec3-a48f-5f90891d777e" pod="openshift-authentication/oauth-openshift-558db77b4-grcxd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-grcxd\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 17 00:26:24 crc kubenswrapper[4755]: I0317 00:26:24.355397 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0ce969b16d0ac1291266bc56b3222cdcfa161d6e516581f474e6062274707085"} Mar 17 00:26:24 crc kubenswrapper[4755]: I0317 00:26:24.355706 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"17b8d252ece26086f8fc026945c541548bb8c4dee9647150d48af813ef79a4c6"} Mar 17 00:26:24 crc kubenswrapper[4755]: I0317 00:26:24.355717 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7002462dbb6b549156535ce39962a8e0440cabf5c841647ddc50923efaeafd06"} Mar 17 00:26:24 crc kubenswrapper[4755]: I0317 00:26:24.358286 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 17 00:26:24 crc kubenswrapper[4755]: I0317 00:26:24.359035 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 17 00:26:24 crc kubenswrapper[4755]: I0317 00:26:24.359287 4755 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="49d422659d1b99f78b7721ab1b1e41b8486b2b951987139a77ff415e1249c051" exitCode=1 Mar 17 00:26:24 crc kubenswrapper[4755]: I0317 00:26:24.359344 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"49d422659d1b99f78b7721ab1b1e41b8486b2b951987139a77ff415e1249c051"} Mar 17 00:26:24 crc kubenswrapper[4755]: I0317 00:26:24.360061 4755 scope.go:117] "RemoveContainer" containerID="49d422659d1b99f78b7721ab1b1e41b8486b2b951987139a77ff415e1249c051" Mar 17 00:26:25 crc kubenswrapper[4755]: I0317 00:26:25.373938 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"28f2492240f44950b13ab5cc7b1c49be4b45046dfe51b0f9d0a092696a0b5cdf"} Mar 17 00:26:25 crc kubenswrapper[4755]: I0317 00:26:25.374313 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:25 crc kubenswrapper[4755]: I0317 00:26:25.374327 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"93230ecba5709ed03e21bb67a60267604367c62b1cbe785d1b4f2316729056c5"} Mar 17 00:26:25 crc kubenswrapper[4755]: I0317 00:26:25.374551 4755 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94eb6cf8-35a8-49fc-acc6-92cab54f2710" Mar 17 00:26:25 crc kubenswrapper[4755]: I0317 00:26:25.374594 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94eb6cf8-35a8-49fc-acc6-92cab54f2710" Mar 17 00:26:25 crc kubenswrapper[4755]: I0317 00:26:25.378504 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 17 00:26:25 crc kubenswrapper[4755]: I0317 00:26:25.379283 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 17 00:26:25 crc kubenswrapper[4755]: I0317 00:26:25.379347 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"786825c8418d9d88774621c14206aa5b6d3158460faf531e089ce26665f1587c"} Mar 17 00:26:27 crc kubenswrapper[4755]: I0317 00:26:27.329624 4755 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:27 crc kubenswrapper[4755]: I0317 00:26:27.329667 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:27 crc kubenswrapper[4755]: I0317 00:26:27.372170 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:28 crc kubenswrapper[4755]: I0317 00:26:28.665123 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:26:28 crc kubenswrapper[4755]: I0317 00:26:28.665623 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:26:30 crc kubenswrapper[4755]: I0317 00:26:30.385792 4755 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 00:26:30 crc kubenswrapper[4755]: I0317 00:26:30.414221 4755 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94eb6cf8-35a8-49fc-acc6-92cab54f2710" Mar 17 00:26:30 crc kubenswrapper[4755]: I0317 00:26:30.414260 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94eb6cf8-35a8-49fc-acc6-92cab54f2710" Mar 17 00:26:30 crc kubenswrapper[4755]: I0317 00:26:30.421224 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 
00:26:30 crc kubenswrapper[4755]: I0317 00:26:30.425881 4755 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="326ce4f3-5bff-4356-b01a-c12178007e3f" Mar 17 00:26:30 crc kubenswrapper[4755]: I0317 00:26:30.570165 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:26:30 crc kubenswrapper[4755]: I0317 00:26:30.570683 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 17 00:26:30 crc kubenswrapper[4755]: I0317 00:26:30.570739 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 17 00:26:31 crc kubenswrapper[4755]: I0317 00:26:31.419414 4755 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94eb6cf8-35a8-49fc-acc6-92cab54f2710" Mar 17 00:26:31 crc kubenswrapper[4755]: I0317 00:26:31.419469 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94eb6cf8-35a8-49fc-acc6-92cab54f2710" Mar 17 00:26:32 crc kubenswrapper[4755]: I0317 00:26:32.407553 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:26:36 crc kubenswrapper[4755]: I0317 00:26:36.267150 4755 status_manager.go:861] "Pod was deleted and then 
recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="326ce4f3-5bff-4356-b01a-c12178007e3f" Mar 17 00:26:38 crc kubenswrapper[4755]: I0317 00:26:38.376075 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 17 00:26:38 crc kubenswrapper[4755]: I0317 00:26:38.916319 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 17 00:26:39 crc kubenswrapper[4755]: I0317 00:26:39.202994 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 17 00:26:40 crc kubenswrapper[4755]: I0317 00:26:40.570625 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 17 00:26:40 crc kubenswrapper[4755]: I0317 00:26:40.570705 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 17 00:26:41 crc kubenswrapper[4755]: I0317 00:26:41.264274 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 17 00:26:41 crc kubenswrapper[4755]: I0317 00:26:41.361123 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 17 00:26:41 crc kubenswrapper[4755]: I0317 00:26:41.522334 4755 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 17 00:26:41 crc kubenswrapper[4755]: I0317 00:26:41.979315 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 17 00:26:42 crc kubenswrapper[4755]: I0317 00:26:42.857728 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 17 00:26:42 crc kubenswrapper[4755]: I0317 00:26:42.915930 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 17 00:26:42 crc kubenswrapper[4755]: I0317 00:26:42.980313 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 17 00:26:43 crc kubenswrapper[4755]: I0317 00:26:43.642623 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 17 00:26:43 crc kubenswrapper[4755]: I0317 00:26:43.717038 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 17 00:26:43 crc kubenswrapper[4755]: I0317 00:26:43.784585 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 17 00:26:43 crc kubenswrapper[4755]: I0317 00:26:43.939849 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 17 00:26:43 crc kubenswrapper[4755]: I0317 00:26:43.941057 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 17 00:26:44 crc kubenswrapper[4755]: I0317 00:26:44.031784 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 17 
00:26:44 crc kubenswrapper[4755]: I0317 00:26:44.089844 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 17 00:26:44 crc kubenswrapper[4755]: I0317 00:26:44.148988 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 17 00:26:44 crc kubenswrapper[4755]: I0317 00:26:44.158732 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 17 00:26:44 crc kubenswrapper[4755]: I0317 00:26:44.231516 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 17 00:26:44 crc kubenswrapper[4755]: I0317 00:26:44.264961 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 17 00:26:44 crc kubenswrapper[4755]: I0317 00:26:44.329921 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 17 00:26:44 crc kubenswrapper[4755]: I0317 00:26:44.366133 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 17 00:26:44 crc kubenswrapper[4755]: I0317 00:26:44.369913 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 17 00:26:44 crc kubenswrapper[4755]: I0317 00:26:44.465518 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 17 00:26:44 crc kubenswrapper[4755]: I0317 00:26:44.550086 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 17 00:26:44 crc kubenswrapper[4755]: I0317 00:26:44.790018 4755 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 17 00:26:44 crc kubenswrapper[4755]: I0317 00:26:44.988763 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 17 00:26:45 crc kubenswrapper[4755]: I0317 00:26:45.017634 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 17 00:26:45 crc kubenswrapper[4755]: I0317 00:26:45.240575 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 17 00:26:45 crc kubenswrapper[4755]: I0317 00:26:45.280653 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 17 00:26:45 crc kubenswrapper[4755]: I0317 00:26:45.376331 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 17 00:26:45 crc kubenswrapper[4755]: I0317 00:26:45.384673 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 17 00:26:45 crc kubenswrapper[4755]: I0317 00:26:45.550670 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 17 00:26:45 crc kubenswrapper[4755]: I0317 00:26:45.554598 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 17 00:26:45 crc kubenswrapper[4755]: I0317 00:26:45.564683 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 17 00:26:45 crc kubenswrapper[4755]: I0317 00:26:45.672327 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 17 00:26:45 crc kubenswrapper[4755]: I0317 00:26:45.673644 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 17 00:26:45 crc kubenswrapper[4755]: I0317 00:26:45.721080 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 17 00:26:45 crc kubenswrapper[4755]: I0317 00:26:45.793793 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 17 00:26:45 crc kubenswrapper[4755]: I0317 00:26:45.796111 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 17 00:26:45 crc kubenswrapper[4755]: I0317 00:26:45.874058 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 17 00:26:46 crc kubenswrapper[4755]: I0317 00:26:46.145768 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 17 00:26:46 crc kubenswrapper[4755]: I0317 00:26:46.164878 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 17 00:26:46 crc kubenswrapper[4755]: I0317 00:26:46.207285 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 17 00:26:46 crc kubenswrapper[4755]: I0317 00:26:46.297034 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 17 00:26:46 crc kubenswrapper[4755]: I0317 00:26:46.315310 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 17 00:26:46 crc kubenswrapper[4755]: I0317 00:26:46.376199 4755 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 17 00:26:46 crc kubenswrapper[4755]: I0317 00:26:46.464209 4755 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 17 00:26:46 crc kubenswrapper[4755]: I0317 00:26:46.548697 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 17 00:26:46 crc kubenswrapper[4755]: I0317 00:26:46.654558 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 17 00:26:46 crc kubenswrapper[4755]: I0317 00:26:46.683812 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 17 00:26:46 crc kubenswrapper[4755]: I0317 00:26:46.864806 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.001889 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.002692 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.063537 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.064711 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.120985 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.180103 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.269120 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.380709 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.383262 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.400227 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.410823 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.444542 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.495589 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.556167 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.601007 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.634175 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.673643 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.745110 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.819666 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.892015 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 17 00:26:47 crc kubenswrapper[4755]: I0317 00:26:47.937640 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.048188 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.097073 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.098205 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.117547 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.162944 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.165613 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.207582 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.285567 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.285783 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.333785 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.418914 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.443654 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.448855 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.540429 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.575933 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.586236 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.642872 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.674989 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.791078 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.908484 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.933041 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.978991 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 17 00:26:48 crc kubenswrapper[4755]: I0317 00:26:48.985538 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 17 00:26:49 crc kubenswrapper[4755]: I0317 00:26:49.001739 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 17 00:26:49 crc kubenswrapper[4755]: I0317 00:26:49.066307 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 17 00:26:49 crc kubenswrapper[4755]: I0317 00:26:49.110103 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 17 00:26:49 crc kubenswrapper[4755]: I0317 00:26:49.123580 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 17 00:26:49 crc kubenswrapper[4755]: I0317 00:26:49.148338 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 17 00:26:49 crc kubenswrapper[4755]: I0317 00:26:49.230588 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 17 00:26:49 crc kubenswrapper[4755]: I0317 00:26:49.231231 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 17 00:26:49 crc kubenswrapper[4755]: I0317 00:26:49.252035 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 17 00:26:49 crc kubenswrapper[4755]: I0317 00:26:49.615656 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 17 00:26:49 crc kubenswrapper[4755]: I0317 00:26:49.634357 4755 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 17 00:26:49 crc kubenswrapper[4755]: I0317 00:26:49.837969 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 17 00:26:49 crc kubenswrapper[4755]: I0317 00:26:49.854936 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 17 00:26:49 crc kubenswrapper[4755]: I0317 00:26:49.910194 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 17 00:26:49 crc kubenswrapper[4755]: I0317 00:26:49.967358 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.048524 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.329028 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.348862 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.368914 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.415039 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.570392 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.570530 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.570609 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.571673 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"786825c8418d9d88774621c14206aa5b6d3158460faf531e089ce26665f1587c"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.571898 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://786825c8418d9d88774621c14206aa5b6d3158460faf531e089ce26665f1587c" gracePeriod=30
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.595391 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.686828 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.741358 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.753919 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.759948 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.780118 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.829559 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.877724 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.890895 4755 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 17 00:26:50 crc kubenswrapper[4755]: I0317 00:26:50.980890 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.111224 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.123643 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.194479 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.212883 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.276367 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.327124 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.512084 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.565602 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.575139 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.581270 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.589871 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.615003 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.779991 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.792428 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.794135 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.858284 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.885982 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.887481 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.892527 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 17 00:26:51 crc kubenswrapper[4755]: I0317 00:26:51.954851 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.008719 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.071508 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.112752 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.124110 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.174529 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.209519 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.299844 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.303306 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.323035 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.372179 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.432374 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.471272 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.527834 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.541462 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.682471 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.775739 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.814600 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.816951 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.878944 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.881668 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 17 00:26:52 crc kubenswrapper[4755]: I0317 00:26:52.894052 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 17 00:26:53 crc kubenswrapper[4755]: I0317 00:26:53.003841 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 17 00:26:53 crc kubenswrapper[4755]: I0317 00:26:53.444580 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 17 00:26:53 crc kubenswrapper[4755]: I0317 00:26:53.457516 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 17 00:26:53 crc kubenswrapper[4755]: I0317 00:26:53.532533 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 17 00:26:53 crc kubenswrapper[4755]: I0317 00:26:53.549578 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 17 00:26:53 crc kubenswrapper[4755]: I0317 00:26:53.568505 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 17 00:26:53 crc kubenswrapper[4755]: I0317 00:26:53.579027 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 17 00:26:53 crc kubenswrapper[4755]: I0317 00:26:53.624740 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 17 00:26:53 crc kubenswrapper[4755]: I0317 00:26:53.665123 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 17 00:26:53 crc kubenswrapper[4755]: I0317 00:26:53.709307 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 17 00:26:53 crc kubenswrapper[4755]: I0317 00:26:53.754792 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 17 00:26:53 crc kubenswrapper[4755]: I0317 00:26:53.756565 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 17 00:26:53 crc kubenswrapper[4755]: I0317 00:26:53.797262 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 17 00:26:53 crc kubenswrapper[4755]: I0317 00:26:53.837079 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 17 00:26:54 crc kubenswrapper[4755]: I0317 00:26:54.045505 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 17 00:26:54 crc kubenswrapper[4755]: I0317 00:26:54.109114 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 17 00:26:54 crc kubenswrapper[4755]: I0317 00:26:54.184373 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 17 00:26:54 crc kubenswrapper[4755]: I0317 00:26:54.215261 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 17 00:26:54 crc kubenswrapper[4755]: I0317 00:26:54.220506 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 17 00:26:54 crc kubenswrapper[4755]: I0317 00:26:54.236141 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 17 00:26:54 crc kubenswrapper[4755]: I0317 00:26:54.259475 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 17 00:26:54 crc kubenswrapper[4755]: I0317 00:26:54.320394 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 17 00:26:54 crc kubenswrapper[4755]: I0317 00:26:54.329422 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 17 00:26:54 crc kubenswrapper[4755]: I0317 00:26:54.518592 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 17 00:26:54 crc kubenswrapper[4755]: I0317 00:26:54.522977 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 17 00:26:54 crc kubenswrapper[4755]: I0317 00:26:54.568261 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 17 00:26:54 crc kubenswrapper[4755]: I0317 00:26:54.932482 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 17 00:26:54 crc kubenswrapper[4755]: I0317 00:26:54.975654 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 17 00:26:54 crc kubenswrapper[4755]: I0317 00:26:54.980407 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 17 00:26:54 crc kubenswrapper[4755]: I0317 00:26:54.988775 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 17 00:26:55 crc kubenswrapper[4755]: I0317 00:26:55.154781 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 17 00:26:55 crc kubenswrapper[4755]: I0317 00:26:55.200799 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 17 00:26:55 crc kubenswrapper[4755]: I0317 00:26:55.400829 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 17 00:26:55 crc kubenswrapper[4755]: I0317 00:26:55.459922 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 17 00:26:55 crc kubenswrapper[4755]: I0317 00:26:55.510643 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 17 00:26:55 crc kubenswrapper[4755]: I0317 00:26:55.522080 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 17 00:26:55 crc kubenswrapper[4755]: I0317 00:26:55.523748 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 17 00:26:55 crc kubenswrapper[4755]: I0317 00:26:55.539198 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 17 00:26:55 crc kubenswrapper[4755]: I0317 00:26:55.565943 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 17 00:26:55 crc kubenswrapper[4755]: I0317 00:26:55.679824 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 17 00:26:55 crc kubenswrapper[4755]: I0317 00:26:55.865790 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 17 00:26:55 crc kubenswrapper[4755]: I0317 00:26:55.892763 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 17 00:26:55 crc kubenswrapper[4755]: I0317 00:26:55.990171 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 17 00:26:56 crc kubenswrapper[4755]: I0317 00:26:56.161599 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 17 00:26:56 crc kubenswrapper[4755]: I0317 00:26:56.270127 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 17 00:26:56 crc kubenswrapper[4755]: I0317 00:26:56.305389 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 17 00:26:56 crc kubenswrapper[4755]: I0317 00:26:56.559375 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 17 00:26:56 crc kubenswrapper[4755]: I0317 00:26:56.583565 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 17 00:26:56 crc kubenswrapper[4755]: I0317 00:26:56.845822 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 17 00:26:56 crc kubenswrapper[4755]: I0317 00:26:56.896984 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 17 00:26:56 crc kubenswrapper[4755]: I0317 00:26:56.905358 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 17 00:26:56 crc kubenswrapper[4755]: I0317 00:26:56.937017 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 17 00:26:56 crc kubenswrapper[4755]: I0317 00:26:56.969568 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 17 00:26:57 crc kubenswrapper[4755]: I0317 00:26:57.206956 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 17 00:26:57 crc kubenswrapper[4755]: I0317 00:26:57.296530 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 17 00:26:57 crc kubenswrapper[4755]: I0317 00:26:57.375862 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 17 00:26:57 crc kubenswrapper[4755]: I0317 00:26:57.393004 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 17 00:26:57 crc kubenswrapper[4755]: I0317 00:26:57.470655 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 17 00:26:57 crc kubenswrapper[4755]: I0317 00:26:57.568248 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 17 00:26:57 crc kubenswrapper[4755]: I0317 00:26:57.595988 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 17 00:26:57 crc kubenswrapper[4755]: I0317 00:26:57.728796 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.046952 4755 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.054159 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=47.054132793 podStartE2EDuration="47.054132793s" podCreationTimestamp="2026-03-17 00:26:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:26:30.118515518 +0000 UTC m=+264.877967841" watchObservedRunningTime="2026-03-17 00:26:58.054132793 +0000 UTC m=+292.813585106"
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.054962 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-grcxd"]
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.055036 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.055735 4755 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94eb6cf8-35a8-49fc-acc6-92cab54f2710"
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.055770 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94eb6cf8-35a8-49fc-acc6-92cab54f2710"
Mar 17 00:26:58 crc kubenswrapper[4755]: E0317 00:26:58.060767 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50008856-a0a3-4ec3-a48f-5f90891d777e" containerName="oauth-openshift"
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.060815 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="50008856-a0a3-4ec3-a48f-5f90891d777e" containerName="oauth-openshift"
Mar 17 00:26:58 crc kubenswrapper[4755]: E0317 00:26:58.060839 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" containerName="installer"
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.060854 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" containerName="installer"
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.061121 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c703cb5-e96c-4bb0-a6d3-4e4c2a3cfb46" containerName="installer"
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.061154 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="50008856-a0a3-4ec3-a48f-5f90891d777e" containerName="oauth-openshift"
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.062039 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp"
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.062488 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.065794 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.070406 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.071471 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.072347 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.072651 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.072788 4755 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.072928 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.072938 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.072934 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.072966 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.072947 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.073195 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.083380 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.089718 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.098704 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.119127 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.119283 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj5x7\" (UniqueName: \"kubernetes.io/projected/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-kube-api-access-vj5x7\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.119315 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.119348 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.119366 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-user-template-login\") pod 
\"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.119420 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-user-template-error\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.119476 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-audit-policies\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.119516 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-session\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.119521 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=28.119505638 podStartE2EDuration="28.119505638s" podCreationTimestamp="2026-03-17 00:26:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:26:58.117282011 +0000 UTC m=+292.876734294" 
watchObservedRunningTime="2026-03-17 00:26:58.119505638 +0000 UTC m=+292.878957921" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.119549 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-audit-dir\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.119684 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.119710 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.119734 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.119764 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.119786 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.121546 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.220522 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj5x7\" (UniqueName: \"kubernetes.io/projected/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-kube-api-access-vj5x7\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.220585 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.220629 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.220655 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-user-template-login\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.220725 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-user-template-error\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.221464 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-audit-policies\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.221466 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-audit-policies\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: 
\"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.221534 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-session\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.221630 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-audit-dir\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.221676 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.221715 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.221731 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.221755 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.221773 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.221783 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-audit-dir\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.221798 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " 
pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.222187 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.222660 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.223174 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.224666 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.227224 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " 
pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.227222 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-user-template-login\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.227304 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.227512 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-user-template-error\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.228075 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.229405 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" 
(UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.229589 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-session\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.232723 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.238790 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj5x7\" (UniqueName: \"kubernetes.io/projected/ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4-kube-api-access-vj5x7\") pod \"oauth-openshift-6cf47d78cb-c6lfp\" (UID: \"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4\") " pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.257083 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50008856-a0a3-4ec3-a48f-5f90891d777e" path="/var/lib/kubelet/pods/50008856-a0a3-4ec3-a48f-5f90891d777e/volumes" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.260609 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 17 
00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.390661 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.533396 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.665293 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.665606 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:26:58 crc kubenswrapper[4755]: I0317 00:26:58.890793 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp"] Mar 17 00:26:59 crc kubenswrapper[4755]: I0317 00:26:59.457190 4755 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 17 00:26:59 crc kubenswrapper[4755]: I0317 00:26:59.631043 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" event={"ID":"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4","Type":"ContainerStarted","Data":"ccaa384f1365f9dbf451a0ef89e220cb4a037589b72746c09984385c0095e160"} Mar 17 00:26:59 crc kubenswrapper[4755]: I0317 00:26:59.631114 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" 
event={"ID":"ad4516c5-5aff-4f51-9ac3-0a7f523d0cf4","Type":"ContainerStarted","Data":"906b4f36f19c5e4f5fd471fb5746173b78724d62a44b0be52ecbdafa930f2427"} Mar 17 00:26:59 crc kubenswrapper[4755]: I0317 00:26:59.671383 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" podStartSLOduration=63.671352709 podStartE2EDuration="1m3.671352709s" podCreationTimestamp="2026-03-17 00:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:26:59.667601428 +0000 UTC m=+294.427053761" watchObservedRunningTime="2026-03-17 00:26:59.671352709 +0000 UTC m=+294.430805022" Mar 17 00:26:59 crc kubenswrapper[4755]: I0317 00:26:59.807289 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 17 00:26:59 crc kubenswrapper[4755]: I0317 00:26:59.937576 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 17 00:27:00 crc kubenswrapper[4755]: I0317 00:27:00.455716 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 17 00:27:00 crc kubenswrapper[4755]: I0317 00:27:00.638918 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:27:00 crc kubenswrapper[4755]: I0317 00:27:00.648142 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6cf47d78cb-c6lfp" Mar 17 00:27:04 crc kubenswrapper[4755]: I0317 00:27:04.167640 4755 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 17 00:27:04 crc kubenswrapper[4755]: I0317 00:27:04.168571 4755 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://954dd0604dee8a306459868cde4956c58a8be7b07dc761671f6bba25848500ac" gracePeriod=5 Mar 17 00:27:09 crc kubenswrapper[4755]: E0317 00:27:09.312063 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-954dd0604dee8a306459868cde4956c58a8be7b07dc761671f6bba25848500ac.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-conmon-954dd0604dee8a306459868cde4956c58a8be7b07dc761671f6bba25848500ac.scope\": RecentStats: unable to find data in memory cache]" Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.705350 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.705427 4755 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="954dd0604dee8a306459868cde4956c58a8be7b07dc761671f6bba25848500ac" exitCode=137 Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.774394 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.774819 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.896471 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.896527 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.896630 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.896714 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.896789 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.896811 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.896837 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.896801 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.896976 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.897132 4755 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.897148 4755 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.897159 4755 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.897169 4755 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.908373 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:27:09 crc kubenswrapper[4755]: I0317 00:27:09.997993 4755 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 17 00:27:10 crc kubenswrapper[4755]: I0317 00:27:10.263000 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 17 00:27:10 crc kubenswrapper[4755]: I0317 00:27:10.263409 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 17 00:27:10 crc kubenswrapper[4755]: I0317 00:27:10.275050 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 17 00:27:10 crc kubenswrapper[4755]: I0317 00:27:10.275097 4755 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="4e2a946a-69b1-418b-aa2c-6de0933377d5" Mar 17 00:27:10 crc kubenswrapper[4755]: I0317 00:27:10.279603 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 17 00:27:10 crc kubenswrapper[4755]: I0317 00:27:10.279650 4755 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="4e2a946a-69b1-418b-aa2c-6de0933377d5" Mar 17 00:27:10 crc kubenswrapper[4755]: I0317 00:27:10.716428 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 17 00:27:10 crc kubenswrapper[4755]: I0317 00:27:10.716702 4755 scope.go:117] "RemoveContainer" 
containerID="954dd0604dee8a306459868cde4956c58a8be7b07dc761671f6bba25848500ac" Mar 17 00:27:10 crc kubenswrapper[4755]: I0317 00:27:10.716891 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 00:27:20 crc kubenswrapper[4755]: I0317 00:27:20.790774 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 17 00:27:20 crc kubenswrapper[4755]: I0317 00:27:20.794390 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 17 00:27:20 crc kubenswrapper[4755]: I0317 00:27:20.795582 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 17 00:27:20 crc kubenswrapper[4755]: I0317 00:27:20.795663 4755 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="786825c8418d9d88774621c14206aa5b6d3158460faf531e089ce26665f1587c" exitCode=137 Mar 17 00:27:20 crc kubenswrapper[4755]: I0317 00:27:20.795722 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"786825c8418d9d88774621c14206aa5b6d3158460faf531e089ce26665f1587c"} Mar 17 00:27:20 crc kubenswrapper[4755]: I0317 00:27:20.795802 4755 scope.go:117] "RemoveContainer" containerID="49d422659d1b99f78b7721ab1b1e41b8486b2b951987139a77ff415e1249c051" Mar 17 00:27:22 crc kubenswrapper[4755]: I0317 00:27:22.712127 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 17 00:27:22 crc kubenswrapper[4755]: I0317 00:27:22.713149 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 17 00:27:22 crc kubenswrapper[4755]: I0317 00:27:22.713194 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f28821068e656b5ca4a19c0c84f5de01f80020e5664eb49b470966a13aad183c"} Mar 17 00:27:28 crc kubenswrapper[4755]: I0317 00:27:28.664888 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:27:28 crc kubenswrapper[4755]: I0317 00:27:28.665575 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:27:28 crc kubenswrapper[4755]: I0317 00:27:28.665645 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:27:28 crc kubenswrapper[4755]: I0317 00:27:28.666602 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5e4bcf70529050e2d5a4eb77278af6ddc216afe724345c57887569e664d73b74"} 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 00:27:28 crc kubenswrapper[4755]: I0317 00:27:28.666683 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://5e4bcf70529050e2d5a4eb77278af6ddc216afe724345c57887569e664d73b74" gracePeriod=600 Mar 17 00:27:29 crc kubenswrapper[4755]: I0317 00:27:29.762781 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="5e4bcf70529050e2d5a4eb77278af6ddc216afe724345c57887569e664d73b74" exitCode=0 Mar 17 00:27:29 crc kubenswrapper[4755]: I0317 00:27:29.762869 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"5e4bcf70529050e2d5a4eb77278af6ddc216afe724345c57887569e664d73b74"} Mar 17 00:27:29 crc kubenswrapper[4755]: I0317 00:27:29.763368 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"3bdb3baf5ba6a3ef3e039e15bc705bbe9296b04b0aef97e3d166af9a7f44368e"} Mar 17 00:27:30 crc kubenswrapper[4755]: I0317 00:27:30.570492 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:27:30 crc kubenswrapper[4755]: I0317 00:27:30.575055 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:27:30 crc kubenswrapper[4755]: I0317 00:27:30.779012 4755 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:27:31 crc kubenswrapper[4755]: I0317 00:27:31.789409 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.182200 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7858c5986-lc28s"] Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.182986 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" podUID="fe174760-ba87-465e-8a99-77bd8fab4181" containerName="controller-manager" containerID="cri-o://a31c80f4abe14d68c38307d9a6bfea307fd445e6d6cca6ff279a64ad70d7ed6d" gracePeriod=30 Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.197852 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm"] Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.198075 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" podUID="b85993df-62fb-4b25-9290-d9e7820a87ae" containerName="route-controller-manager" containerID="cri-o://75041fbcb5056499025bd63707c7362af7702b597d67244d6e49a1b80a33c0bc" gracePeriod=30 Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.596281 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.661026 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-client-ca\") pod \"fe174760-ba87-465e-8a99-77bd8fab4181\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.661094 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe174760-ba87-465e-8a99-77bd8fab4181-serving-cert\") pod \"fe174760-ba87-465e-8a99-77bd8fab4181\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.661146 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-config\") pod \"fe174760-ba87-465e-8a99-77bd8fab4181\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.661167 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-proxy-ca-bundles\") pod \"fe174760-ba87-465e-8a99-77bd8fab4181\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.661203 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfnzh\" (UniqueName: \"kubernetes.io/projected/fe174760-ba87-465e-8a99-77bd8fab4181-kube-api-access-nfnzh\") pod \"fe174760-ba87-465e-8a99-77bd8fab4181\" (UID: \"fe174760-ba87-465e-8a99-77bd8fab4181\") " Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.662209 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-config" (OuterVolumeSpecName: "config") pod "fe174760-ba87-465e-8a99-77bd8fab4181" (UID: "fe174760-ba87-465e-8a99-77bd8fab4181"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.662276 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-client-ca" (OuterVolumeSpecName: "client-ca") pod "fe174760-ba87-465e-8a99-77bd8fab4181" (UID: "fe174760-ba87-465e-8a99-77bd8fab4181"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.662377 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fe174760-ba87-465e-8a99-77bd8fab4181" (UID: "fe174760-ba87-465e-8a99-77bd8fab4181"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.665537 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.666307 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe174760-ba87-465e-8a99-77bd8fab4181-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fe174760-ba87-465e-8a99-77bd8fab4181" (UID: "fe174760-ba87-465e-8a99-77bd8fab4181"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.668126 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe174760-ba87-465e-8a99-77bd8fab4181-kube-api-access-nfnzh" (OuterVolumeSpecName: "kube-api-access-nfnzh") pod "fe174760-ba87-465e-8a99-77bd8fab4181" (UID: "fe174760-ba87-465e-8a99-77bd8fab4181"). InnerVolumeSpecName "kube-api-access-nfnzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.762565 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85993df-62fb-4b25-9290-d9e7820a87ae-config\") pod \"b85993df-62fb-4b25-9290-d9e7820a87ae\" (UID: \"b85993df-62fb-4b25-9290-d9e7820a87ae\") " Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.762607 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwxsj\" (UniqueName: \"kubernetes.io/projected/b85993df-62fb-4b25-9290-d9e7820a87ae-kube-api-access-fwxsj\") pod \"b85993df-62fb-4b25-9290-d9e7820a87ae\" (UID: \"b85993df-62fb-4b25-9290-d9e7820a87ae\") " Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.762630 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b85993df-62fb-4b25-9290-d9e7820a87ae-client-ca\") pod \"b85993df-62fb-4b25-9290-d9e7820a87ae\" (UID: \"b85993df-62fb-4b25-9290-d9e7820a87ae\") " Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.762655 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b85993df-62fb-4b25-9290-d9e7820a87ae-serving-cert\") pod \"b85993df-62fb-4b25-9290-d9e7820a87ae\" (UID: \"b85993df-62fb-4b25-9290-d9e7820a87ae\") " Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.762824 4755 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.762835 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe174760-ba87-465e-8a99-77bd8fab4181-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.762843 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.762852 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe174760-ba87-465e-8a99-77bd8fab4181-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.762862 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfnzh\" (UniqueName: \"kubernetes.io/projected/fe174760-ba87-465e-8a99-77bd8fab4181-kube-api-access-nfnzh\") on node \"crc\" DevicePath \"\"" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.763921 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85993df-62fb-4b25-9290-d9e7820a87ae-client-ca" (OuterVolumeSpecName: "client-ca") pod "b85993df-62fb-4b25-9290-d9e7820a87ae" (UID: "b85993df-62fb-4b25-9290-d9e7820a87ae"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.764125 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85993df-62fb-4b25-9290-d9e7820a87ae-config" (OuterVolumeSpecName: "config") pod "b85993df-62fb-4b25-9290-d9e7820a87ae" (UID: "b85993df-62fb-4b25-9290-d9e7820a87ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.765708 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b85993df-62fb-4b25-9290-d9e7820a87ae-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b85993df-62fb-4b25-9290-d9e7820a87ae" (UID: "b85993df-62fb-4b25-9290-d9e7820a87ae"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.766033 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b85993df-62fb-4b25-9290-d9e7820a87ae-kube-api-access-fwxsj" (OuterVolumeSpecName: "kube-api-access-fwxsj") pod "b85993df-62fb-4b25-9290-d9e7820a87ae" (UID: "b85993df-62fb-4b25-9290-d9e7820a87ae"). InnerVolumeSpecName "kube-api-access-fwxsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.834604 4755 generic.go:334] "Generic (PLEG): container finished" podID="fe174760-ba87-465e-8a99-77bd8fab4181" containerID="a31c80f4abe14d68c38307d9a6bfea307fd445e6d6cca6ff279a64ad70d7ed6d" exitCode=0 Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.834659 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.834707 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" event={"ID":"fe174760-ba87-465e-8a99-77bd8fab4181","Type":"ContainerDied","Data":"a31c80f4abe14d68c38307d9a6bfea307fd445e6d6cca6ff279a64ad70d7ed6d"} Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.834787 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7858c5986-lc28s" event={"ID":"fe174760-ba87-465e-8a99-77bd8fab4181","Type":"ContainerDied","Data":"4816c46c4dbde48ee4e1d66f33757645dd21083e1e83f20748814ff88887988c"} Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.834818 4755 scope.go:117] "RemoveContainer" containerID="a31c80f4abe14d68c38307d9a6bfea307fd445e6d6cca6ff279a64ad70d7ed6d" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.839356 4755 generic.go:334] "Generic (PLEG): container finished" podID="b85993df-62fb-4b25-9290-d9e7820a87ae" containerID="75041fbcb5056499025bd63707c7362af7702b597d67244d6e49a1b80a33c0bc" exitCode=0 Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.839412 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" event={"ID":"b85993df-62fb-4b25-9290-d9e7820a87ae","Type":"ContainerDied","Data":"75041fbcb5056499025bd63707c7362af7702b597d67244d6e49a1b80a33c0bc"} Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.839568 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" event={"ID":"b85993df-62fb-4b25-9290-d9e7820a87ae","Type":"ContainerDied","Data":"15de41ffe20e99300b25af16d742988a2be980685fa5f2b640c352316d0221fe"} Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.839645 4755 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.858388 4755 scope.go:117] "RemoveContainer" containerID="a31c80f4abe14d68c38307d9a6bfea307fd445e6d6cca6ff279a64ad70d7ed6d" Mar 17 00:27:39 crc kubenswrapper[4755]: E0317 00:27:39.858844 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a31c80f4abe14d68c38307d9a6bfea307fd445e6d6cca6ff279a64ad70d7ed6d\": container with ID starting with a31c80f4abe14d68c38307d9a6bfea307fd445e6d6cca6ff279a64ad70d7ed6d not found: ID does not exist" containerID="a31c80f4abe14d68c38307d9a6bfea307fd445e6d6cca6ff279a64ad70d7ed6d" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.858887 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a31c80f4abe14d68c38307d9a6bfea307fd445e6d6cca6ff279a64ad70d7ed6d"} err="failed to get container status \"a31c80f4abe14d68c38307d9a6bfea307fd445e6d6cca6ff279a64ad70d7ed6d\": rpc error: code = NotFound desc = could not find container \"a31c80f4abe14d68c38307d9a6bfea307fd445e6d6cca6ff279a64ad70d7ed6d\": container with ID starting with a31c80f4abe14d68c38307d9a6bfea307fd445e6d6cca6ff279a64ad70d7ed6d not found: ID does not exist" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.858913 4755 scope.go:117] "RemoveContainer" containerID="75041fbcb5056499025bd63707c7362af7702b597d67244d6e49a1b80a33c0bc" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.865132 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85993df-62fb-4b25-9290-d9e7820a87ae-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.865156 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwxsj\" (UniqueName: 
\"kubernetes.io/projected/b85993df-62fb-4b25-9290-d9e7820a87ae-kube-api-access-fwxsj\") on node \"crc\" DevicePath \"\"" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.865165 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b85993df-62fb-4b25-9290-d9e7820a87ae-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.865174 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b85993df-62fb-4b25-9290-d9e7820a87ae-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.866810 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7858c5986-lc28s"] Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.871517 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7858c5986-lc28s"] Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.878374 4755 scope.go:117] "RemoveContainer" containerID="75041fbcb5056499025bd63707c7362af7702b597d67244d6e49a1b80a33c0bc" Mar 17 00:27:39 crc kubenswrapper[4755]: E0317 00:27:39.878943 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75041fbcb5056499025bd63707c7362af7702b597d67244d6e49a1b80a33c0bc\": container with ID starting with 75041fbcb5056499025bd63707c7362af7702b597d67244d6e49a1b80a33c0bc not found: ID does not exist" containerID="75041fbcb5056499025bd63707c7362af7702b597d67244d6e49a1b80a33c0bc" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.878980 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75041fbcb5056499025bd63707c7362af7702b597d67244d6e49a1b80a33c0bc"} err="failed to get container status \"75041fbcb5056499025bd63707c7362af7702b597d67244d6e49a1b80a33c0bc\": 
rpc error: code = NotFound desc = could not find container \"75041fbcb5056499025bd63707c7362af7702b597d67244d6e49a1b80a33c0bc\": container with ID starting with 75041fbcb5056499025bd63707c7362af7702b597d67244d6e49a1b80a33c0bc not found: ID does not exist" Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.880356 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm"] Mar 17 00:27:39 crc kubenswrapper[4755]: I0317 00:27:39.897073 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8f6c96-s4rvm"] Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.256079 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b85993df-62fb-4b25-9290-d9e7820a87ae" path="/var/lib/kubelet/pods/b85993df-62fb-4b25-9290-d9e7820a87ae/volumes" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.256788 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe174760-ba87-465e-8a99-77bd8fab4181" path="/var/lib/kubelet/pods/fe174760-ba87-465e-8a99-77bd8fab4181/volumes" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.723337 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9"] Mar 17 00:27:40 crc kubenswrapper[4755]: E0317 00:27:40.724175 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.724219 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 17 00:27:40 crc kubenswrapper[4755]: E0317 00:27:40.724249 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe174760-ba87-465e-8a99-77bd8fab4181" containerName="controller-manager" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 
00:27:40.724267 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe174760-ba87-465e-8a99-77bd8fab4181" containerName="controller-manager" Mar 17 00:27:40 crc kubenswrapper[4755]: E0317 00:27:40.724294 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85993df-62fb-4b25-9290-d9e7820a87ae" containerName="route-controller-manager" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.724311 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85993df-62fb-4b25-9290-d9e7820a87ae" containerName="route-controller-manager" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.724570 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b85993df-62fb-4b25-9290-d9e7820a87ae" containerName="route-controller-manager" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.724600 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe174760-ba87-465e-8a99-77bd8fab4181" containerName="controller-manager" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.724617 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.725204 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.727510 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-bd9d8574d-4flnm"] Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.728294 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.733027 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.733990 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.734009 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.734195 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.734247 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.734414 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.734647 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.734693 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.734875 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.734928 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.734876 
4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.735077 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.740980 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bd9d8574d-4flnm"] Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.742267 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.747850 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9"] Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.776455 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-client-ca\") pod \"controller-manager-bd9d8574d-4flnm\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.776529 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b85a22d9-d3c4-4c6a-8646-9337230979b5-serving-cert\") pod \"controller-manager-bd9d8574d-4flnm\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.776582 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njrpg\" (UniqueName: 
\"kubernetes.io/projected/b85a22d9-d3c4-4c6a-8646-9337230979b5-kube-api-access-njrpg\") pod \"controller-manager-bd9d8574d-4flnm\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.776664 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/272fb1e6-4011-4703-ab0f-6985bd10bac7-serving-cert\") pod \"route-controller-manager-647f6fbcc5-b4qb9\" (UID: \"272fb1e6-4011-4703-ab0f-6985bd10bac7\") " pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.776737 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-proxy-ca-bundles\") pod \"controller-manager-bd9d8574d-4flnm\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.776781 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/272fb1e6-4011-4703-ab0f-6985bd10bac7-client-ca\") pod \"route-controller-manager-647f6fbcc5-b4qb9\" (UID: \"272fb1e6-4011-4703-ab0f-6985bd10bac7\") " pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.776823 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-config\") pod \"controller-manager-bd9d8574d-4flnm\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " 
pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.776859 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npwmf\" (UniqueName: \"kubernetes.io/projected/272fb1e6-4011-4703-ab0f-6985bd10bac7-kube-api-access-npwmf\") pod \"route-controller-manager-647f6fbcc5-b4qb9\" (UID: \"272fb1e6-4011-4703-ab0f-6985bd10bac7\") " pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.776881 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/272fb1e6-4011-4703-ab0f-6985bd10bac7-config\") pod \"route-controller-manager-647f6fbcc5-b4qb9\" (UID: \"272fb1e6-4011-4703-ab0f-6985bd10bac7\") " pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.877276 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/272fb1e6-4011-4703-ab0f-6985bd10bac7-client-ca\") pod \"route-controller-manager-647f6fbcc5-b4qb9\" (UID: \"272fb1e6-4011-4703-ab0f-6985bd10bac7\") " pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.877339 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-config\") pod \"controller-manager-bd9d8574d-4flnm\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.877378 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npwmf\" 
(UniqueName: \"kubernetes.io/projected/272fb1e6-4011-4703-ab0f-6985bd10bac7-kube-api-access-npwmf\") pod \"route-controller-manager-647f6fbcc5-b4qb9\" (UID: \"272fb1e6-4011-4703-ab0f-6985bd10bac7\") " pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.877406 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/272fb1e6-4011-4703-ab0f-6985bd10bac7-config\") pod \"route-controller-manager-647f6fbcc5-b4qb9\" (UID: \"272fb1e6-4011-4703-ab0f-6985bd10bac7\") " pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.877476 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-client-ca\") pod \"controller-manager-bd9d8574d-4flnm\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.877535 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b85a22d9-d3c4-4c6a-8646-9337230979b5-serving-cert\") pod \"controller-manager-bd9d8574d-4flnm\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.877564 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njrpg\" (UniqueName: \"kubernetes.io/projected/b85a22d9-d3c4-4c6a-8646-9337230979b5-kube-api-access-njrpg\") pod \"controller-manager-bd9d8574d-4flnm\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:40 crc 
kubenswrapper[4755]: I0317 00:27:40.877593 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/272fb1e6-4011-4703-ab0f-6985bd10bac7-serving-cert\") pod \"route-controller-manager-647f6fbcc5-b4qb9\" (UID: \"272fb1e6-4011-4703-ab0f-6985bd10bac7\") " pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.877801 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-proxy-ca-bundles\") pod \"controller-manager-bd9d8574d-4flnm\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.878371 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/272fb1e6-4011-4703-ab0f-6985bd10bac7-client-ca\") pod \"route-controller-manager-647f6fbcc5-b4qb9\" (UID: \"272fb1e6-4011-4703-ab0f-6985bd10bac7\") " pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.878377 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-client-ca\") pod \"controller-manager-bd9d8574d-4flnm\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.878789 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-proxy-ca-bundles\") pod \"controller-manager-bd9d8574d-4flnm\" (UID: 
\"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.879734 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/272fb1e6-4011-4703-ab0f-6985bd10bac7-config\") pod \"route-controller-manager-647f6fbcc5-b4qb9\" (UID: \"272fb1e6-4011-4703-ab0f-6985bd10bac7\") " pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.883132 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/272fb1e6-4011-4703-ab0f-6985bd10bac7-serving-cert\") pod \"route-controller-manager-647f6fbcc5-b4qb9\" (UID: \"272fb1e6-4011-4703-ab0f-6985bd10bac7\") " pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.883147 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b85a22d9-d3c4-4c6a-8646-9337230979b5-serving-cert\") pod \"controller-manager-bd9d8574d-4flnm\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.883967 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-config\") pod \"controller-manager-bd9d8574d-4flnm\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.895667 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npwmf\" (UniqueName: 
\"kubernetes.io/projected/272fb1e6-4011-4703-ab0f-6985bd10bac7-kube-api-access-npwmf\") pod \"route-controller-manager-647f6fbcc5-b4qb9\" (UID: \"272fb1e6-4011-4703-ab0f-6985bd10bac7\") " pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:27:40 crc kubenswrapper[4755]: I0317 00:27:40.905324 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njrpg\" (UniqueName: \"kubernetes.io/projected/b85a22d9-d3c4-4c6a-8646-9337230979b5-kube-api-access-njrpg\") pod \"controller-manager-bd9d8574d-4flnm\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:41 crc kubenswrapper[4755]: I0317 00:27:41.058913 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:27:41 crc kubenswrapper[4755]: I0317 00:27:41.076360 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:41 crc kubenswrapper[4755]: I0317 00:27:41.249090 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9"] Mar 17 00:27:41 crc kubenswrapper[4755]: W0317 00:27:41.256164 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod272fb1e6_4011_4703_ab0f_6985bd10bac7.slice/crio-61c9b18d55c4e8953da4016d743d08594d22e852d93bec2cc3d41d4559623d20 WatchSource:0}: Error finding container 61c9b18d55c4e8953da4016d743d08594d22e852d93bec2cc3d41d4559623d20: Status 404 returned error can't find the container with id 61c9b18d55c4e8953da4016d743d08594d22e852d93bec2cc3d41d4559623d20 Mar 17 00:27:41 crc kubenswrapper[4755]: I0317 00:27:41.292139 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bd9d8574d-4flnm"] Mar 17 00:27:41 crc kubenswrapper[4755]: W0317 00:27:41.304000 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb85a22d9_d3c4_4c6a_8646_9337230979b5.slice/crio-102c3c69f432670a04031155d19ceeb46429c1ebfe239bfe8a32af215d133d8c WatchSource:0}: Error finding container 102c3c69f432670a04031155d19ceeb46429c1ebfe239bfe8a32af215d133d8c: Status 404 returned error can't find the container with id 102c3c69f432670a04031155d19ceeb46429c1ebfe239bfe8a32af215d133d8c Mar 17 00:27:41 crc kubenswrapper[4755]: I0317 00:27:41.852137 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" event={"ID":"272fb1e6-4011-4703-ab0f-6985bd10bac7","Type":"ContainerStarted","Data":"c77a2693c8b2f4db6f385019aa4755e0e81742d0ac28b0a839ffa1c32e9fc2b4"} Mar 17 00:27:41 crc kubenswrapper[4755]: I0317 00:27:41.852176 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" event={"ID":"272fb1e6-4011-4703-ab0f-6985bd10bac7","Type":"ContainerStarted","Data":"61c9b18d55c4e8953da4016d743d08594d22e852d93bec2cc3d41d4559623d20"} Mar 17 00:27:41 crc kubenswrapper[4755]: I0317 00:27:41.852778 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:27:41 crc kubenswrapper[4755]: I0317 00:27:41.853949 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" event={"ID":"b85a22d9-d3c4-4c6a-8646-9337230979b5","Type":"ContainerStarted","Data":"3e09f4345ce1d004faa025c52fbadfd1eb08ed6344cc2ae5a164ffb88cc6a6d0"} Mar 17 00:27:41 crc kubenswrapper[4755]: I0317 00:27:41.853975 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" event={"ID":"b85a22d9-d3c4-4c6a-8646-9337230979b5","Type":"ContainerStarted","Data":"102c3c69f432670a04031155d19ceeb46429c1ebfe239bfe8a32af215d133d8c"} Mar 17 00:27:41 crc kubenswrapper[4755]: I0317 00:27:41.854214 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:41 crc kubenswrapper[4755]: I0317 00:27:41.859025 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:27:41 crc kubenswrapper[4755]: I0317 00:27:41.875986 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" podStartSLOduration=2.875966107 podStartE2EDuration="2.875966107s" podCreationTimestamp="2026-03-17 00:27:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:27:41.873333974 +0000 UTC m=+336.632786267" watchObservedRunningTime="2026-03-17 00:27:41.875966107 +0000 UTC m=+336.635418390" Mar 17 00:27:41 crc kubenswrapper[4755]: I0317 00:27:41.888893 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" podStartSLOduration=2.888876207 podStartE2EDuration="2.888876207s" podCreationTimestamp="2026-03-17 00:27:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:27:41.888475926 +0000 UTC m=+336.647928209" watchObservedRunningTime="2026-03-17 00:27:41.888876207 +0000 UTC m=+336.648328490" Mar 17 00:27:41 crc kubenswrapper[4755]: I0317 00:27:41.966604 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.033387 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mrlv2"] Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.034778 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.050778 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mrlv2"] Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.145185 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/271806f0-526e-47fa-a0e5-e15ed5ef956b-registry-certificates\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.145238 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/271806f0-526e-47fa-a0e5-e15ed5ef956b-registry-tls\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.145262 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/271806f0-526e-47fa-a0e5-e15ed5ef956b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.145371 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/271806f0-526e-47fa-a0e5-e15ed5ef956b-bound-sa-token\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.145409 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/271806f0-526e-47fa-a0e5-e15ed5ef956b-trusted-ca\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.145459 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/271806f0-526e-47fa-a0e5-e15ed5ef956b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.145488 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5nfn\" (UniqueName: \"kubernetes.io/projected/271806f0-526e-47fa-a0e5-e15ed5ef956b-kube-api-access-t5nfn\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.145560 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.187546 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.246558 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/271806f0-526e-47fa-a0e5-e15ed5ef956b-registry-tls\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.246651 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/271806f0-526e-47fa-a0e5-e15ed5ef956b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.246700 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/271806f0-526e-47fa-a0e5-e15ed5ef956b-bound-sa-token\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.246736 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/271806f0-526e-47fa-a0e5-e15ed5ef956b-trusted-ca\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 
00:27:59.246781 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/271806f0-526e-47fa-a0e5-e15ed5ef956b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.246820 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5nfn\" (UniqueName: \"kubernetes.io/projected/271806f0-526e-47fa-a0e5-e15ed5ef956b-kube-api-access-t5nfn\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.246909 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/271806f0-526e-47fa-a0e5-e15ed5ef956b-registry-certificates\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.247880 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/271806f0-526e-47fa-a0e5-e15ed5ef956b-trusted-ca\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.247982 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/271806f0-526e-47fa-a0e5-e15ed5ef956b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.248488 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/271806f0-526e-47fa-a0e5-e15ed5ef956b-registry-certificates\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.252416 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/271806f0-526e-47fa-a0e5-e15ed5ef956b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.252943 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/271806f0-526e-47fa-a0e5-e15ed5ef956b-registry-tls\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.260884 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/271806f0-526e-47fa-a0e5-e15ed5ef956b-bound-sa-token\") pod \"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.262609 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5nfn\" (UniqueName: \"kubernetes.io/projected/271806f0-526e-47fa-a0e5-e15ed5ef956b-kube-api-access-t5nfn\") pod 
\"image-registry-66df7c8f76-mrlv2\" (UID: \"271806f0-526e-47fa-a0e5-e15ed5ef956b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.356737 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.774299 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mrlv2"] Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.963223 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" event={"ID":"271806f0-526e-47fa-a0e5-e15ed5ef956b","Type":"ContainerStarted","Data":"c4a97aebe66f0c73f7ab4cd58c28bb617cabf7b4161264706eea806864366680"} Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.963272 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" event={"ID":"271806f0-526e-47fa-a0e5-e15ed5ef956b","Type":"ContainerStarted","Data":"39346fed12031215c39ad3360a2d4f7a567989711fe8834fe8c3faf36f2b8f2f"} Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.964186 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:27:59 crc kubenswrapper[4755]: I0317 00:27:59.991925 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" podStartSLOduration=0.991905282 podStartE2EDuration="991.905282ms" podCreationTimestamp="2026-03-17 00:27:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:27:59.988005053 +0000 UTC m=+354.747457336" watchObservedRunningTime="2026-03-17 00:27:59.991905282 +0000 UTC m=+354.751357565" Mar 17 00:28:00 crc 
kubenswrapper[4755]: I0317 00:28:00.181174 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561788-jl6wp"] Mar 17 00:28:00 crc kubenswrapper[4755]: I0317 00:28:00.181987 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561788-jl6wp" Mar 17 00:28:00 crc kubenswrapper[4755]: I0317 00:28:00.185789 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 00:28:00 crc kubenswrapper[4755]: I0317 00:28:00.186054 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 00:28:00 crc kubenswrapper[4755]: I0317 00:28:00.186253 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 00:28:00 crc kubenswrapper[4755]: I0317 00:28:00.189666 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561788-jl6wp"] Mar 17 00:28:00 crc kubenswrapper[4755]: I0317 00:28:00.262579 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfslq\" (UniqueName: \"kubernetes.io/projected/579fbdac-fb6c-4aaf-9c5d-a24494764bb0-kube-api-access-xfslq\") pod \"auto-csr-approver-29561788-jl6wp\" (UID: \"579fbdac-fb6c-4aaf-9c5d-a24494764bb0\") " pod="openshift-infra/auto-csr-approver-29561788-jl6wp" Mar 17 00:28:00 crc kubenswrapper[4755]: I0317 00:28:00.363797 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfslq\" (UniqueName: \"kubernetes.io/projected/579fbdac-fb6c-4aaf-9c5d-a24494764bb0-kube-api-access-xfslq\") pod \"auto-csr-approver-29561788-jl6wp\" (UID: \"579fbdac-fb6c-4aaf-9c5d-a24494764bb0\") " pod="openshift-infra/auto-csr-approver-29561788-jl6wp" Mar 17 00:28:00 crc kubenswrapper[4755]: I0317 00:28:00.386940 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xfslq\" (UniqueName: \"kubernetes.io/projected/579fbdac-fb6c-4aaf-9c5d-a24494764bb0-kube-api-access-xfslq\") pod \"auto-csr-approver-29561788-jl6wp\" (UID: \"579fbdac-fb6c-4aaf-9c5d-a24494764bb0\") " pod="openshift-infra/auto-csr-approver-29561788-jl6wp" Mar 17 00:28:00 crc kubenswrapper[4755]: I0317 00:28:00.510336 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561788-jl6wp" Mar 17 00:28:00 crc kubenswrapper[4755]: I0317 00:28:00.926233 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561788-jl6wp"] Mar 17 00:28:00 crc kubenswrapper[4755]: W0317 00:28:00.931666 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod579fbdac_fb6c_4aaf_9c5d_a24494764bb0.slice/crio-985be9bf56ba0eb215a94f7587124da1caaee46c27f0e79cca9507dc6729a0a6 WatchSource:0}: Error finding container 985be9bf56ba0eb215a94f7587124da1caaee46c27f0e79cca9507dc6729a0a6: Status 404 returned error can't find the container with id 985be9bf56ba0eb215a94f7587124da1caaee46c27f0e79cca9507dc6729a0a6 Mar 17 00:28:00 crc kubenswrapper[4755]: I0317 00:28:00.969036 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561788-jl6wp" event={"ID":"579fbdac-fb6c-4aaf-9c5d-a24494764bb0","Type":"ContainerStarted","Data":"985be9bf56ba0eb215a94f7587124da1caaee46c27f0e79cca9507dc6729a0a6"} Mar 17 00:28:02 crc kubenswrapper[4755]: I0317 00:28:02.981824 4755 generic.go:334] "Generic (PLEG): container finished" podID="579fbdac-fb6c-4aaf-9c5d-a24494764bb0" containerID="b1ec5d6a4fdc24f25074bca55cbf7a7a4d1f47dd11b3a80d740f8f71cab41481" exitCode=0 Mar 17 00:28:02 crc kubenswrapper[4755]: I0317 00:28:02.982075 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561788-jl6wp" 
event={"ID":"579fbdac-fb6c-4aaf-9c5d-a24494764bb0","Type":"ContainerDied","Data":"b1ec5d6a4fdc24f25074bca55cbf7a7a4d1f47dd11b3a80d740f8f71cab41481"} Mar 17 00:28:04 crc kubenswrapper[4755]: I0317 00:28:04.337358 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561788-jl6wp" Mar 17 00:28:04 crc kubenswrapper[4755]: I0317 00:28:04.513975 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfslq\" (UniqueName: \"kubernetes.io/projected/579fbdac-fb6c-4aaf-9c5d-a24494764bb0-kube-api-access-xfslq\") pod \"579fbdac-fb6c-4aaf-9c5d-a24494764bb0\" (UID: \"579fbdac-fb6c-4aaf-9c5d-a24494764bb0\") " Mar 17 00:28:04 crc kubenswrapper[4755]: I0317 00:28:04.520181 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/579fbdac-fb6c-4aaf-9c5d-a24494764bb0-kube-api-access-xfslq" (OuterVolumeSpecName: "kube-api-access-xfslq") pod "579fbdac-fb6c-4aaf-9c5d-a24494764bb0" (UID: "579fbdac-fb6c-4aaf-9c5d-a24494764bb0"). InnerVolumeSpecName "kube-api-access-xfslq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:28:04 crc kubenswrapper[4755]: I0317 00:28:04.615959 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfslq\" (UniqueName: \"kubernetes.io/projected/579fbdac-fb6c-4aaf-9c5d-a24494764bb0-kube-api-access-xfslq\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:04 crc kubenswrapper[4755]: I0317 00:28:04.994333 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561788-jl6wp" event={"ID":"579fbdac-fb6c-4aaf-9c5d-a24494764bb0","Type":"ContainerDied","Data":"985be9bf56ba0eb215a94f7587124da1caaee46c27f0e79cca9507dc6729a0a6"} Mar 17 00:28:04 crc kubenswrapper[4755]: I0317 00:28:04.994369 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="985be9bf56ba0eb215a94f7587124da1caaee46c27f0e79cca9507dc6729a0a6" Mar 17 00:28:04 crc kubenswrapper[4755]: I0317 00:28:04.994412 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561788-jl6wp" Mar 17 00:28:13 crc kubenswrapper[4755]: I0317 00:28:13.505132 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9"] Mar 17 00:28:13 crc kubenswrapper[4755]: I0317 00:28:13.505880 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" podUID="272fb1e6-4011-4703-ab0f-6985bd10bac7" containerName="route-controller-manager" containerID="cri-o://c77a2693c8b2f4db6f385019aa4755e0e81742d0ac28b0a839ffa1c32e9fc2b4" gracePeriod=30 Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.053251 4755 generic.go:334] "Generic (PLEG): container finished" podID="272fb1e6-4011-4703-ab0f-6985bd10bac7" containerID="c77a2693c8b2f4db6f385019aa4755e0e81742d0ac28b0a839ffa1c32e9fc2b4" exitCode=0 Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 
00:28:14.053343 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" event={"ID":"272fb1e6-4011-4703-ab0f-6985bd10bac7","Type":"ContainerDied","Data":"c77a2693c8b2f4db6f385019aa4755e0e81742d0ac28b0a839ffa1c32e9fc2b4"} Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.053756 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" event={"ID":"272fb1e6-4011-4703-ab0f-6985bd10bac7","Type":"ContainerDied","Data":"61c9b18d55c4e8953da4016d743d08594d22e852d93bec2cc3d41d4559623d20"} Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.053766 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61c9b18d55c4e8953da4016d743d08594d22e852d93bec2cc3d41d4559623d20" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.062239 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.246905 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/272fb1e6-4011-4703-ab0f-6985bd10bac7-client-ca\") pod \"272fb1e6-4011-4703-ab0f-6985bd10bac7\" (UID: \"272fb1e6-4011-4703-ab0f-6985bd10bac7\") " Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.246953 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/272fb1e6-4011-4703-ab0f-6985bd10bac7-config\") pod \"272fb1e6-4011-4703-ab0f-6985bd10bac7\" (UID: \"272fb1e6-4011-4703-ab0f-6985bd10bac7\") " Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.246978 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/272fb1e6-4011-4703-ab0f-6985bd10bac7-serving-cert\") pod \"272fb1e6-4011-4703-ab0f-6985bd10bac7\" (UID: \"272fb1e6-4011-4703-ab0f-6985bd10bac7\") " Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.247013 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npwmf\" (UniqueName: \"kubernetes.io/projected/272fb1e6-4011-4703-ab0f-6985bd10bac7-kube-api-access-npwmf\") pod \"272fb1e6-4011-4703-ab0f-6985bd10bac7\" (UID: \"272fb1e6-4011-4703-ab0f-6985bd10bac7\") " Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.247958 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/272fb1e6-4011-4703-ab0f-6985bd10bac7-client-ca" (OuterVolumeSpecName: "client-ca") pod "272fb1e6-4011-4703-ab0f-6985bd10bac7" (UID: "272fb1e6-4011-4703-ab0f-6985bd10bac7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.248246 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/272fb1e6-4011-4703-ab0f-6985bd10bac7-config" (OuterVolumeSpecName: "config") pod "272fb1e6-4011-4703-ab0f-6985bd10bac7" (UID: "272fb1e6-4011-4703-ab0f-6985bd10bac7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.260766 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/272fb1e6-4011-4703-ab0f-6985bd10bac7-kube-api-access-npwmf" (OuterVolumeSpecName: "kube-api-access-npwmf") pod "272fb1e6-4011-4703-ab0f-6985bd10bac7" (UID: "272fb1e6-4011-4703-ab0f-6985bd10bac7"). InnerVolumeSpecName "kube-api-access-npwmf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.261191 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/272fb1e6-4011-4703-ab0f-6985bd10bac7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "272fb1e6-4011-4703-ab0f-6985bd10bac7" (UID: "272fb1e6-4011-4703-ab0f-6985bd10bac7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.348811 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/272fb1e6-4011-4703-ab0f-6985bd10bac7-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.348902 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/272fb1e6-4011-4703-ab0f-6985bd10bac7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.348925 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npwmf\" (UniqueName: \"kubernetes.io/projected/272fb1e6-4011-4703-ab0f-6985bd10bac7-kube-api-access-npwmf\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.348951 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/272fb1e6-4011-4703-ab0f-6985bd10bac7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.746470 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk"] Mar 17 00:28:14 crc kubenswrapper[4755]: E0317 00:28:14.746728 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="579fbdac-fb6c-4aaf-9c5d-a24494764bb0" containerName="oc" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.746748 4755 
state_mem.go:107] "Deleted CPUSet assignment" podUID="579fbdac-fb6c-4aaf-9c5d-a24494764bb0" containerName="oc" Mar 17 00:28:14 crc kubenswrapper[4755]: E0317 00:28:14.746762 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="272fb1e6-4011-4703-ab0f-6985bd10bac7" containerName="route-controller-manager" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.746770 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="272fb1e6-4011-4703-ab0f-6985bd10bac7" containerName="route-controller-manager" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.746893 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="272fb1e6-4011-4703-ab0f-6985bd10bac7" containerName="route-controller-manager" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.746919 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="579fbdac-fb6c-4aaf-9c5d-a24494764bb0" containerName="oc" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.747364 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.807971 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk"] Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.862667 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9b65\" (UniqueName: \"kubernetes.io/projected/e0be7b5a-a079-4d49-8afc-fd29b942fb13-kube-api-access-w9b65\") pod \"route-controller-manager-5795f5dc57-9m9gk\" (UID: \"e0be7b5a-a079-4d49-8afc-fd29b942fb13\") " pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.862858 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0be7b5a-a079-4d49-8afc-fd29b942fb13-config\") pod \"route-controller-manager-5795f5dc57-9m9gk\" (UID: \"e0be7b5a-a079-4d49-8afc-fd29b942fb13\") " pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.863086 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0be7b5a-a079-4d49-8afc-fd29b942fb13-client-ca\") pod \"route-controller-manager-5795f5dc57-9m9gk\" (UID: \"e0be7b5a-a079-4d49-8afc-fd29b942fb13\") " pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.863130 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0be7b5a-a079-4d49-8afc-fd29b942fb13-serving-cert\") pod \"route-controller-manager-5795f5dc57-9m9gk\" (UID: 
\"e0be7b5a-a079-4d49-8afc-fd29b942fb13\") " pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.964976 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0be7b5a-a079-4d49-8afc-fd29b942fb13-client-ca\") pod \"route-controller-manager-5795f5dc57-9m9gk\" (UID: \"e0be7b5a-a079-4d49-8afc-fd29b942fb13\") " pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.965035 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0be7b5a-a079-4d49-8afc-fd29b942fb13-serving-cert\") pod \"route-controller-manager-5795f5dc57-9m9gk\" (UID: \"e0be7b5a-a079-4d49-8afc-fd29b942fb13\") " pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.965115 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9b65\" (UniqueName: \"kubernetes.io/projected/e0be7b5a-a079-4d49-8afc-fd29b942fb13-kube-api-access-w9b65\") pod \"route-controller-manager-5795f5dc57-9m9gk\" (UID: \"e0be7b5a-a079-4d49-8afc-fd29b942fb13\") " pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.965143 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0be7b5a-a079-4d49-8afc-fd29b942fb13-config\") pod \"route-controller-manager-5795f5dc57-9m9gk\" (UID: \"e0be7b5a-a079-4d49-8afc-fd29b942fb13\") " pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.966661 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0be7b5a-a079-4d49-8afc-fd29b942fb13-config\") pod \"route-controller-manager-5795f5dc57-9m9gk\" (UID: \"e0be7b5a-a079-4d49-8afc-fd29b942fb13\") " pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.967922 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0be7b5a-a079-4d49-8afc-fd29b942fb13-client-ca\") pod \"route-controller-manager-5795f5dc57-9m9gk\" (UID: \"e0be7b5a-a079-4d49-8afc-fd29b942fb13\") " pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.987331 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9b65\" (UniqueName: \"kubernetes.io/projected/e0be7b5a-a079-4d49-8afc-fd29b942fb13-kube-api-access-w9b65\") pod \"route-controller-manager-5795f5dc57-9m9gk\" (UID: \"e0be7b5a-a079-4d49-8afc-fd29b942fb13\") " pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" Mar 17 00:28:14 crc kubenswrapper[4755]: I0317 00:28:14.987305 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0be7b5a-a079-4d49-8afc-fd29b942fb13-serving-cert\") pod \"route-controller-manager-5795f5dc57-9m9gk\" (UID: \"e0be7b5a-a079-4d49-8afc-fd29b942fb13\") " pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" Mar 17 00:28:15 crc kubenswrapper[4755]: I0317 00:28:15.058760 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9" Mar 17 00:28:15 crc kubenswrapper[4755]: I0317 00:28:15.068768 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" Mar 17 00:28:15 crc kubenswrapper[4755]: I0317 00:28:15.101514 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9"] Mar 17 00:28:15 crc kubenswrapper[4755]: I0317 00:28:15.108622 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647f6fbcc5-b4qb9"] Mar 17 00:28:15 crc kubenswrapper[4755]: I0317 00:28:15.561600 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk"] Mar 17 00:28:15 crc kubenswrapper[4755]: W0317 00:28:15.572308 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0be7b5a_a079_4d49_8afc_fd29b942fb13.slice/crio-9e3ced397a247094e722aeb801c3c5247b2e5a6a17b0e8e9461f07415fca0890 WatchSource:0}: Error finding container 9e3ced397a247094e722aeb801c3c5247b2e5a6a17b0e8e9461f07415fca0890: Status 404 returned error can't find the container with id 9e3ced397a247094e722aeb801c3c5247b2e5a6a17b0e8e9461f07415fca0890 Mar 17 00:28:16 crc kubenswrapper[4755]: I0317 00:28:16.068952 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" event={"ID":"e0be7b5a-a079-4d49-8afc-fd29b942fb13","Type":"ContainerStarted","Data":"62d650591a4bff76f3ea2df2363db44d31e5041e050cf5cf6d1653b892f7b3bf"} Mar 17 00:28:16 crc kubenswrapper[4755]: I0317 00:28:16.069396 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" event={"ID":"e0be7b5a-a079-4d49-8afc-fd29b942fb13","Type":"ContainerStarted","Data":"9e3ced397a247094e722aeb801c3c5247b2e5a6a17b0e8e9461f07415fca0890"} Mar 17 00:28:16 crc kubenswrapper[4755]: 
I0317 00:28:16.069429 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" Mar 17 00:28:16 crc kubenswrapper[4755]: I0317 00:28:16.104777 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" podStartSLOduration=3.104749342 podStartE2EDuration="3.104749342s" podCreationTimestamp="2026-03-17 00:28:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:28:16.09462303 +0000 UTC m=+370.854075363" watchObservedRunningTime="2026-03-17 00:28:16.104749342 +0000 UTC m=+370.864201665" Mar 17 00:28:16 crc kubenswrapper[4755]: I0317 00:28:16.255561 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="272fb1e6-4011-4703-ab0f-6985bd10bac7" path="/var/lib/kubelet/pods/272fb1e6-4011-4703-ab0f-6985bd10bac7/volumes" Mar 17 00:28:16 crc kubenswrapper[4755]: I0317 00:28:16.525877 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5795f5dc57-9m9gk" Mar 17 00:28:19 crc kubenswrapper[4755]: I0317 00:28:19.364641 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mrlv2" Mar 17 00:28:19 crc kubenswrapper[4755]: I0317 00:28:19.450850 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hg2fb"] Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.086895 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m967j"] Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.087805 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m967j" 
podUID="be55626c-4d34-4b09-83b0-897cd661216a" containerName="registry-server" containerID="cri-o://a5f2bda6413b4e3d40328259045c4909479a516453f9b23a955b9457489754bb" gracePeriod=30 Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.102398 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dwldz"] Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.103049 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dwldz" podUID="e1339920-3dec-4332-9749-ec66520252cb" containerName="registry-server" containerID="cri-o://d74ec6700a94558bf2e670abb4438b6780a67d785e1d784d296c8c5519223827" gracePeriod=30 Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.113512 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tddtz"] Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.113836 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" podUID="7d67a491-1c7f-4898-bc78-a2a7d75278dc" containerName="marketplace-operator" containerID="cri-o://a772e7ca39b55b1bbc52c118365ef011037a6b207610b693e2dde677b4428e2d" gracePeriod=30 Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.118204 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvz7c"] Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.118746 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pvz7c" podUID="3857bbbe-1aa6-43d2-94e4-15f23929ac60" containerName="registry-server" containerID="cri-o://7bbca8ca31b4a7ab536f66c60d337ab09b90555414f028a8917e891380eca21d" gracePeriod=30 Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.122827 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-mvsj8"] Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.123678 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mvsj8" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.137769 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5ctv9"] Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.138156 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5ctv9" podUID="b20a58b5-4b64-4d7b-b9c2-c6170d75878e" containerName="registry-server" containerID="cri-o://abcd9f3c2446f0d0776b0f37949b763921ff095e3ad0a3f00cc8a54639ee0293" gracePeriod=30 Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.144357 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mvsj8"] Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.234194 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/277ca8b8-67f5-4fdb-ad34-648ad653fa5d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mvsj8\" (UID: \"277ca8b8-67f5-4fdb-ad34-648ad653fa5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-mvsj8" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.234260 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bchqz\" (UniqueName: \"kubernetes.io/projected/277ca8b8-67f5-4fdb-ad34-648ad653fa5d-kube-api-access-bchqz\") pod \"marketplace-operator-79b997595-mvsj8\" (UID: \"277ca8b8-67f5-4fdb-ad34-648ad653fa5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-mvsj8" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.234311 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/277ca8b8-67f5-4fdb-ad34-648ad653fa5d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mvsj8\" (UID: \"277ca8b8-67f5-4fdb-ad34-648ad653fa5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-mvsj8" Mar 17 00:28:20 crc kubenswrapper[4755]: E0317 00:28:20.305242 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d67a491_1c7f_4898_bc78_a2a7d75278dc.slice/crio-conmon-a772e7ca39b55b1bbc52c118365ef011037a6b207610b693e2dde677b4428e2d.scope\": RecentStats: unable to find data in memory cache]" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.336013 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/277ca8b8-67f5-4fdb-ad34-648ad653fa5d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mvsj8\" (UID: \"277ca8b8-67f5-4fdb-ad34-648ad653fa5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-mvsj8" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.336066 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bchqz\" (UniqueName: \"kubernetes.io/projected/277ca8b8-67f5-4fdb-ad34-648ad653fa5d-kube-api-access-bchqz\") pod \"marketplace-operator-79b997595-mvsj8\" (UID: \"277ca8b8-67f5-4fdb-ad34-648ad653fa5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-mvsj8" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.336117 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/277ca8b8-67f5-4fdb-ad34-648ad653fa5d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mvsj8\" (UID: 
\"277ca8b8-67f5-4fdb-ad34-648ad653fa5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-mvsj8" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.337522 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/277ca8b8-67f5-4fdb-ad34-648ad653fa5d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mvsj8\" (UID: \"277ca8b8-67f5-4fdb-ad34-648ad653fa5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-mvsj8" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.343242 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/277ca8b8-67f5-4fdb-ad34-648ad653fa5d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mvsj8\" (UID: \"277ca8b8-67f5-4fdb-ad34-648ad653fa5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-mvsj8" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.354850 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bchqz\" (UniqueName: \"kubernetes.io/projected/277ca8b8-67f5-4fdb-ad34-648ad653fa5d-kube-api-access-bchqz\") pod \"marketplace-operator-79b997595-mvsj8\" (UID: \"277ca8b8-67f5-4fdb-ad34-648ad653fa5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-mvsj8" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.457953 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mvsj8" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.603715 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m967j" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.621427 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dwldz" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.663213 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.665158 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvz7c" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.688774 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ctv9" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.757337 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s4wc\" (UniqueName: \"kubernetes.io/projected/be55626c-4d34-4b09-83b0-897cd661216a-kube-api-access-5s4wc\") pod \"be55626c-4d34-4b09-83b0-897cd661216a\" (UID: \"be55626c-4d34-4b09-83b0-897cd661216a\") " Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.758224 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw768\" (UniqueName: \"kubernetes.io/projected/3857bbbe-1aa6-43d2-94e4-15f23929ac60-kube-api-access-gw768\") pod \"3857bbbe-1aa6-43d2-94e4-15f23929ac60\" (UID: \"3857bbbe-1aa6-43d2-94e4-15f23929ac60\") " Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.758251 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btqdp\" (UniqueName: \"kubernetes.io/projected/e1339920-3dec-4332-9749-ec66520252cb-kube-api-access-btqdp\") pod \"e1339920-3dec-4332-9749-ec66520252cb\" (UID: \"e1339920-3dec-4332-9749-ec66520252cb\") " Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.758288 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/be55626c-4d34-4b09-83b0-897cd661216a-utilities\") pod \"be55626c-4d34-4b09-83b0-897cd661216a\" (UID: \"be55626c-4d34-4b09-83b0-897cd661216a\") " Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.758331 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3857bbbe-1aa6-43d2-94e4-15f23929ac60-utilities\") pod \"3857bbbe-1aa6-43d2-94e4-15f23929ac60\" (UID: \"3857bbbe-1aa6-43d2-94e4-15f23929ac60\") " Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.758358 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7d67a491-1c7f-4898-bc78-a2a7d75278dc-marketplace-operator-metrics\") pod \"7d67a491-1c7f-4898-bc78-a2a7d75278dc\" (UID: \"7d67a491-1c7f-4898-bc78-a2a7d75278dc\") " Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.758378 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be55626c-4d34-4b09-83b0-897cd661216a-catalog-content\") pod \"be55626c-4d34-4b09-83b0-897cd661216a\" (UID: \"be55626c-4d34-4b09-83b0-897cd661216a\") " Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.758405 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3857bbbe-1aa6-43d2-94e4-15f23929ac60-catalog-content\") pod \"3857bbbe-1aa6-43d2-94e4-15f23929ac60\" (UID: \"3857bbbe-1aa6-43d2-94e4-15f23929ac60\") " Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.758472 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d67a491-1c7f-4898-bc78-a2a7d75278dc-marketplace-trusted-ca\") pod \"7d67a491-1c7f-4898-bc78-a2a7d75278dc\" (UID: \"7d67a491-1c7f-4898-bc78-a2a7d75278dc\") " Mar 17 00:28:20 crc 
kubenswrapper[4755]: I0317 00:28:20.758489 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1339920-3dec-4332-9749-ec66520252cb-catalog-content\") pod \"e1339920-3dec-4332-9749-ec66520252cb\" (UID: \"e1339920-3dec-4332-9749-ec66520252cb\") " Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.758508 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1339920-3dec-4332-9749-ec66520252cb-utilities\") pod \"e1339920-3dec-4332-9749-ec66520252cb\" (UID: \"e1339920-3dec-4332-9749-ec66520252cb\") " Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.758526 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhhdw\" (UniqueName: \"kubernetes.io/projected/7d67a491-1c7f-4898-bc78-a2a7d75278dc-kube-api-access-nhhdw\") pod \"7d67a491-1c7f-4898-bc78-a2a7d75278dc\" (UID: \"7d67a491-1c7f-4898-bc78-a2a7d75278dc\") " Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.763321 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d67a491-1c7f-4898-bc78-a2a7d75278dc-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "7d67a491-1c7f-4898-bc78-a2a7d75278dc" (UID: "7d67a491-1c7f-4898-bc78-a2a7d75278dc"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.764907 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be55626c-4d34-4b09-83b0-897cd661216a-kube-api-access-5s4wc" (OuterVolumeSpecName: "kube-api-access-5s4wc") pod "be55626c-4d34-4b09-83b0-897cd661216a" (UID: "be55626c-4d34-4b09-83b0-897cd661216a"). InnerVolumeSpecName "kube-api-access-5s4wc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.765791 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d67a491-1c7f-4898-bc78-a2a7d75278dc-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "7d67a491-1c7f-4898-bc78-a2a7d75278dc" (UID: "7d67a491-1c7f-4898-bc78-a2a7d75278dc"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.765998 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d67a491-1c7f-4898-bc78-a2a7d75278dc-kube-api-access-nhhdw" (OuterVolumeSpecName: "kube-api-access-nhhdw") pod "7d67a491-1c7f-4898-bc78-a2a7d75278dc" (UID: "7d67a491-1c7f-4898-bc78-a2a7d75278dc"). InnerVolumeSpecName "kube-api-access-nhhdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.766890 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3857bbbe-1aa6-43d2-94e4-15f23929ac60-utilities" (OuterVolumeSpecName: "utilities") pod "3857bbbe-1aa6-43d2-94e4-15f23929ac60" (UID: "3857bbbe-1aa6-43d2-94e4-15f23929ac60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.767429 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be55626c-4d34-4b09-83b0-897cd661216a-utilities" (OuterVolumeSpecName: "utilities") pod "be55626c-4d34-4b09-83b0-897cd661216a" (UID: "be55626c-4d34-4b09-83b0-897cd661216a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.767627 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3857bbbe-1aa6-43d2-94e4-15f23929ac60-kube-api-access-gw768" (OuterVolumeSpecName: "kube-api-access-gw768") pod "3857bbbe-1aa6-43d2-94e4-15f23929ac60" (UID: "3857bbbe-1aa6-43d2-94e4-15f23929ac60"). InnerVolumeSpecName "kube-api-access-gw768". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.768196 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1339920-3dec-4332-9749-ec66520252cb-utilities" (OuterVolumeSpecName: "utilities") pod "e1339920-3dec-4332-9749-ec66520252cb" (UID: "e1339920-3dec-4332-9749-ec66520252cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.769214 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1339920-3dec-4332-9749-ec66520252cb-kube-api-access-btqdp" (OuterVolumeSpecName: "kube-api-access-btqdp") pod "e1339920-3dec-4332-9749-ec66520252cb" (UID: "e1339920-3dec-4332-9749-ec66520252cb"). InnerVolumeSpecName "kube-api-access-btqdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.795981 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3857bbbe-1aa6-43d2-94e4-15f23929ac60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3857bbbe-1aa6-43d2-94e4-15f23929ac60" (UID: "3857bbbe-1aa6-43d2-94e4-15f23929ac60"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.830226 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be55626c-4d34-4b09-83b0-897cd661216a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be55626c-4d34-4b09-83b0-897cd661216a" (UID: "be55626c-4d34-4b09-83b0-897cd661216a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.833154 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1339920-3dec-4332-9749-ec66520252cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1339920-3dec-4332-9749-ec66520252cb" (UID: "e1339920-3dec-4332-9749-ec66520252cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.859736 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-catalog-content\") pod \"b20a58b5-4b64-4d7b-b9c2-c6170d75878e\" (UID: \"b20a58b5-4b64-4d7b-b9c2-c6170d75878e\") " Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.860048 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-utilities\") pod \"b20a58b5-4b64-4d7b-b9c2-c6170d75878e\" (UID: \"b20a58b5-4b64-4d7b-b9c2-c6170d75878e\") " Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.860114 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zttl\" (UniqueName: \"kubernetes.io/projected/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-kube-api-access-5zttl\") pod \"b20a58b5-4b64-4d7b-b9c2-c6170d75878e\" (UID: 
\"b20a58b5-4b64-4d7b-b9c2-c6170d75878e\") " Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.860577 4755 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7d67a491-1c7f-4898-bc78-a2a7d75278dc-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.860654 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be55626c-4d34-4b09-83b0-897cd661216a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.860682 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3857bbbe-1aa6-43d2-94e4-15f23929ac60-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.860736 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1339920-3dec-4332-9749-ec66520252cb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.860732 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-utilities" (OuterVolumeSpecName: "utilities") pod "b20a58b5-4b64-4d7b-b9c2-c6170d75878e" (UID: "b20a58b5-4b64-4d7b-b9c2-c6170d75878e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.860755 4755 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d67a491-1c7f-4898-bc78-a2a7d75278dc-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.860810 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1339920-3dec-4332-9749-ec66520252cb-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.860830 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhhdw\" (UniqueName: \"kubernetes.io/projected/7d67a491-1c7f-4898-bc78-a2a7d75278dc-kube-api-access-nhhdw\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.860846 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s4wc\" (UniqueName: \"kubernetes.io/projected/be55626c-4d34-4b09-83b0-897cd661216a-kube-api-access-5s4wc\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.860858 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw768\" (UniqueName: \"kubernetes.io/projected/3857bbbe-1aa6-43d2-94e4-15f23929ac60-kube-api-access-gw768\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.860871 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btqdp\" (UniqueName: \"kubernetes.io/projected/e1339920-3dec-4332-9749-ec66520252cb-kube-api-access-btqdp\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.860884 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be55626c-4d34-4b09-83b0-897cd661216a-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:20 
crc kubenswrapper[4755]: I0317 00:28:20.860897 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3857bbbe-1aa6-43d2-94e4-15f23929ac60-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.862813 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-kube-api-access-5zttl" (OuterVolumeSpecName: "kube-api-access-5zttl") pod "b20a58b5-4b64-4d7b-b9c2-c6170d75878e" (UID: "b20a58b5-4b64-4d7b-b9c2-c6170d75878e"). InnerVolumeSpecName "kube-api-access-5zttl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.964000 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.964071 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zttl\" (UniqueName: \"kubernetes.io/projected/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-kube-api-access-5zttl\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.984781 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mvsj8"] Mar 17 00:28:20 crc kubenswrapper[4755]: I0317 00:28:20.998722 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b20a58b5-4b64-4d7b-b9c2-c6170d75878e" (UID: "b20a58b5-4b64-4d7b-b9c2-c6170d75878e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.065155 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b20a58b5-4b64-4d7b-b9c2-c6170d75878e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.104708 4755 generic.go:334] "Generic (PLEG): container finished" podID="3857bbbe-1aa6-43d2-94e4-15f23929ac60" containerID="7bbca8ca31b4a7ab536f66c60d337ab09b90555414f028a8917e891380eca21d" exitCode=0 Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.104750 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvz7c" event={"ID":"3857bbbe-1aa6-43d2-94e4-15f23929ac60","Type":"ContainerDied","Data":"7bbca8ca31b4a7ab536f66c60d337ab09b90555414f028a8917e891380eca21d"} Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.104790 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvz7c" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.104855 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvz7c" event={"ID":"3857bbbe-1aa6-43d2-94e4-15f23929ac60","Type":"ContainerDied","Data":"1bb668ca2abb9cd31d98831ea0eb6fdca06888ba41056fbfb60fc61109a78ec3"} Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.104882 4755 scope.go:117] "RemoveContainer" containerID="7bbca8ca31b4a7ab536f66c60d337ab09b90555414f028a8917e891380eca21d" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.109224 4755 generic.go:334] "Generic (PLEG): container finished" podID="b20a58b5-4b64-4d7b-b9c2-c6170d75878e" containerID="abcd9f3c2446f0d0776b0f37949b763921ff095e3ad0a3f00cc8a54639ee0293" exitCode=0 Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.109286 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ctv9" event={"ID":"b20a58b5-4b64-4d7b-b9c2-c6170d75878e","Type":"ContainerDied","Data":"abcd9f3c2446f0d0776b0f37949b763921ff095e3ad0a3f00cc8a54639ee0293"} Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.109309 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ctv9" event={"ID":"b20a58b5-4b64-4d7b-b9c2-c6170d75878e","Type":"ContainerDied","Data":"f3bef55f66ff690d30b0c2faf490b12cd17b5500a1467ef1c7e83d5c12e4db6d"} Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.109368 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ctv9" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.113091 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.113149 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" event={"ID":"7d67a491-1c7f-4898-bc78-a2a7d75278dc","Type":"ContainerDied","Data":"a772e7ca39b55b1bbc52c118365ef011037a6b207610b693e2dde677b4428e2d"} Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.112992 4755 generic.go:334] "Generic (PLEG): container finished" podID="7d67a491-1c7f-4898-bc78-a2a7d75278dc" containerID="a772e7ca39b55b1bbc52c118365ef011037a6b207610b693e2dde677b4428e2d" exitCode=0 Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.119037 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tddtz" event={"ID":"7d67a491-1c7f-4898-bc78-a2a7d75278dc","Type":"ContainerDied","Data":"619fb26e1dc3d5a0deb3ec28a94ebb9285312a914ad71e648572c1abb5eed11d"} Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.122165 4755 scope.go:117] "RemoveContainer" containerID="4b880bf96bf5e87a502b33e9f046cf6b3d8aea0bef723df67f87e99ac06a96d2" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.128232 4755 generic.go:334] "Generic (PLEG): container finished" podID="be55626c-4d34-4b09-83b0-897cd661216a" containerID="a5f2bda6413b4e3d40328259045c4909479a516453f9b23a955b9457489754bb" exitCode=0 Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.128306 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m967j" event={"ID":"be55626c-4d34-4b09-83b0-897cd661216a","Type":"ContainerDied","Data":"a5f2bda6413b4e3d40328259045c4909479a516453f9b23a955b9457489754bb"} Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.128334 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m967j" 
event={"ID":"be55626c-4d34-4b09-83b0-897cd661216a","Type":"ContainerDied","Data":"be322b4f8f6db9468e86441bb73a57ec43d85c2302c1e83fe1396337b4f667d7"} Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.128335 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m967j" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.144731 4755 generic.go:334] "Generic (PLEG): container finished" podID="e1339920-3dec-4332-9749-ec66520252cb" containerID="d74ec6700a94558bf2e670abb4438b6780a67d785e1d784d296c8c5519223827" exitCode=0 Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.144787 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwldz" event={"ID":"e1339920-3dec-4332-9749-ec66520252cb","Type":"ContainerDied","Data":"d74ec6700a94558bf2e670abb4438b6780a67d785e1d784d296c8c5519223827"} Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.144813 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwldz" event={"ID":"e1339920-3dec-4332-9749-ec66520252cb","Type":"ContainerDied","Data":"72a363294e8a77273c5f3716c52af17809feee439b2ec7a15a83c590aa0e935d"} Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.144883 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dwldz" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.148753 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mvsj8" event={"ID":"277ca8b8-67f5-4fdb-ad34-648ad653fa5d","Type":"ContainerStarted","Data":"d320555baa40b769625915d258df0325797942799b3661d275a7dca31ac31adf"} Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.156291 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvz7c"] Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.162655 4755 scope.go:117] "RemoveContainer" containerID="20c122ee34f59df40962f1481cb183f82e73b38406278af0cc045e2a773dac91" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.169089 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvz7c"] Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.176964 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5ctv9"] Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.185023 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5ctv9"] Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.189650 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m967j"] Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.193403 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m967j"] Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.194738 4755 scope.go:117] "RemoveContainer" containerID="7bbca8ca31b4a7ab536f66c60d337ab09b90555414f028a8917e891380eca21d" Mar 17 00:28:21 crc kubenswrapper[4755]: E0317 00:28:21.195133 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7bbca8ca31b4a7ab536f66c60d337ab09b90555414f028a8917e891380eca21d\": container with ID starting with 7bbca8ca31b4a7ab536f66c60d337ab09b90555414f028a8917e891380eca21d not found: ID does not exist" containerID="7bbca8ca31b4a7ab536f66c60d337ab09b90555414f028a8917e891380eca21d" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.195174 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bbca8ca31b4a7ab536f66c60d337ab09b90555414f028a8917e891380eca21d"} err="failed to get container status \"7bbca8ca31b4a7ab536f66c60d337ab09b90555414f028a8917e891380eca21d\": rpc error: code = NotFound desc = could not find container \"7bbca8ca31b4a7ab536f66c60d337ab09b90555414f028a8917e891380eca21d\": container with ID starting with 7bbca8ca31b4a7ab536f66c60d337ab09b90555414f028a8917e891380eca21d not found: ID does not exist" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.195199 4755 scope.go:117] "RemoveContainer" containerID="4b880bf96bf5e87a502b33e9f046cf6b3d8aea0bef723df67f87e99ac06a96d2" Mar 17 00:28:21 crc kubenswrapper[4755]: E0317 00:28:21.195589 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b880bf96bf5e87a502b33e9f046cf6b3d8aea0bef723df67f87e99ac06a96d2\": container with ID starting with 4b880bf96bf5e87a502b33e9f046cf6b3d8aea0bef723df67f87e99ac06a96d2 not found: ID does not exist" containerID="4b880bf96bf5e87a502b33e9f046cf6b3d8aea0bef723df67f87e99ac06a96d2" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.195622 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b880bf96bf5e87a502b33e9f046cf6b3d8aea0bef723df67f87e99ac06a96d2"} err="failed to get container status \"4b880bf96bf5e87a502b33e9f046cf6b3d8aea0bef723df67f87e99ac06a96d2\": rpc error: code = NotFound desc = could not find container \"4b880bf96bf5e87a502b33e9f046cf6b3d8aea0bef723df67f87e99ac06a96d2\": container with ID 
starting with 4b880bf96bf5e87a502b33e9f046cf6b3d8aea0bef723df67f87e99ac06a96d2 not found: ID does not exist" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.195643 4755 scope.go:117] "RemoveContainer" containerID="20c122ee34f59df40962f1481cb183f82e73b38406278af0cc045e2a773dac91" Mar 17 00:28:21 crc kubenswrapper[4755]: E0317 00:28:21.195910 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20c122ee34f59df40962f1481cb183f82e73b38406278af0cc045e2a773dac91\": container with ID starting with 20c122ee34f59df40962f1481cb183f82e73b38406278af0cc045e2a773dac91 not found: ID does not exist" containerID="20c122ee34f59df40962f1481cb183f82e73b38406278af0cc045e2a773dac91" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.195939 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20c122ee34f59df40962f1481cb183f82e73b38406278af0cc045e2a773dac91"} err="failed to get container status \"20c122ee34f59df40962f1481cb183f82e73b38406278af0cc045e2a773dac91\": rpc error: code = NotFound desc = could not find container \"20c122ee34f59df40962f1481cb183f82e73b38406278af0cc045e2a773dac91\": container with ID starting with 20c122ee34f59df40962f1481cb183f82e73b38406278af0cc045e2a773dac91 not found: ID does not exist" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.195956 4755 scope.go:117] "RemoveContainer" containerID="abcd9f3c2446f0d0776b0f37949b763921ff095e3ad0a3f00cc8a54639ee0293" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.198978 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tddtz"] Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.209617 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tddtz"] Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.209668 4755 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-dwldz"] Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.211882 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dwldz"] Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.221572 4755 scope.go:117] "RemoveContainer" containerID="45c734a3e9740baa167d15e6ec64b6263a728cfff8ba5a75c391206b257fd2ea" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.263616 4755 scope.go:117] "RemoveContainer" containerID="586b4322426303b975a460c465e5ad0d203496e315f8a0e1e153341f9bf2d6c5" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.290592 4755 scope.go:117] "RemoveContainer" containerID="abcd9f3c2446f0d0776b0f37949b763921ff095e3ad0a3f00cc8a54639ee0293" Mar 17 00:28:21 crc kubenswrapper[4755]: E0317 00:28:21.291028 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abcd9f3c2446f0d0776b0f37949b763921ff095e3ad0a3f00cc8a54639ee0293\": container with ID starting with abcd9f3c2446f0d0776b0f37949b763921ff095e3ad0a3f00cc8a54639ee0293 not found: ID does not exist" containerID="abcd9f3c2446f0d0776b0f37949b763921ff095e3ad0a3f00cc8a54639ee0293" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.291057 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abcd9f3c2446f0d0776b0f37949b763921ff095e3ad0a3f00cc8a54639ee0293"} err="failed to get container status \"abcd9f3c2446f0d0776b0f37949b763921ff095e3ad0a3f00cc8a54639ee0293\": rpc error: code = NotFound desc = could not find container \"abcd9f3c2446f0d0776b0f37949b763921ff095e3ad0a3f00cc8a54639ee0293\": container with ID starting with abcd9f3c2446f0d0776b0f37949b763921ff095e3ad0a3f00cc8a54639ee0293 not found: ID does not exist" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.291086 4755 scope.go:117] "RemoveContainer" 
containerID="45c734a3e9740baa167d15e6ec64b6263a728cfff8ba5a75c391206b257fd2ea" Mar 17 00:28:21 crc kubenswrapper[4755]: E0317 00:28:21.291456 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45c734a3e9740baa167d15e6ec64b6263a728cfff8ba5a75c391206b257fd2ea\": container with ID starting with 45c734a3e9740baa167d15e6ec64b6263a728cfff8ba5a75c391206b257fd2ea not found: ID does not exist" containerID="45c734a3e9740baa167d15e6ec64b6263a728cfff8ba5a75c391206b257fd2ea" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.291500 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45c734a3e9740baa167d15e6ec64b6263a728cfff8ba5a75c391206b257fd2ea"} err="failed to get container status \"45c734a3e9740baa167d15e6ec64b6263a728cfff8ba5a75c391206b257fd2ea\": rpc error: code = NotFound desc = could not find container \"45c734a3e9740baa167d15e6ec64b6263a728cfff8ba5a75c391206b257fd2ea\": container with ID starting with 45c734a3e9740baa167d15e6ec64b6263a728cfff8ba5a75c391206b257fd2ea not found: ID does not exist" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.291532 4755 scope.go:117] "RemoveContainer" containerID="586b4322426303b975a460c465e5ad0d203496e315f8a0e1e153341f9bf2d6c5" Mar 17 00:28:21 crc kubenswrapper[4755]: E0317 00:28:21.291963 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"586b4322426303b975a460c465e5ad0d203496e315f8a0e1e153341f9bf2d6c5\": container with ID starting with 586b4322426303b975a460c465e5ad0d203496e315f8a0e1e153341f9bf2d6c5 not found: ID does not exist" containerID="586b4322426303b975a460c465e5ad0d203496e315f8a0e1e153341f9bf2d6c5" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.292019 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"586b4322426303b975a460c465e5ad0d203496e315f8a0e1e153341f9bf2d6c5"} err="failed to get container status \"586b4322426303b975a460c465e5ad0d203496e315f8a0e1e153341f9bf2d6c5\": rpc error: code = NotFound desc = could not find container \"586b4322426303b975a460c465e5ad0d203496e315f8a0e1e153341f9bf2d6c5\": container with ID starting with 586b4322426303b975a460c465e5ad0d203496e315f8a0e1e153341f9bf2d6c5 not found: ID does not exist" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.292053 4755 scope.go:117] "RemoveContainer" containerID="a772e7ca39b55b1bbc52c118365ef011037a6b207610b693e2dde677b4428e2d" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.305379 4755 scope.go:117] "RemoveContainer" containerID="a772e7ca39b55b1bbc52c118365ef011037a6b207610b693e2dde677b4428e2d" Mar 17 00:28:21 crc kubenswrapper[4755]: E0317 00:28:21.305953 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a772e7ca39b55b1bbc52c118365ef011037a6b207610b693e2dde677b4428e2d\": container with ID starting with a772e7ca39b55b1bbc52c118365ef011037a6b207610b693e2dde677b4428e2d not found: ID does not exist" containerID="a772e7ca39b55b1bbc52c118365ef011037a6b207610b693e2dde677b4428e2d" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.306009 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a772e7ca39b55b1bbc52c118365ef011037a6b207610b693e2dde677b4428e2d"} err="failed to get container status \"a772e7ca39b55b1bbc52c118365ef011037a6b207610b693e2dde677b4428e2d\": rpc error: code = NotFound desc = could not find container \"a772e7ca39b55b1bbc52c118365ef011037a6b207610b693e2dde677b4428e2d\": container with ID starting with a772e7ca39b55b1bbc52c118365ef011037a6b207610b693e2dde677b4428e2d not found: ID does not exist" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.306046 4755 scope.go:117] "RemoveContainer" 
containerID="a5f2bda6413b4e3d40328259045c4909479a516453f9b23a955b9457489754bb" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.320661 4755 scope.go:117] "RemoveContainer" containerID="29fa6ae93f4ebd3fcda31482144f1a66daa5e8c56cecfa98a55c29bede086ee3" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.334513 4755 scope.go:117] "RemoveContainer" containerID="6265e2633ed9bf3d525ef3ba1fdb94d776a87a7139953df97cab4279f73026ec" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.350929 4755 scope.go:117] "RemoveContainer" containerID="a5f2bda6413b4e3d40328259045c4909479a516453f9b23a955b9457489754bb" Mar 17 00:28:21 crc kubenswrapper[4755]: E0317 00:28:21.351390 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5f2bda6413b4e3d40328259045c4909479a516453f9b23a955b9457489754bb\": container with ID starting with a5f2bda6413b4e3d40328259045c4909479a516453f9b23a955b9457489754bb not found: ID does not exist" containerID="a5f2bda6413b4e3d40328259045c4909479a516453f9b23a955b9457489754bb" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.351506 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f2bda6413b4e3d40328259045c4909479a516453f9b23a955b9457489754bb"} err="failed to get container status \"a5f2bda6413b4e3d40328259045c4909479a516453f9b23a955b9457489754bb\": rpc error: code = NotFound desc = could not find container \"a5f2bda6413b4e3d40328259045c4909479a516453f9b23a955b9457489754bb\": container with ID starting with a5f2bda6413b4e3d40328259045c4909479a516453f9b23a955b9457489754bb not found: ID does not exist" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.351551 4755 scope.go:117] "RemoveContainer" containerID="29fa6ae93f4ebd3fcda31482144f1a66daa5e8c56cecfa98a55c29bede086ee3" Mar 17 00:28:21 crc kubenswrapper[4755]: E0317 00:28:21.352078 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"29fa6ae93f4ebd3fcda31482144f1a66daa5e8c56cecfa98a55c29bede086ee3\": container with ID starting with 29fa6ae93f4ebd3fcda31482144f1a66daa5e8c56cecfa98a55c29bede086ee3 not found: ID does not exist" containerID="29fa6ae93f4ebd3fcda31482144f1a66daa5e8c56cecfa98a55c29bede086ee3" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.352120 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29fa6ae93f4ebd3fcda31482144f1a66daa5e8c56cecfa98a55c29bede086ee3"} err="failed to get container status \"29fa6ae93f4ebd3fcda31482144f1a66daa5e8c56cecfa98a55c29bede086ee3\": rpc error: code = NotFound desc = could not find container \"29fa6ae93f4ebd3fcda31482144f1a66daa5e8c56cecfa98a55c29bede086ee3\": container with ID starting with 29fa6ae93f4ebd3fcda31482144f1a66daa5e8c56cecfa98a55c29bede086ee3 not found: ID does not exist" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.352156 4755 scope.go:117] "RemoveContainer" containerID="6265e2633ed9bf3d525ef3ba1fdb94d776a87a7139953df97cab4279f73026ec" Mar 17 00:28:21 crc kubenswrapper[4755]: E0317 00:28:21.352597 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6265e2633ed9bf3d525ef3ba1fdb94d776a87a7139953df97cab4279f73026ec\": container with ID starting with 6265e2633ed9bf3d525ef3ba1fdb94d776a87a7139953df97cab4279f73026ec not found: ID does not exist" containerID="6265e2633ed9bf3d525ef3ba1fdb94d776a87a7139953df97cab4279f73026ec" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.352649 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6265e2633ed9bf3d525ef3ba1fdb94d776a87a7139953df97cab4279f73026ec"} err="failed to get container status \"6265e2633ed9bf3d525ef3ba1fdb94d776a87a7139953df97cab4279f73026ec\": rpc error: code = NotFound desc = could not find container 
\"6265e2633ed9bf3d525ef3ba1fdb94d776a87a7139953df97cab4279f73026ec\": container with ID starting with 6265e2633ed9bf3d525ef3ba1fdb94d776a87a7139953df97cab4279f73026ec not found: ID does not exist" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.352679 4755 scope.go:117] "RemoveContainer" containerID="d74ec6700a94558bf2e670abb4438b6780a67d785e1d784d296c8c5519223827" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.368181 4755 scope.go:117] "RemoveContainer" containerID="d68eff44d3c25774e188230f8742340ea9247c03af92d9854059330950d0ded5" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.388160 4755 scope.go:117] "RemoveContainer" containerID="2bc8fda5849d44dd5480472677c6baf9fc7db608c976d1c97bef6a5d425b4c20" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.405820 4755 scope.go:117] "RemoveContainer" containerID="d74ec6700a94558bf2e670abb4438b6780a67d785e1d784d296c8c5519223827" Mar 17 00:28:21 crc kubenswrapper[4755]: E0317 00:28:21.406281 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d74ec6700a94558bf2e670abb4438b6780a67d785e1d784d296c8c5519223827\": container with ID starting with d74ec6700a94558bf2e670abb4438b6780a67d785e1d784d296c8c5519223827 not found: ID does not exist" containerID="d74ec6700a94558bf2e670abb4438b6780a67d785e1d784d296c8c5519223827" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.406313 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d74ec6700a94558bf2e670abb4438b6780a67d785e1d784d296c8c5519223827"} err="failed to get container status \"d74ec6700a94558bf2e670abb4438b6780a67d785e1d784d296c8c5519223827\": rpc error: code = NotFound desc = could not find container \"d74ec6700a94558bf2e670abb4438b6780a67d785e1d784d296c8c5519223827\": container with ID starting with d74ec6700a94558bf2e670abb4438b6780a67d785e1d784d296c8c5519223827 not found: ID does not exist" Mar 17 00:28:21 crc 
kubenswrapper[4755]: I0317 00:28:21.406335 4755 scope.go:117] "RemoveContainer" containerID="d68eff44d3c25774e188230f8742340ea9247c03af92d9854059330950d0ded5" Mar 17 00:28:21 crc kubenswrapper[4755]: E0317 00:28:21.406641 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d68eff44d3c25774e188230f8742340ea9247c03af92d9854059330950d0ded5\": container with ID starting with d68eff44d3c25774e188230f8742340ea9247c03af92d9854059330950d0ded5 not found: ID does not exist" containerID="d68eff44d3c25774e188230f8742340ea9247c03af92d9854059330950d0ded5" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.406674 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d68eff44d3c25774e188230f8742340ea9247c03af92d9854059330950d0ded5"} err="failed to get container status \"d68eff44d3c25774e188230f8742340ea9247c03af92d9854059330950d0ded5\": rpc error: code = NotFound desc = could not find container \"d68eff44d3c25774e188230f8742340ea9247c03af92d9854059330950d0ded5\": container with ID starting with d68eff44d3c25774e188230f8742340ea9247c03af92d9854059330950d0ded5 not found: ID does not exist" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.406693 4755 scope.go:117] "RemoveContainer" containerID="2bc8fda5849d44dd5480472677c6baf9fc7db608c976d1c97bef6a5d425b4c20" Mar 17 00:28:21 crc kubenswrapper[4755]: E0317 00:28:21.406963 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bc8fda5849d44dd5480472677c6baf9fc7db608c976d1c97bef6a5d425b4c20\": container with ID starting with 2bc8fda5849d44dd5480472677c6baf9fc7db608c976d1c97bef6a5d425b4c20 not found: ID does not exist" containerID="2bc8fda5849d44dd5480472677c6baf9fc7db608c976d1c97bef6a5d425b4c20" Mar 17 00:28:21 crc kubenswrapper[4755]: I0317 00:28:21.406987 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2bc8fda5849d44dd5480472677c6baf9fc7db608c976d1c97bef6a5d425b4c20"} err="failed to get container status \"2bc8fda5849d44dd5480472677c6baf9fc7db608c976d1c97bef6a5d425b4c20\": rpc error: code = NotFound desc = could not find container \"2bc8fda5849d44dd5480472677c6baf9fc7db608c976d1c97bef6a5d425b4c20\": container with ID starting with 2bc8fda5849d44dd5480472677c6baf9fc7db608c976d1c97bef6a5d425b4c20 not found: ID does not exist" Mar 17 00:28:22 crc kubenswrapper[4755]: I0317 00:28:22.156406 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mvsj8" event={"ID":"277ca8b8-67f5-4fdb-ad34-648ad653fa5d","Type":"ContainerStarted","Data":"cec38a93397a71f86fd19efca8bd23ee29e7d3bf04ac16cea071e836eca5fe58"} Mar 17 00:28:22 crc kubenswrapper[4755]: I0317 00:28:22.156824 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-mvsj8" Mar 17 00:28:22 crc kubenswrapper[4755]: I0317 00:28:22.159991 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-mvsj8" Mar 17 00:28:22 crc kubenswrapper[4755]: I0317 00:28:22.177138 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-mvsj8" podStartSLOduration=2.177123243 podStartE2EDuration="2.177123243s" podCreationTimestamp="2026-03-17 00:28:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:28:22.174397647 +0000 UTC m=+376.933849940" watchObservedRunningTime="2026-03-17 00:28:22.177123243 +0000 UTC m=+376.936575526" Mar 17 00:28:22 crc kubenswrapper[4755]: I0317 00:28:22.261496 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3857bbbe-1aa6-43d2-94e4-15f23929ac60" 
path="/var/lib/kubelet/pods/3857bbbe-1aa6-43d2-94e4-15f23929ac60/volumes" Mar 17 00:28:22 crc kubenswrapper[4755]: I0317 00:28:22.262189 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d67a491-1c7f-4898-bc78-a2a7d75278dc" path="/var/lib/kubelet/pods/7d67a491-1c7f-4898-bc78-a2a7d75278dc/volumes" Mar 17 00:28:22 crc kubenswrapper[4755]: I0317 00:28:22.262696 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b20a58b5-4b64-4d7b-b9c2-c6170d75878e" path="/var/lib/kubelet/pods/b20a58b5-4b64-4d7b-b9c2-c6170d75878e/volumes" Mar 17 00:28:22 crc kubenswrapper[4755]: I0317 00:28:22.263647 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be55626c-4d34-4b09-83b0-897cd661216a" path="/var/lib/kubelet/pods/be55626c-4d34-4b09-83b0-897cd661216a/volumes" Mar 17 00:28:22 crc kubenswrapper[4755]: I0317 00:28:22.264167 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1339920-3dec-4332-9749-ec66520252cb" path="/var/lib/kubelet/pods/e1339920-3dec-4332-9749-ec66520252cb/volumes" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.296583 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p727d"] Mar 17 00:28:23 crc kubenswrapper[4755]: E0317 00:28:23.296817 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be55626c-4d34-4b09-83b0-897cd661216a" containerName="extract-utilities" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.296832 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="be55626c-4d34-4b09-83b0-897cd661216a" containerName="extract-utilities" Mar 17 00:28:23 crc kubenswrapper[4755]: E0317 00:28:23.296849 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3857bbbe-1aa6-43d2-94e4-15f23929ac60" containerName="registry-server" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.296857 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3857bbbe-1aa6-43d2-94e4-15f23929ac60" containerName="registry-server" Mar 17 00:28:23 crc kubenswrapper[4755]: E0317 00:28:23.296871 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3857bbbe-1aa6-43d2-94e4-15f23929ac60" containerName="extract-utilities" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.296881 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3857bbbe-1aa6-43d2-94e4-15f23929ac60" containerName="extract-utilities" Mar 17 00:28:23 crc kubenswrapper[4755]: E0317 00:28:23.296891 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b20a58b5-4b64-4d7b-b9c2-c6170d75878e" containerName="registry-server" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.296898 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20a58b5-4b64-4d7b-b9c2-c6170d75878e" containerName="registry-server" Mar 17 00:28:23 crc kubenswrapper[4755]: E0317 00:28:23.296909 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d67a491-1c7f-4898-bc78-a2a7d75278dc" containerName="marketplace-operator" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.296916 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d67a491-1c7f-4898-bc78-a2a7d75278dc" containerName="marketplace-operator" Mar 17 00:28:23 crc kubenswrapper[4755]: E0317 00:28:23.296929 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b20a58b5-4b64-4d7b-b9c2-c6170d75878e" containerName="extract-utilities" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.296935 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20a58b5-4b64-4d7b-b9c2-c6170d75878e" containerName="extract-utilities" Mar 17 00:28:23 crc kubenswrapper[4755]: E0317 00:28:23.296946 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3857bbbe-1aa6-43d2-94e4-15f23929ac60" containerName="extract-content" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.296953 4755 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3857bbbe-1aa6-43d2-94e4-15f23929ac60" containerName="extract-content" Mar 17 00:28:23 crc kubenswrapper[4755]: E0317 00:28:23.296963 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1339920-3dec-4332-9749-ec66520252cb" containerName="extract-content" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.296970 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1339920-3dec-4332-9749-ec66520252cb" containerName="extract-content" Mar 17 00:28:23 crc kubenswrapper[4755]: E0317 00:28:23.296980 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1339920-3dec-4332-9749-ec66520252cb" containerName="registry-server" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.296988 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1339920-3dec-4332-9749-ec66520252cb" containerName="registry-server" Mar 17 00:28:23 crc kubenswrapper[4755]: E0317 00:28:23.297000 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b20a58b5-4b64-4d7b-b9c2-c6170d75878e" containerName="extract-content" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.297007 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20a58b5-4b64-4d7b-b9c2-c6170d75878e" containerName="extract-content" Mar 17 00:28:23 crc kubenswrapper[4755]: E0317 00:28:23.297018 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1339920-3dec-4332-9749-ec66520252cb" containerName="extract-utilities" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.297026 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1339920-3dec-4332-9749-ec66520252cb" containerName="extract-utilities" Mar 17 00:28:23 crc kubenswrapper[4755]: E0317 00:28:23.297037 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be55626c-4d34-4b09-83b0-897cd661216a" containerName="extract-content" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.297044 4755 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="be55626c-4d34-4b09-83b0-897cd661216a" containerName="extract-content" Mar 17 00:28:23 crc kubenswrapper[4755]: E0317 00:28:23.297054 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be55626c-4d34-4b09-83b0-897cd661216a" containerName="registry-server" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.297060 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="be55626c-4d34-4b09-83b0-897cd661216a" containerName="registry-server" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.297162 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b20a58b5-4b64-4d7b-b9c2-c6170d75878e" containerName="registry-server" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.297178 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3857bbbe-1aa6-43d2-94e4-15f23929ac60" containerName="registry-server" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.297191 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d67a491-1c7f-4898-bc78-a2a7d75278dc" containerName="marketplace-operator" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.297201 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="be55626c-4d34-4b09-83b0-897cd661216a" containerName="registry-server" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.297208 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1339920-3dec-4332-9749-ec66520252cb" containerName="registry-server" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.298083 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p727d" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.300352 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.309760 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p727d"] Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.392502 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j99xh\" (UniqueName: \"kubernetes.io/projected/ac12f55c-136b-4cf3-aae6-dca7f5353189-kube-api-access-j99xh\") pod \"redhat-marketplace-p727d\" (UID: \"ac12f55c-136b-4cf3-aae6-dca7f5353189\") " pod="openshift-marketplace/redhat-marketplace-p727d" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.392578 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac12f55c-136b-4cf3-aae6-dca7f5353189-catalog-content\") pod \"redhat-marketplace-p727d\" (UID: \"ac12f55c-136b-4cf3-aae6-dca7f5353189\") " pod="openshift-marketplace/redhat-marketplace-p727d" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.392651 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac12f55c-136b-4cf3-aae6-dca7f5353189-utilities\") pod \"redhat-marketplace-p727d\" (UID: \"ac12f55c-136b-4cf3-aae6-dca7f5353189\") " pod="openshift-marketplace/redhat-marketplace-p727d" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.493459 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac12f55c-136b-4cf3-aae6-dca7f5353189-utilities\") pod \"redhat-marketplace-p727d\" (UID: 
\"ac12f55c-136b-4cf3-aae6-dca7f5353189\") " pod="openshift-marketplace/redhat-marketplace-p727d" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.493512 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j99xh\" (UniqueName: \"kubernetes.io/projected/ac12f55c-136b-4cf3-aae6-dca7f5353189-kube-api-access-j99xh\") pod \"redhat-marketplace-p727d\" (UID: \"ac12f55c-136b-4cf3-aae6-dca7f5353189\") " pod="openshift-marketplace/redhat-marketplace-p727d" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.493542 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac12f55c-136b-4cf3-aae6-dca7f5353189-catalog-content\") pod \"redhat-marketplace-p727d\" (UID: \"ac12f55c-136b-4cf3-aae6-dca7f5353189\") " pod="openshift-marketplace/redhat-marketplace-p727d" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.493922 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac12f55c-136b-4cf3-aae6-dca7f5353189-catalog-content\") pod \"redhat-marketplace-p727d\" (UID: \"ac12f55c-136b-4cf3-aae6-dca7f5353189\") " pod="openshift-marketplace/redhat-marketplace-p727d" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.494138 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac12f55c-136b-4cf3-aae6-dca7f5353189-utilities\") pod \"redhat-marketplace-p727d\" (UID: \"ac12f55c-136b-4cf3-aae6-dca7f5353189\") " pod="openshift-marketplace/redhat-marketplace-p727d" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.497046 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5g7nk"] Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.499183 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5g7nk" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.503032 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.506829 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5g7nk"] Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.522677 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j99xh\" (UniqueName: \"kubernetes.io/projected/ac12f55c-136b-4cf3-aae6-dca7f5353189-kube-api-access-j99xh\") pod \"redhat-marketplace-p727d\" (UID: \"ac12f55c-136b-4cf3-aae6-dca7f5353189\") " pod="openshift-marketplace/redhat-marketplace-p727d" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.594985 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-catalog-content\") pod \"redhat-operators-5g7nk\" (UID: \"977e1302-92e4-4a2d-bdc1-71027bd1ac1c\") " pod="openshift-marketplace/redhat-operators-5g7nk" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.595367 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-utilities\") pod \"redhat-operators-5g7nk\" (UID: \"977e1302-92e4-4a2d-bdc1-71027bd1ac1c\") " pod="openshift-marketplace/redhat-operators-5g7nk" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.595415 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnv7w\" (UniqueName: \"kubernetes.io/projected/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-kube-api-access-lnv7w\") pod \"redhat-operators-5g7nk\" (UID: 
\"977e1302-92e4-4a2d-bdc1-71027bd1ac1c\") " pod="openshift-marketplace/redhat-operators-5g7nk" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.614235 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p727d" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.696260 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-catalog-content\") pod \"redhat-operators-5g7nk\" (UID: \"977e1302-92e4-4a2d-bdc1-71027bd1ac1c\") " pod="openshift-marketplace/redhat-operators-5g7nk" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.696318 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-utilities\") pod \"redhat-operators-5g7nk\" (UID: \"977e1302-92e4-4a2d-bdc1-71027bd1ac1c\") " pod="openshift-marketplace/redhat-operators-5g7nk" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.696342 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnv7w\" (UniqueName: \"kubernetes.io/projected/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-kube-api-access-lnv7w\") pod \"redhat-operators-5g7nk\" (UID: \"977e1302-92e4-4a2d-bdc1-71027bd1ac1c\") " pod="openshift-marketplace/redhat-operators-5g7nk" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.696983 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-catalog-content\") pod \"redhat-operators-5g7nk\" (UID: \"977e1302-92e4-4a2d-bdc1-71027bd1ac1c\") " pod="openshift-marketplace/redhat-operators-5g7nk" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.697139 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-utilities\") pod \"redhat-operators-5g7nk\" (UID: \"977e1302-92e4-4a2d-bdc1-71027bd1ac1c\") " pod="openshift-marketplace/redhat-operators-5g7nk" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.726945 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnv7w\" (UniqueName: \"kubernetes.io/projected/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-kube-api-access-lnv7w\") pod \"redhat-operators-5g7nk\" (UID: \"977e1302-92e4-4a2d-bdc1-71027bd1ac1c\") " pod="openshift-marketplace/redhat-operators-5g7nk" Mar 17 00:28:23 crc kubenswrapper[4755]: I0317 00:28:23.839631 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5g7nk" Mar 17 00:28:24 crc kubenswrapper[4755]: I0317 00:28:24.030960 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p727d"] Mar 17 00:28:24 crc kubenswrapper[4755]: I0317 00:28:24.088569 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5g7nk"] Mar 17 00:28:24 crc kubenswrapper[4755]: I0317 00:28:24.176722 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p727d" event={"ID":"ac12f55c-136b-4cf3-aae6-dca7f5353189","Type":"ContainerStarted","Data":"c1a9e6f2209924c71dd38e7f08893f695134629f7ffdb62ecb26891f93cd643a"} Mar 17 00:28:24 crc kubenswrapper[4755]: I0317 00:28:24.176783 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p727d" event={"ID":"ac12f55c-136b-4cf3-aae6-dca7f5353189","Type":"ContainerStarted","Data":"b05f6ebe58f9389a9d85019ed674b407fa8d8c156189e43e2b7712846f162fba"} Mar 17 00:28:24 crc kubenswrapper[4755]: I0317 00:28:24.179564 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g7nk" 
event={"ID":"977e1302-92e4-4a2d-bdc1-71027bd1ac1c","Type":"ContainerStarted","Data":"f790818efa63722105688585dff507dec2d0535c2eea25b8a5c9a6e2f8dd819b"} Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.186489 4755 generic.go:334] "Generic (PLEG): container finished" podID="ac12f55c-136b-4cf3-aae6-dca7f5353189" containerID="c1a9e6f2209924c71dd38e7f08893f695134629f7ffdb62ecb26891f93cd643a" exitCode=0 Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.186598 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p727d" event={"ID":"ac12f55c-136b-4cf3-aae6-dca7f5353189","Type":"ContainerDied","Data":"c1a9e6f2209924c71dd38e7f08893f695134629f7ffdb62ecb26891f93cd643a"} Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.188063 4755 generic.go:334] "Generic (PLEG): container finished" podID="977e1302-92e4-4a2d-bdc1-71027bd1ac1c" containerID="eff83594d1ec24daf6492fb27a5471811f3679e716cf509b99cd80b71976c7d2" exitCode=0 Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.188103 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g7nk" event={"ID":"977e1302-92e4-4a2d-bdc1-71027bd1ac1c","Type":"ContainerDied","Data":"eff83594d1ec24daf6492fb27a5471811f3679e716cf509b99cd80b71976c7d2"} Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.707164 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bc8sz"] Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.709665 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bc8sz" Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.711846 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.716780 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bc8sz"] Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.826059 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbm4r\" (UniqueName: \"kubernetes.io/projected/bf109104-00dd-4525-a6e2-31cfccf54d5d-kube-api-access-fbm4r\") pod \"certified-operators-bc8sz\" (UID: \"bf109104-00dd-4525-a6e2-31cfccf54d5d\") " pod="openshift-marketplace/certified-operators-bc8sz" Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.826189 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf109104-00dd-4525-a6e2-31cfccf54d5d-catalog-content\") pod \"certified-operators-bc8sz\" (UID: \"bf109104-00dd-4525-a6e2-31cfccf54d5d\") " pod="openshift-marketplace/certified-operators-bc8sz" Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.826288 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf109104-00dd-4525-a6e2-31cfccf54d5d-utilities\") pod \"certified-operators-bc8sz\" (UID: \"bf109104-00dd-4525-a6e2-31cfccf54d5d\") " pod="openshift-marketplace/certified-operators-bc8sz" Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.911997 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-95x6m"] Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.916157 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-95x6m" Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.918173 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.927241 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf109104-00dd-4525-a6e2-31cfccf54d5d-utilities\") pod \"certified-operators-bc8sz\" (UID: \"bf109104-00dd-4525-a6e2-31cfccf54d5d\") " pod="openshift-marketplace/certified-operators-bc8sz" Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.927340 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbm4r\" (UniqueName: \"kubernetes.io/projected/bf109104-00dd-4525-a6e2-31cfccf54d5d-kube-api-access-fbm4r\") pod \"certified-operators-bc8sz\" (UID: \"bf109104-00dd-4525-a6e2-31cfccf54d5d\") " pod="openshift-marketplace/certified-operators-bc8sz" Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.927392 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf109104-00dd-4525-a6e2-31cfccf54d5d-catalog-content\") pod \"certified-operators-bc8sz\" (UID: \"bf109104-00dd-4525-a6e2-31cfccf54d5d\") " pod="openshift-marketplace/certified-operators-bc8sz" Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.928011 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf109104-00dd-4525-a6e2-31cfccf54d5d-catalog-content\") pod \"certified-operators-bc8sz\" (UID: \"bf109104-00dd-4525-a6e2-31cfccf54d5d\") " pod="openshift-marketplace/certified-operators-bc8sz" Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.928082 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf109104-00dd-4525-a6e2-31cfccf54d5d-utilities\") pod \"certified-operators-bc8sz\" (UID: \"bf109104-00dd-4525-a6e2-31cfccf54d5d\") " pod="openshift-marketplace/certified-operators-bc8sz" Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.929366 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95x6m"] Mar 17 00:28:25 crc kubenswrapper[4755]: I0317 00:28:25.954986 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbm4r\" (UniqueName: \"kubernetes.io/projected/bf109104-00dd-4525-a6e2-31cfccf54d5d-kube-api-access-fbm4r\") pod \"certified-operators-bc8sz\" (UID: \"bf109104-00dd-4525-a6e2-31cfccf54d5d\") " pod="openshift-marketplace/certified-operators-bc8sz" Mar 17 00:28:26 crc kubenswrapper[4755]: I0317 00:28:26.029097 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d94368-96d9-44da-ac5a-af29d6b0d3df-utilities\") pod \"community-operators-95x6m\" (UID: \"e6d94368-96d9-44da-ac5a-af29d6b0d3df\") " pod="openshift-marketplace/community-operators-95x6m" Mar 17 00:28:26 crc kubenswrapper[4755]: I0317 00:28:26.029464 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d94368-96d9-44da-ac5a-af29d6b0d3df-catalog-content\") pod \"community-operators-95x6m\" (UID: \"e6d94368-96d9-44da-ac5a-af29d6b0d3df\") " pod="openshift-marketplace/community-operators-95x6m" Mar 17 00:28:26 crc kubenswrapper[4755]: I0317 00:28:26.029532 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km7bk\" (UniqueName: \"kubernetes.io/projected/e6d94368-96d9-44da-ac5a-af29d6b0d3df-kube-api-access-km7bk\") pod \"community-operators-95x6m\" (UID: 
\"e6d94368-96d9-44da-ac5a-af29d6b0d3df\") " pod="openshift-marketplace/community-operators-95x6m" Mar 17 00:28:26 crc kubenswrapper[4755]: I0317 00:28:26.034004 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bc8sz" Mar 17 00:28:26 crc kubenswrapper[4755]: I0317 00:28:26.130239 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km7bk\" (UniqueName: \"kubernetes.io/projected/e6d94368-96d9-44da-ac5a-af29d6b0d3df-kube-api-access-km7bk\") pod \"community-operators-95x6m\" (UID: \"e6d94368-96d9-44da-ac5a-af29d6b0d3df\") " pod="openshift-marketplace/community-operators-95x6m" Mar 17 00:28:26 crc kubenswrapper[4755]: I0317 00:28:26.130318 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d94368-96d9-44da-ac5a-af29d6b0d3df-utilities\") pod \"community-operators-95x6m\" (UID: \"e6d94368-96d9-44da-ac5a-af29d6b0d3df\") " pod="openshift-marketplace/community-operators-95x6m" Mar 17 00:28:26 crc kubenswrapper[4755]: I0317 00:28:26.130338 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d94368-96d9-44da-ac5a-af29d6b0d3df-catalog-content\") pod \"community-operators-95x6m\" (UID: \"e6d94368-96d9-44da-ac5a-af29d6b0d3df\") " pod="openshift-marketplace/community-operators-95x6m" Mar 17 00:28:26 crc kubenswrapper[4755]: I0317 00:28:26.131008 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d94368-96d9-44da-ac5a-af29d6b0d3df-catalog-content\") pod \"community-operators-95x6m\" (UID: \"e6d94368-96d9-44da-ac5a-af29d6b0d3df\") " pod="openshift-marketplace/community-operators-95x6m" Mar 17 00:28:26 crc kubenswrapper[4755]: I0317 00:28:26.131474 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d94368-96d9-44da-ac5a-af29d6b0d3df-utilities\") pod \"community-operators-95x6m\" (UID: \"e6d94368-96d9-44da-ac5a-af29d6b0d3df\") " pod="openshift-marketplace/community-operators-95x6m" Mar 17 00:28:26 crc kubenswrapper[4755]: I0317 00:28:26.181056 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km7bk\" (UniqueName: \"kubernetes.io/projected/e6d94368-96d9-44da-ac5a-af29d6b0d3df-kube-api-access-km7bk\") pod \"community-operators-95x6m\" (UID: \"e6d94368-96d9-44da-ac5a-af29d6b0d3df\") " pod="openshift-marketplace/community-operators-95x6m" Mar 17 00:28:26 crc kubenswrapper[4755]: I0317 00:28:26.197538 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p727d" event={"ID":"ac12f55c-136b-4cf3-aae6-dca7f5353189","Type":"ContainerStarted","Data":"a98c6e45af24facc1f7414aa93e3e3bdcb5798a3851dc59d9d7c08dda718bfcd"} Mar 17 00:28:26 crc kubenswrapper[4755]: I0317 00:28:26.200677 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g7nk" event={"ID":"977e1302-92e4-4a2d-bdc1-71027bd1ac1c","Type":"ContainerStarted","Data":"ccb7245c4f7b11f637f23b8f323e90f2e38dd035813885cc4e344be479345c16"} Mar 17 00:28:26 crc kubenswrapper[4755]: I0317 00:28:26.321065 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-95x6m" Mar 17 00:28:26 crc kubenswrapper[4755]: I0317 00:28:26.521319 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95x6m"] Mar 17 00:28:26 crc kubenswrapper[4755]: I0317 00:28:26.527333 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bc8sz"] Mar 17 00:28:26 crc kubenswrapper[4755]: W0317 00:28:26.527488 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d94368_96d9_44da_ac5a_af29d6b0d3df.slice/crio-3b72e398562c5d08f6b94cdde19fca2a8ded851782286e2245ac04c4bdfd488a WatchSource:0}: Error finding container 3b72e398562c5d08f6b94cdde19fca2a8ded851782286e2245ac04c4bdfd488a: Status 404 returned error can't find the container with id 3b72e398562c5d08f6b94cdde19fca2a8ded851782286e2245ac04c4bdfd488a Mar 17 00:28:26 crc kubenswrapper[4755]: W0317 00:28:26.531170 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf109104_00dd_4525_a6e2_31cfccf54d5d.slice/crio-491b775dc469d3a6a501c51ec9684cae0274d949757c215a345e2b3e95984910 WatchSource:0}: Error finding container 491b775dc469d3a6a501c51ec9684cae0274d949757c215a345e2b3e95984910: Status 404 returned error can't find the container with id 491b775dc469d3a6a501c51ec9684cae0274d949757c215a345e2b3e95984910 Mar 17 00:28:27 crc kubenswrapper[4755]: I0317 00:28:27.207531 4755 generic.go:334] "Generic (PLEG): container finished" podID="977e1302-92e4-4a2d-bdc1-71027bd1ac1c" containerID="ccb7245c4f7b11f637f23b8f323e90f2e38dd035813885cc4e344be479345c16" exitCode=0 Mar 17 00:28:27 crc kubenswrapper[4755]: I0317 00:28:27.207604 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g7nk" 
event={"ID":"977e1302-92e4-4a2d-bdc1-71027bd1ac1c","Type":"ContainerDied","Data":"ccb7245c4f7b11f637f23b8f323e90f2e38dd035813885cc4e344be479345c16"} Mar 17 00:28:27 crc kubenswrapper[4755]: I0317 00:28:27.209078 4755 generic.go:334] "Generic (PLEG): container finished" podID="e6d94368-96d9-44da-ac5a-af29d6b0d3df" containerID="fe3349d5e4e7a6d140f96ac22d7c4cae9134844b89cd34d59fbadc280b59c5d3" exitCode=0 Mar 17 00:28:27 crc kubenswrapper[4755]: I0317 00:28:27.209163 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95x6m" event={"ID":"e6d94368-96d9-44da-ac5a-af29d6b0d3df","Type":"ContainerDied","Data":"fe3349d5e4e7a6d140f96ac22d7c4cae9134844b89cd34d59fbadc280b59c5d3"} Mar 17 00:28:27 crc kubenswrapper[4755]: I0317 00:28:27.209366 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95x6m" event={"ID":"e6d94368-96d9-44da-ac5a-af29d6b0d3df","Type":"ContainerStarted","Data":"3b72e398562c5d08f6b94cdde19fca2a8ded851782286e2245ac04c4bdfd488a"} Mar 17 00:28:27 crc kubenswrapper[4755]: I0317 00:28:27.212090 4755 generic.go:334] "Generic (PLEG): container finished" podID="bf109104-00dd-4525-a6e2-31cfccf54d5d" containerID="980e6a5fe97e2af7c8c06bd1426d31315ac1666dfb0e475fe6ff0cdc1d36a118" exitCode=0 Mar 17 00:28:27 crc kubenswrapper[4755]: I0317 00:28:27.212136 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bc8sz" event={"ID":"bf109104-00dd-4525-a6e2-31cfccf54d5d","Type":"ContainerDied","Data":"980e6a5fe97e2af7c8c06bd1426d31315ac1666dfb0e475fe6ff0cdc1d36a118"} Mar 17 00:28:27 crc kubenswrapper[4755]: I0317 00:28:27.212371 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bc8sz" event={"ID":"bf109104-00dd-4525-a6e2-31cfccf54d5d","Type":"ContainerStarted","Data":"491b775dc469d3a6a501c51ec9684cae0274d949757c215a345e2b3e95984910"} Mar 17 00:28:27 crc kubenswrapper[4755]: I0317 
00:28:27.214789 4755 generic.go:334] "Generic (PLEG): container finished" podID="ac12f55c-136b-4cf3-aae6-dca7f5353189" containerID="a98c6e45af24facc1f7414aa93e3e3bdcb5798a3851dc59d9d7c08dda718bfcd" exitCode=0 Mar 17 00:28:27 crc kubenswrapper[4755]: I0317 00:28:27.214822 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p727d" event={"ID":"ac12f55c-136b-4cf3-aae6-dca7f5353189","Type":"ContainerDied","Data":"a98c6e45af24facc1f7414aa93e3e3bdcb5798a3851dc59d9d7c08dda718bfcd"} Mar 17 00:28:28 crc kubenswrapper[4755]: I0317 00:28:28.222428 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p727d" event={"ID":"ac12f55c-136b-4cf3-aae6-dca7f5353189","Type":"ContainerStarted","Data":"bf1278f29fb0af924bf83dcc4ff7fa1cce58ae1c3e09b668a01deb9debc99878"} Mar 17 00:28:28 crc kubenswrapper[4755]: I0317 00:28:28.224940 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g7nk" event={"ID":"977e1302-92e4-4a2d-bdc1-71027bd1ac1c","Type":"ContainerStarted","Data":"e7b534284b3928e7d09b174d4a9a52910f4c898daf798967ca4cb035ac357358"} Mar 17 00:28:28 crc kubenswrapper[4755]: I0317 00:28:28.239647 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p727d" podStartSLOduration=2.611603637 podStartE2EDuration="5.239627109s" podCreationTimestamp="2026-03-17 00:28:23 +0000 UTC" firstStartedPulling="2026-03-17 00:28:25.188254621 +0000 UTC m=+379.947706904" lastFinishedPulling="2026-03-17 00:28:27.816278103 +0000 UTC m=+382.575730376" observedRunningTime="2026-03-17 00:28:28.23821749 +0000 UTC m=+382.997669783" watchObservedRunningTime="2026-03-17 00:28:28.239627109 +0000 UTC m=+382.999079392" Mar 17 00:28:28 crc kubenswrapper[4755]: I0317 00:28:28.258708 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5g7nk" 
podStartSLOduration=2.643980209 podStartE2EDuration="5.25868746s" podCreationTimestamp="2026-03-17 00:28:23 +0000 UTC" firstStartedPulling="2026-03-17 00:28:25.189980679 +0000 UTC m=+379.949432962" lastFinishedPulling="2026-03-17 00:28:27.80468793 +0000 UTC m=+382.564140213" observedRunningTime="2026-03-17 00:28:28.257818065 +0000 UTC m=+383.017270388" watchObservedRunningTime="2026-03-17 00:28:28.25868746 +0000 UTC m=+383.018139743" Mar 17 00:28:29 crc kubenswrapper[4755]: I0317 00:28:29.232444 4755 generic.go:334] "Generic (PLEG): container finished" podID="bf109104-00dd-4525-a6e2-31cfccf54d5d" containerID="f5d6ddd3b7ec9031fa9058fb9f208c0e131bece09442695b1a77445b1afd0a21" exitCode=0 Mar 17 00:28:29 crc kubenswrapper[4755]: I0317 00:28:29.232553 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bc8sz" event={"ID":"bf109104-00dd-4525-a6e2-31cfccf54d5d","Type":"ContainerDied","Data":"f5d6ddd3b7ec9031fa9058fb9f208c0e131bece09442695b1a77445b1afd0a21"} Mar 17 00:28:29 crc kubenswrapper[4755]: I0317 00:28:29.235768 4755 generic.go:334] "Generic (PLEG): container finished" podID="e6d94368-96d9-44da-ac5a-af29d6b0d3df" containerID="6e39b2a677bfcb03f7705830696cefb2bdb41b0dda8197067d34f9ad976fc623" exitCode=0 Mar 17 00:28:29 crc kubenswrapper[4755]: I0317 00:28:29.235939 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95x6m" event={"ID":"e6d94368-96d9-44da-ac5a-af29d6b0d3df","Type":"ContainerDied","Data":"6e39b2a677bfcb03f7705830696cefb2bdb41b0dda8197067d34f9ad976fc623"} Mar 17 00:28:30 crc kubenswrapper[4755]: I0317 00:28:30.243483 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95x6m" event={"ID":"e6d94368-96d9-44da-ac5a-af29d6b0d3df","Type":"ContainerStarted","Data":"4165c4b2fafd06a8413712cd22b9fcaa07b4d3120681eb084fdbb8257f4a13c7"} Mar 17 00:28:30 crc kubenswrapper[4755]: I0317 00:28:30.245848 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bc8sz" event={"ID":"bf109104-00dd-4525-a6e2-31cfccf54d5d","Type":"ContainerStarted","Data":"259823c942f58b9015080d97d5a7643d7bd99ce6cb7e9302cb7589714f0fe982"} Mar 17 00:28:30 crc kubenswrapper[4755]: I0317 00:28:30.279834 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-95x6m" podStartSLOduration=2.397731682 podStartE2EDuration="5.279813336s" podCreationTimestamp="2026-03-17 00:28:25 +0000 UTC" firstStartedPulling="2026-03-17 00:28:27.210204341 +0000 UTC m=+381.969656634" lastFinishedPulling="2026-03-17 00:28:30.092286005 +0000 UTC m=+384.851738288" observedRunningTime="2026-03-17 00:28:30.274974472 +0000 UTC m=+385.034426765" watchObservedRunningTime="2026-03-17 00:28:30.279813336 +0000 UTC m=+385.039265629" Mar 17 00:28:30 crc kubenswrapper[4755]: I0317 00:28:30.297839 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bc8sz" podStartSLOduration=2.711595929 podStartE2EDuration="5.297821878s" podCreationTimestamp="2026-03-17 00:28:25 +0000 UTC" firstStartedPulling="2026-03-17 00:28:27.213398079 +0000 UTC m=+381.972850402" lastFinishedPulling="2026-03-17 00:28:29.799624058 +0000 UTC m=+384.559076351" observedRunningTime="2026-03-17 00:28:30.29541019 +0000 UTC m=+385.054862473" watchObservedRunningTime="2026-03-17 00:28:30.297821878 +0000 UTC m=+385.057274151" Mar 17 00:28:32 crc kubenswrapper[4755]: I0317 00:28:32.951406 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bd9d8574d-4flnm"] Mar 17 00:28:32 crc kubenswrapper[4755]: I0317 00:28:32.951990 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" podUID="b85a22d9-d3c4-4c6a-8646-9337230979b5" containerName="controller-manager" 
containerID="cri-o://3e09f4345ce1d004faa025c52fbadfd1eb08ed6344cc2ae5a164ffb88cc6a6d0" gracePeriod=30 Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.274119 4755 generic.go:334] "Generic (PLEG): container finished" podID="b85a22d9-d3c4-4c6a-8646-9337230979b5" containerID="3e09f4345ce1d004faa025c52fbadfd1eb08ed6344cc2ae5a164ffb88cc6a6d0" exitCode=0 Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.274172 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" event={"ID":"b85a22d9-d3c4-4c6a-8646-9337230979b5","Type":"ContainerDied","Data":"3e09f4345ce1d004faa025c52fbadfd1eb08ed6344cc2ae5a164ffb88cc6a6d0"} Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.368736 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.444090 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-proxy-ca-bundles\") pod \"b85a22d9-d3c4-4c6a-8646-9337230979b5\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.444465 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b85a22d9-d3c4-4c6a-8646-9337230979b5-serving-cert\") pod \"b85a22d9-d3c4-4c6a-8646-9337230979b5\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.444505 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-config\") pod \"b85a22d9-d3c4-4c6a-8646-9337230979b5\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 
00:28:33.444544 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njrpg\" (UniqueName: \"kubernetes.io/projected/b85a22d9-d3c4-4c6a-8646-9337230979b5-kube-api-access-njrpg\") pod \"b85a22d9-d3c4-4c6a-8646-9337230979b5\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.444600 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-client-ca\") pod \"b85a22d9-d3c4-4c6a-8646-9337230979b5\" (UID: \"b85a22d9-d3c4-4c6a-8646-9337230979b5\") " Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.445129 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-config" (OuterVolumeSpecName: "config") pod "b85a22d9-d3c4-4c6a-8646-9337230979b5" (UID: "b85a22d9-d3c4-4c6a-8646-9337230979b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.445273 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-client-ca" (OuterVolumeSpecName: "client-ca") pod "b85a22d9-d3c4-4c6a-8646-9337230979b5" (UID: "b85a22d9-d3c4-4c6a-8646-9337230979b5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.445329 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b85a22d9-d3c4-4c6a-8646-9337230979b5" (UID: "b85a22d9-d3c4-4c6a-8646-9337230979b5"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.450552 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b85a22d9-d3c4-4c6a-8646-9337230979b5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b85a22d9-d3c4-4c6a-8646-9337230979b5" (UID: "b85a22d9-d3c4-4c6a-8646-9337230979b5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.453089 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b85a22d9-d3c4-4c6a-8646-9337230979b5-kube-api-access-njrpg" (OuterVolumeSpecName: "kube-api-access-njrpg") pod "b85a22d9-d3c4-4c6a-8646-9337230979b5" (UID: "b85a22d9-d3c4-4c6a-8646-9337230979b5"). InnerVolumeSpecName "kube-api-access-njrpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.546489 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.546532 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b85a22d9-d3c4-4c6a-8646-9337230979b5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.546545 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.546557 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njrpg\" (UniqueName: \"kubernetes.io/projected/b85a22d9-d3c4-4c6a-8646-9337230979b5-kube-api-access-njrpg\") on node \"crc\" 
DevicePath \"\"" Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.547190 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b85a22d9-d3c4-4c6a-8646-9337230979b5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.615019 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p727d" Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.615068 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p727d" Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.659538 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p727d" Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.840624 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5g7nk" Mar 17 00:28:33 crc kubenswrapper[4755]: I0317 00:28:33.840968 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5g7nk" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.290053 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.290100 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bd9d8574d-4flnm" event={"ID":"b85a22d9-d3c4-4c6a-8646-9337230979b5","Type":"ContainerDied","Data":"102c3c69f432670a04031155d19ceeb46429c1ebfe239bfe8a32af215d133d8c"} Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.290174 4755 scope.go:117] "RemoveContainer" containerID="3e09f4345ce1d004faa025c52fbadfd1eb08ed6344cc2ae5a164ffb88cc6a6d0" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.323066 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bd9d8574d-4flnm"] Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.335121 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-bd9d8574d-4flnm"] Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.368111 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p727d" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.763352 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5ff9f6669c-4p66m"] Mar 17 00:28:34 crc kubenswrapper[4755]: E0317 00:28:34.763798 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85a22d9-d3c4-4c6a-8646-9337230979b5" containerName="controller-manager" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.763842 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85a22d9-d3c4-4c6a-8646-9337230979b5" containerName="controller-manager" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.764117 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b85a22d9-d3c4-4c6a-8646-9337230979b5" containerName="controller-manager" Mar 17 00:28:34 crc 
kubenswrapper[4755]: I0317 00:28:34.764798 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.768526 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.769204 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.769659 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.770028 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.770114 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.775668 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.775738 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.780137 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5ff9f6669c-4p66m"] Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.867084 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/774bc20a-a417-4408-8288-4dd3306bae28-client-ca\") pod \"controller-manager-5ff9f6669c-4p66m\" (UID: 
\"774bc20a-a417-4408-8288-4dd3306bae28\") " pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.867153 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/774bc20a-a417-4408-8288-4dd3306bae28-serving-cert\") pod \"controller-manager-5ff9f6669c-4p66m\" (UID: \"774bc20a-a417-4408-8288-4dd3306bae28\") " pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.867385 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/774bc20a-a417-4408-8288-4dd3306bae28-proxy-ca-bundles\") pod \"controller-manager-5ff9f6669c-4p66m\" (UID: \"774bc20a-a417-4408-8288-4dd3306bae28\") " pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.867508 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml5bc\" (UniqueName: \"kubernetes.io/projected/774bc20a-a417-4408-8288-4dd3306bae28-kube-api-access-ml5bc\") pod \"controller-manager-5ff9f6669c-4p66m\" (UID: \"774bc20a-a417-4408-8288-4dd3306bae28\") " pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.867558 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/774bc20a-a417-4408-8288-4dd3306bae28-config\") pod \"controller-manager-5ff9f6669c-4p66m\" (UID: \"774bc20a-a417-4408-8288-4dd3306bae28\") " pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.898293 4755 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-5g7nk" podUID="977e1302-92e4-4a2d-bdc1-71027bd1ac1c" containerName="registry-server" probeResult="failure" output=< Mar 17 00:28:34 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 00:28:34 crc kubenswrapper[4755]: > Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.969043 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/774bc20a-a417-4408-8288-4dd3306bae28-serving-cert\") pod \"controller-manager-5ff9f6669c-4p66m\" (UID: \"774bc20a-a417-4408-8288-4dd3306bae28\") " pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.969093 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/774bc20a-a417-4408-8288-4dd3306bae28-client-ca\") pod \"controller-manager-5ff9f6669c-4p66m\" (UID: \"774bc20a-a417-4408-8288-4dd3306bae28\") " pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.969145 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/774bc20a-a417-4408-8288-4dd3306bae28-proxy-ca-bundles\") pod \"controller-manager-5ff9f6669c-4p66m\" (UID: \"774bc20a-a417-4408-8288-4dd3306bae28\") " pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.969186 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml5bc\" (UniqueName: \"kubernetes.io/projected/774bc20a-a417-4408-8288-4dd3306bae28-kube-api-access-ml5bc\") pod \"controller-manager-5ff9f6669c-4p66m\" (UID: \"774bc20a-a417-4408-8288-4dd3306bae28\") " pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m" Mar 17 
00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.969212 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/774bc20a-a417-4408-8288-4dd3306bae28-config\") pod \"controller-manager-5ff9f6669c-4p66m\" (UID: \"774bc20a-a417-4408-8288-4dd3306bae28\") " pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.970311 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/774bc20a-a417-4408-8288-4dd3306bae28-client-ca\") pod \"controller-manager-5ff9f6669c-4p66m\" (UID: \"774bc20a-a417-4408-8288-4dd3306bae28\") " pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.971065 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/774bc20a-a417-4408-8288-4dd3306bae28-config\") pod \"controller-manager-5ff9f6669c-4p66m\" (UID: \"774bc20a-a417-4408-8288-4dd3306bae28\") " pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.972612 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/774bc20a-a417-4408-8288-4dd3306bae28-proxy-ca-bundles\") pod \"controller-manager-5ff9f6669c-4p66m\" (UID: \"774bc20a-a417-4408-8288-4dd3306bae28\") " pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m" Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.976133 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/774bc20a-a417-4408-8288-4dd3306bae28-serving-cert\") pod \"controller-manager-5ff9f6669c-4p66m\" (UID: \"774bc20a-a417-4408-8288-4dd3306bae28\") " 
pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m"
Mar 17 00:28:34 crc kubenswrapper[4755]: I0317 00:28:34.991140 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml5bc\" (UniqueName: \"kubernetes.io/projected/774bc20a-a417-4408-8288-4dd3306bae28-kube-api-access-ml5bc\") pod \"controller-manager-5ff9f6669c-4p66m\" (UID: \"774bc20a-a417-4408-8288-4dd3306bae28\") " pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m"
Mar 17 00:28:35 crc kubenswrapper[4755]: I0317 00:28:35.087141 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m"
Mar 17 00:28:35 crc kubenswrapper[4755]: I0317 00:28:35.310120 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5ff9f6669c-4p66m"]
Mar 17 00:28:36 crc kubenswrapper[4755]: I0317 00:28:36.034612 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bc8sz"
Mar 17 00:28:36 crc kubenswrapper[4755]: I0317 00:28:36.035151 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bc8sz"
Mar 17 00:28:36 crc kubenswrapper[4755]: I0317 00:28:36.077618 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bc8sz"
Mar 17 00:28:36 crc kubenswrapper[4755]: I0317 00:28:36.256864 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b85a22d9-d3c4-4c6a-8646-9337230979b5" path="/var/lib/kubelet/pods/b85a22d9-d3c4-4c6a-8646-9337230979b5/volumes"
Mar 17 00:28:36 crc kubenswrapper[4755]: I0317 00:28:36.303470 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m" event={"ID":"774bc20a-a417-4408-8288-4dd3306bae28","Type":"ContainerStarted","Data":"91817f737cb0086cc6b70a4a44634c0b52ee00e8f2138ac1f77278af7edbe9d1"}
Mar 17 00:28:36 crc kubenswrapper[4755]: I0317 00:28:36.303568 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m" event={"ID":"774bc20a-a417-4408-8288-4dd3306bae28","Type":"ContainerStarted","Data":"a167de144fa2d8edbda33b255098fa5737b456a03b57b00511147bfa2ea02830"}
Mar 17 00:28:36 crc kubenswrapper[4755]: I0317 00:28:36.322533 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-95x6m"
Mar 17 00:28:36 crc kubenswrapper[4755]: I0317 00:28:36.322760 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-95x6m"
Mar 17 00:28:36 crc kubenswrapper[4755]: I0317 00:28:36.327593 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m" podStartSLOduration=4.3275624520000004 podStartE2EDuration="4.327562452s" podCreationTimestamp="2026-03-17 00:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:28:36.319366704 +0000 UTC m=+391.078819027" watchObservedRunningTime="2026-03-17 00:28:36.327562452 +0000 UTC m=+391.087014825"
Mar 17 00:28:36 crc kubenswrapper[4755]: I0317 00:28:36.348123 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bc8sz"
Mar 17 00:28:36 crc kubenswrapper[4755]: I0317 00:28:36.386095 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-95x6m"
Mar 17 00:28:37 crc kubenswrapper[4755]: I0317 00:28:37.309200 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m"
Mar 17 00:28:37 crc kubenswrapper[4755]: I0317 00:28:37.314859 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5ff9f6669c-4p66m"
Mar 17 00:28:37 crc kubenswrapper[4755]: I0317 00:28:37.357889 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-95x6m"
Mar 17 00:28:43 crc kubenswrapper[4755]: I0317 00:28:43.898796 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5g7nk"
Mar 17 00:28:43 crc kubenswrapper[4755]: I0317 00:28:43.964425 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5g7nk"
Mar 17 00:28:44 crc kubenswrapper[4755]: I0317 00:28:44.520425 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" podUID="79a4ec38-af00-43b6-bd22-d3ff75e52d71" containerName="registry" containerID="cri-o://449358e39530725502f9c8201a87f28c32dec55551e1a8375d2689f2fb125a5e" gracePeriod=30
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:44.968265 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb"
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.129302 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-registry-tls\") pod \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") "
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.129376 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf8sz\" (UniqueName: \"kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-kube-api-access-nf8sz\") pod \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") "
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.129414 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79a4ec38-af00-43b6-bd22-d3ff75e52d71-ca-trust-extracted\") pod \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") "
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.129448 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-bound-sa-token\") pod \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") "
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.129478 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79a4ec38-af00-43b6-bd22-d3ff75e52d71-installation-pull-secrets\") pod \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") "
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.129493 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79a4ec38-af00-43b6-bd22-d3ff75e52d71-trusted-ca\") pod \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") "
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.129519 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79a4ec38-af00-43b6-bd22-d3ff75e52d71-registry-certificates\") pod \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") "
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.129674 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\" (UID: \"79a4ec38-af00-43b6-bd22-d3ff75e52d71\") "
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.130840 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79a4ec38-af00-43b6-bd22-d3ff75e52d71-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "79a4ec38-af00-43b6-bd22-d3ff75e52d71" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.130879 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79a4ec38-af00-43b6-bd22-d3ff75e52d71-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "79a4ec38-af00-43b6-bd22-d3ff75e52d71" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.137269 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "79a4ec38-af00-43b6-bd22-d3ff75e52d71" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.137547 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-kube-api-access-nf8sz" (OuterVolumeSpecName: "kube-api-access-nf8sz") pod "79a4ec38-af00-43b6-bd22-d3ff75e52d71" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71"). InnerVolumeSpecName "kube-api-access-nf8sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.138250 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79a4ec38-af00-43b6-bd22-d3ff75e52d71-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "79a4ec38-af00-43b6-bd22-d3ff75e52d71" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.138644 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "79a4ec38-af00-43b6-bd22-d3ff75e52d71" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.146206 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79a4ec38-af00-43b6-bd22-d3ff75e52d71-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "79a4ec38-af00-43b6-bd22-d3ff75e52d71" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.147369 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "79a4ec38-af00-43b6-bd22-d3ff75e52d71" (UID: "79a4ec38-af00-43b6-bd22-d3ff75e52d71"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.231657 4755 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.231931 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf8sz\" (UniqueName: \"kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-kube-api-access-nf8sz\") on node \"crc\" DevicePath \"\""
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.231942 4755 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79a4ec38-af00-43b6-bd22-d3ff75e52d71-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.231950 4755 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79a4ec38-af00-43b6-bd22-d3ff75e52d71-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.231959 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79a4ec38-af00-43b6-bd22-d3ff75e52d71-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.231970 4755 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79a4ec38-af00-43b6-bd22-d3ff75e52d71-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.231978 4755 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79a4ec38-af00-43b6-bd22-d3ff75e52d71-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.365903 4755 generic.go:334] "Generic (PLEG): container finished" podID="79a4ec38-af00-43b6-bd22-d3ff75e52d71" containerID="449358e39530725502f9c8201a87f28c32dec55551e1a8375d2689f2fb125a5e" exitCode=0
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.365957 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb"
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.365955 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" event={"ID":"79a4ec38-af00-43b6-bd22-d3ff75e52d71","Type":"ContainerDied","Data":"449358e39530725502f9c8201a87f28c32dec55551e1a8375d2689f2fb125a5e"}
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.366011 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hg2fb" event={"ID":"79a4ec38-af00-43b6-bd22-d3ff75e52d71","Type":"ContainerDied","Data":"91095fea73cf220a693002c8a8b9db3510f3f4da47abc5183096c5bf33b91f8a"}
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.366048 4755 scope.go:117] "RemoveContainer" containerID="449358e39530725502f9c8201a87f28c32dec55551e1a8375d2689f2fb125a5e"
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.388458 4755 scope.go:117] "RemoveContainer" containerID="449358e39530725502f9c8201a87f28c32dec55551e1a8375d2689f2fb125a5e"
Mar 17 00:28:45 crc kubenswrapper[4755]: E0317 00:28:45.389142 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"449358e39530725502f9c8201a87f28c32dec55551e1a8375d2689f2fb125a5e\": container with ID starting with 449358e39530725502f9c8201a87f28c32dec55551e1a8375d2689f2fb125a5e not found: ID does not exist" containerID="449358e39530725502f9c8201a87f28c32dec55551e1a8375d2689f2fb125a5e"
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.389201 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"449358e39530725502f9c8201a87f28c32dec55551e1a8375d2689f2fb125a5e"} err="failed to get container status \"449358e39530725502f9c8201a87f28c32dec55551e1a8375d2689f2fb125a5e\": rpc error: code = NotFound desc = could not find container \"449358e39530725502f9c8201a87f28c32dec55551e1a8375d2689f2fb125a5e\": container with ID starting with 449358e39530725502f9c8201a87f28c32dec55551e1a8375d2689f2fb125a5e not found: ID does not exist"
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.417135 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hg2fb"]
Mar 17 00:28:45 crc kubenswrapper[4755]: I0317 00:28:45.425689 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hg2fb"]
Mar 17 00:28:46 crc kubenswrapper[4755]: I0317 00:28:46.257891 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79a4ec38-af00-43b6-bd22-d3ff75e52d71" path="/var/lib/kubelet/pods/79a4ec38-af00-43b6-bd22-d3ff75e52d71/volumes"
Mar 17 00:29:58 crc kubenswrapper[4755]: I0317 00:29:58.665041 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 00:29:58 crc kubenswrapper[4755]: I0317 00:29:58.665569 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.144084 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561790-lhmb8"]
Mar 17 00:30:00 crc kubenswrapper[4755]: E0317 00:30:00.144666 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a4ec38-af00-43b6-bd22-d3ff75e52d71" containerName="registry"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.144681 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a4ec38-af00-43b6-bd22-d3ff75e52d71" containerName="registry"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.144772 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a4ec38-af00-43b6-bd22-d3ff75e52d71" containerName="registry"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.145218 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561790-lhmb8"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.147411 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.147729 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"]
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.147847 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.147933 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.148511 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.149802 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.150225 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.157152 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561790-lhmb8"]
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.160782 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"]
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.228643 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzhst\" (UniqueName: \"kubernetes.io/projected/d5a3a21f-65f3-4591-9b48-b640c5e264be-kube-api-access-lzhst\") pod \"collect-profiles-29561790-5bdnm\" (UID: \"d5a3a21f-65f3-4591-9b48-b640c5e264be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.228784 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fft9q\" (UniqueName: \"kubernetes.io/projected/fadb568f-9e01-4202-ba32-afd95b7b1328-kube-api-access-fft9q\") pod \"auto-csr-approver-29561790-lhmb8\" (UID: \"fadb568f-9e01-4202-ba32-afd95b7b1328\") " pod="openshift-infra/auto-csr-approver-29561790-lhmb8"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.228878 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5a3a21f-65f3-4591-9b48-b640c5e264be-config-volume\") pod \"collect-profiles-29561790-5bdnm\" (UID: \"d5a3a21f-65f3-4591-9b48-b640c5e264be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.228959 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5a3a21f-65f3-4591-9b48-b640c5e264be-secret-volume\") pod \"collect-profiles-29561790-5bdnm\" (UID: \"d5a3a21f-65f3-4591-9b48-b640c5e264be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.330628 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzhst\" (UniqueName: \"kubernetes.io/projected/d5a3a21f-65f3-4591-9b48-b640c5e264be-kube-api-access-lzhst\") pod \"collect-profiles-29561790-5bdnm\" (UID: \"d5a3a21f-65f3-4591-9b48-b640c5e264be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.330729 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fft9q\" (UniqueName: \"kubernetes.io/projected/fadb568f-9e01-4202-ba32-afd95b7b1328-kube-api-access-fft9q\") pod \"auto-csr-approver-29561790-lhmb8\" (UID: \"fadb568f-9e01-4202-ba32-afd95b7b1328\") " pod="openshift-infra/auto-csr-approver-29561790-lhmb8"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.330862 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5a3a21f-65f3-4591-9b48-b640c5e264be-config-volume\") pod \"collect-profiles-29561790-5bdnm\" (UID: \"d5a3a21f-65f3-4591-9b48-b640c5e264be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.330956 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5a3a21f-65f3-4591-9b48-b640c5e264be-secret-volume\") pod \"collect-profiles-29561790-5bdnm\" (UID: \"d5a3a21f-65f3-4591-9b48-b640c5e264be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.332793 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5a3a21f-65f3-4591-9b48-b640c5e264be-config-volume\") pod \"collect-profiles-29561790-5bdnm\" (UID: \"d5a3a21f-65f3-4591-9b48-b640c5e264be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.347197 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5a3a21f-65f3-4591-9b48-b640c5e264be-secret-volume\") pod \"collect-profiles-29561790-5bdnm\" (UID: \"d5a3a21f-65f3-4591-9b48-b640c5e264be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.360723 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fft9q\" (UniqueName: \"kubernetes.io/projected/fadb568f-9e01-4202-ba32-afd95b7b1328-kube-api-access-fft9q\") pod \"auto-csr-approver-29561790-lhmb8\" (UID: \"fadb568f-9e01-4202-ba32-afd95b7b1328\") " pod="openshift-infra/auto-csr-approver-29561790-lhmb8"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.363206 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzhst\" (UniqueName: \"kubernetes.io/projected/d5a3a21f-65f3-4591-9b48-b640c5e264be-kube-api-access-lzhst\") pod \"collect-profiles-29561790-5bdnm\" (UID: \"d5a3a21f-65f3-4591-9b48-b640c5e264be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.493794 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561790-lhmb8"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.502830 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.960011 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561790-lhmb8"]
Mar 17 00:30:00 crc kubenswrapper[4755]: I0317 00:30:00.987583 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 17 00:30:01 crc kubenswrapper[4755]: I0317 00:30:01.027958 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"]
Mar 17 00:30:01 crc kubenswrapper[4755]: W0317 00:30:01.034529 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5a3a21f_65f3_4591_9b48_b640c5e264be.slice/crio-675ddadad753f00b624e0e0c45dad8bc9b205f84fb062559d9344ffde2180a8f WatchSource:0}: Error finding container 675ddadad753f00b624e0e0c45dad8bc9b205f84fb062559d9344ffde2180a8f: Status 404 returned error can't find the container with id 675ddadad753f00b624e0e0c45dad8bc9b205f84fb062559d9344ffde2180a8f
Mar 17 00:30:01 crc kubenswrapper[4755]: I0317 00:30:01.858057 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561790-lhmb8" event={"ID":"fadb568f-9e01-4202-ba32-afd95b7b1328","Type":"ContainerStarted","Data":"69be5bddb8ad9de4ce0b166b692221c19482cd01a0ee002a006ad1da6b7dc9a8"}
Mar 17 00:30:01 crc kubenswrapper[4755]: I0317 00:30:01.859964 4755 generic.go:334] "Generic (PLEG): container finished" podID="d5a3a21f-65f3-4591-9b48-b640c5e264be" containerID="f4825087d18115201122d7b8eabd20ef85e8900736fe86837741eda28e4608ca" exitCode=0
Mar 17 00:30:01 crc kubenswrapper[4755]: I0317 00:30:01.859999 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm" event={"ID":"d5a3a21f-65f3-4591-9b48-b640c5e264be","Type":"ContainerDied","Data":"f4825087d18115201122d7b8eabd20ef85e8900736fe86837741eda28e4608ca"}
Mar 17 00:30:01 crc kubenswrapper[4755]: I0317 00:30:01.860017 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm" event={"ID":"d5a3a21f-65f3-4591-9b48-b640c5e264be","Type":"ContainerStarted","Data":"675ddadad753f00b624e0e0c45dad8bc9b205f84fb062559d9344ffde2180a8f"}
Mar 17 00:30:02 crc kubenswrapper[4755]: I0317 00:30:02.869778 4755 generic.go:334] "Generic (PLEG): container finished" podID="fadb568f-9e01-4202-ba32-afd95b7b1328" containerID="138b1e1cf02623e0bee47e61a2c1af0eef5fe4c9f421b4a1256fb2c7d5e16b4c" exitCode=0
Mar 17 00:30:02 crc kubenswrapper[4755]: I0317 00:30:02.869839 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561790-lhmb8" event={"ID":"fadb568f-9e01-4202-ba32-afd95b7b1328","Type":"ContainerDied","Data":"138b1e1cf02623e0bee47e61a2c1af0eef5fe4c9f421b4a1256fb2c7d5e16b4c"}
Mar 17 00:30:03 crc kubenswrapper[4755]: I0317 00:30:03.182516 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"
Mar 17 00:30:03 crc kubenswrapper[4755]: I0317 00:30:03.278781 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5a3a21f-65f3-4591-9b48-b640c5e264be-config-volume\") pod \"d5a3a21f-65f3-4591-9b48-b640c5e264be\" (UID: \"d5a3a21f-65f3-4591-9b48-b640c5e264be\") "
Mar 17 00:30:03 crc kubenswrapper[4755]: I0317 00:30:03.278988 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5a3a21f-65f3-4591-9b48-b640c5e264be-secret-volume\") pod \"d5a3a21f-65f3-4591-9b48-b640c5e264be\" (UID: \"d5a3a21f-65f3-4591-9b48-b640c5e264be\") "
Mar 17 00:30:03 crc kubenswrapper[4755]: I0317 00:30:03.279064 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzhst\" (UniqueName: \"kubernetes.io/projected/d5a3a21f-65f3-4591-9b48-b640c5e264be-kube-api-access-lzhst\") pod \"d5a3a21f-65f3-4591-9b48-b640c5e264be\" (UID: \"d5a3a21f-65f3-4591-9b48-b640c5e264be\") "
Mar 17 00:30:03 crc kubenswrapper[4755]: I0317 00:30:03.279265 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5a3a21f-65f3-4591-9b48-b640c5e264be-config-volume" (OuterVolumeSpecName: "config-volume") pod "d5a3a21f-65f3-4591-9b48-b640c5e264be" (UID: "d5a3a21f-65f3-4591-9b48-b640c5e264be"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:30:03 crc kubenswrapper[4755]: I0317 00:30:03.279604 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5a3a21f-65f3-4591-9b48-b640c5e264be-config-volume\") on node \"crc\" DevicePath \"\""
Mar 17 00:30:03 crc kubenswrapper[4755]: I0317 00:30:03.289375 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a3a21f-65f3-4591-9b48-b640c5e264be-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d5a3a21f-65f3-4591-9b48-b640c5e264be" (UID: "d5a3a21f-65f3-4591-9b48-b640c5e264be"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 00:30:03 crc kubenswrapper[4755]: I0317 00:30:03.289727 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5a3a21f-65f3-4591-9b48-b640c5e264be-kube-api-access-lzhst" (OuterVolumeSpecName: "kube-api-access-lzhst") pod "d5a3a21f-65f3-4591-9b48-b640c5e264be" (UID: "d5a3a21f-65f3-4591-9b48-b640c5e264be"). InnerVolumeSpecName "kube-api-access-lzhst". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:30:03 crc kubenswrapper[4755]: I0317 00:30:03.380677 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5a3a21f-65f3-4591-9b48-b640c5e264be-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 17 00:30:03 crc kubenswrapper[4755]: I0317 00:30:03.380726 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzhst\" (UniqueName: \"kubernetes.io/projected/d5a3a21f-65f3-4591-9b48-b640c5e264be-kube-api-access-lzhst\") on node \"crc\" DevicePath \"\""
Mar 17 00:30:03 crc kubenswrapper[4755]: I0317 00:30:03.886095 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"
Mar 17 00:30:03 crc kubenswrapper[4755]: I0317 00:30:03.886187 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm" event={"ID":"d5a3a21f-65f3-4591-9b48-b640c5e264be","Type":"ContainerDied","Data":"675ddadad753f00b624e0e0c45dad8bc9b205f84fb062559d9344ffde2180a8f"}
Mar 17 00:30:03 crc kubenswrapper[4755]: I0317 00:30:03.886645 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="675ddadad753f00b624e0e0c45dad8bc9b205f84fb062559d9344ffde2180a8f"
Mar 17 00:30:04 crc kubenswrapper[4755]: I0317 00:30:04.229239 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561790-lhmb8"
Mar 17 00:30:04 crc kubenswrapper[4755]: I0317 00:30:04.291305 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fft9q\" (UniqueName: \"kubernetes.io/projected/fadb568f-9e01-4202-ba32-afd95b7b1328-kube-api-access-fft9q\") pod \"fadb568f-9e01-4202-ba32-afd95b7b1328\" (UID: \"fadb568f-9e01-4202-ba32-afd95b7b1328\") "
Mar 17 00:30:04 crc kubenswrapper[4755]: I0317 00:30:04.295694 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fadb568f-9e01-4202-ba32-afd95b7b1328-kube-api-access-fft9q" (OuterVolumeSpecName: "kube-api-access-fft9q") pod "fadb568f-9e01-4202-ba32-afd95b7b1328" (UID: "fadb568f-9e01-4202-ba32-afd95b7b1328"). InnerVolumeSpecName "kube-api-access-fft9q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:30:04 crc kubenswrapper[4755]: I0317 00:30:04.392565 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fft9q\" (UniqueName: \"kubernetes.io/projected/fadb568f-9e01-4202-ba32-afd95b7b1328-kube-api-access-fft9q\") on node \"crc\" DevicePath \"\""
Mar 17 00:30:04 crc kubenswrapper[4755]: I0317 00:30:04.897601 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561790-lhmb8" event={"ID":"fadb568f-9e01-4202-ba32-afd95b7b1328","Type":"ContainerDied","Data":"69be5bddb8ad9de4ce0b166b692221c19482cd01a0ee002a006ad1da6b7dc9a8"}
Mar 17 00:30:04 crc kubenswrapper[4755]: I0317 00:30:04.897665 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69be5bddb8ad9de4ce0b166b692221c19482cd01a0ee002a006ad1da6b7dc9a8"
Mar 17 00:30:04 crc kubenswrapper[4755]: I0317 00:30:04.897677 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561790-lhmb8"
Mar 17 00:30:05 crc kubenswrapper[4755]: I0317 00:30:05.295267 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561784-zznp2"]
Mar 17 00:30:05 crc kubenswrapper[4755]: I0317 00:30:05.301460 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561784-zznp2"]
Mar 17 00:30:06 crc kubenswrapper[4755]: I0317 00:30:06.259242 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d46784e1-420c-4d3b-aca7-65271a898c44" path="/var/lib/kubelet/pods/d46784e1-420c-4d3b-aca7-65271a898c44/volumes"
Mar 17 00:30:28 crc kubenswrapper[4755]: I0317 00:30:28.665866 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 00:30:28 crc kubenswrapper[4755]: I0317 00:30:28.666478 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 00:30:58 crc kubenswrapper[4755]: I0317 00:30:58.665770 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 00:30:58 crc kubenswrapper[4755]: I0317 00:30:58.666609 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 00:30:58 crc kubenswrapper[4755]: I0317 00:30:58.666686 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x"
Mar 17 00:30:58 crc kubenswrapper[4755]: I0317 00:30:58.667579 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3bdb3baf5ba6a3ef3e039e15bc705bbe9296b04b0aef97e3d166af9a7f44368e"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 17 00:30:58 crc kubenswrapper[4755]: I0317 00:30:58.667679 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://3bdb3baf5ba6a3ef3e039e15bc705bbe9296b04b0aef97e3d166af9a7f44368e" gracePeriod=600
Mar 17 00:30:59 crc kubenswrapper[4755]: I0317 00:30:59.791693 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="3bdb3baf5ba6a3ef3e039e15bc705bbe9296b04b0aef97e3d166af9a7f44368e" exitCode=0
Mar 17 00:30:59 crc kubenswrapper[4755]: I0317 00:30:59.791797 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"3bdb3baf5ba6a3ef3e039e15bc705bbe9296b04b0aef97e3d166af9a7f44368e"}
Mar 17 00:30:59 crc kubenswrapper[4755]: I0317 00:30:59.792566 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"df05c9b2eac57c85d2adcda116412d2685fa9b7be3d9227fbab0e788267e2675"}
Mar 17 00:30:59 crc kubenswrapper[4755]: I0317 00:30:59.792611 4755 scope.go:117] "RemoveContainer" containerID="5e4bcf70529050e2d5a4eb77278af6ddc216afe724345c57887569e664d73b74"
Mar 17 00:32:00 crc kubenswrapper[4755]: I0317 00:32:00.147983 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561792-fc6gf"]
Mar 17 00:32:00 crc kubenswrapper[4755]: E0317 00:32:00.148796 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fadb568f-9e01-4202-ba32-afd95b7b1328" containerName="oc"
Mar 17 00:32:00 crc kubenswrapper[4755]: I0317 00:32:00.148812 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="fadb568f-9e01-4202-ba32-afd95b7b1328" containerName="oc"
Mar 17 00:32:00 crc kubenswrapper[4755]: E0317 00:32:00.148829 4755 cpu_manager.go:410]
"RemoveStaleState: removing container" podUID="d5a3a21f-65f3-4591-9b48-b640c5e264be" containerName="collect-profiles" Mar 17 00:32:00 crc kubenswrapper[4755]: I0317 00:32:00.148838 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a3a21f-65f3-4591-9b48-b640c5e264be" containerName="collect-profiles" Mar 17 00:32:00 crc kubenswrapper[4755]: I0317 00:32:00.148964 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a3a21f-65f3-4591-9b48-b640c5e264be" containerName="collect-profiles" Mar 17 00:32:00 crc kubenswrapper[4755]: I0317 00:32:00.148979 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="fadb568f-9e01-4202-ba32-afd95b7b1328" containerName="oc" Mar 17 00:32:00 crc kubenswrapper[4755]: I0317 00:32:00.149369 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561792-fc6gf" Mar 17 00:32:00 crc kubenswrapper[4755]: I0317 00:32:00.153737 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 00:32:00 crc kubenswrapper[4755]: I0317 00:32:00.153898 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 00:32:00 crc kubenswrapper[4755]: I0317 00:32:00.153743 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 00:32:00 crc kubenswrapper[4755]: I0317 00:32:00.171504 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561792-fc6gf"] Mar 17 00:32:00 crc kubenswrapper[4755]: I0317 00:32:00.189244 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmgnx\" (UniqueName: \"kubernetes.io/projected/503c259e-cfa2-4b5d-a866-83a78794df20-kube-api-access-gmgnx\") pod \"auto-csr-approver-29561792-fc6gf\" (UID: \"503c259e-cfa2-4b5d-a866-83a78794df20\") " 
pod="openshift-infra/auto-csr-approver-29561792-fc6gf" Mar 17 00:32:00 crc kubenswrapper[4755]: I0317 00:32:00.290617 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmgnx\" (UniqueName: \"kubernetes.io/projected/503c259e-cfa2-4b5d-a866-83a78794df20-kube-api-access-gmgnx\") pod \"auto-csr-approver-29561792-fc6gf\" (UID: \"503c259e-cfa2-4b5d-a866-83a78794df20\") " pod="openshift-infra/auto-csr-approver-29561792-fc6gf" Mar 17 00:32:00 crc kubenswrapper[4755]: I0317 00:32:00.310385 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmgnx\" (UniqueName: \"kubernetes.io/projected/503c259e-cfa2-4b5d-a866-83a78794df20-kube-api-access-gmgnx\") pod \"auto-csr-approver-29561792-fc6gf\" (UID: \"503c259e-cfa2-4b5d-a866-83a78794df20\") " pod="openshift-infra/auto-csr-approver-29561792-fc6gf" Mar 17 00:32:00 crc kubenswrapper[4755]: I0317 00:32:00.469987 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561792-fc6gf" Mar 17 00:32:00 crc kubenswrapper[4755]: I0317 00:32:00.781031 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561792-fc6gf"] Mar 17 00:32:01 crc kubenswrapper[4755]: I0317 00:32:01.195926 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561792-fc6gf" event={"ID":"503c259e-cfa2-4b5d-a866-83a78794df20","Type":"ContainerStarted","Data":"14fe91b454532a988ff9bfe2246a7d91ab13a3dcf6c9e4cb3ccf7b51ac9b5e87"} Mar 17 00:32:03 crc kubenswrapper[4755]: I0317 00:32:03.213684 4755 generic.go:334] "Generic (PLEG): container finished" podID="503c259e-cfa2-4b5d-a866-83a78794df20" containerID="09331908c036e53ce5955a3f651296527cc0fc2cd6b6aae8569f51800528cb91" exitCode=0 Mar 17 00:32:03 crc kubenswrapper[4755]: I0317 00:32:03.213773 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29561792-fc6gf" event={"ID":"503c259e-cfa2-4b5d-a866-83a78794df20","Type":"ContainerDied","Data":"09331908c036e53ce5955a3f651296527cc0fc2cd6b6aae8569f51800528cb91"} Mar 17 00:32:04 crc kubenswrapper[4755]: I0317 00:32:04.537889 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561792-fc6gf" Mar 17 00:32:04 crc kubenswrapper[4755]: I0317 00:32:04.650635 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmgnx\" (UniqueName: \"kubernetes.io/projected/503c259e-cfa2-4b5d-a866-83a78794df20-kube-api-access-gmgnx\") pod \"503c259e-cfa2-4b5d-a866-83a78794df20\" (UID: \"503c259e-cfa2-4b5d-a866-83a78794df20\") " Mar 17 00:32:04 crc kubenswrapper[4755]: I0317 00:32:04.657842 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/503c259e-cfa2-4b5d-a866-83a78794df20-kube-api-access-gmgnx" (OuterVolumeSpecName: "kube-api-access-gmgnx") pod "503c259e-cfa2-4b5d-a866-83a78794df20" (UID: "503c259e-cfa2-4b5d-a866-83a78794df20"). InnerVolumeSpecName "kube-api-access-gmgnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:32:04 crc kubenswrapper[4755]: I0317 00:32:04.752082 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmgnx\" (UniqueName: \"kubernetes.io/projected/503c259e-cfa2-4b5d-a866-83a78794df20-kube-api-access-gmgnx\") on node \"crc\" DevicePath \"\"" Mar 17 00:32:05 crc kubenswrapper[4755]: I0317 00:32:05.227540 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561792-fc6gf" event={"ID":"503c259e-cfa2-4b5d-a866-83a78794df20","Type":"ContainerDied","Data":"14fe91b454532a988ff9bfe2246a7d91ab13a3dcf6c9e4cb3ccf7b51ac9b5e87"} Mar 17 00:32:05 crc kubenswrapper[4755]: I0317 00:32:05.227588 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14fe91b454532a988ff9bfe2246a7d91ab13a3dcf6c9e4cb3ccf7b51ac9b5e87" Mar 17 00:32:05 crc kubenswrapper[4755]: I0317 00:32:05.227664 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561792-fc6gf" Mar 17 00:32:05 crc kubenswrapper[4755]: I0317 00:32:05.612991 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561786-84lbh"] Mar 17 00:32:05 crc kubenswrapper[4755]: I0317 00:32:05.620054 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561786-84lbh"] Mar 17 00:32:06 crc kubenswrapper[4755]: I0317 00:32:06.261681 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a73f0adb-2ef2-4e25-91ba-f29aa35939bf" path="/var/lib/kubelet/pods/a73f0adb-2ef2-4e25-91ba-f29aa35939bf/volumes" Mar 17 00:32:22 crc kubenswrapper[4755]: I0317 00:32:22.617192 4755 scope.go:117] "RemoveContainer" containerID="dfa84351c1713b5382135a808f7dcca826e88836f4d77efd458030120a543c18" Mar 17 00:32:22 crc kubenswrapper[4755]: I0317 00:32:22.668835 4755 scope.go:117] "RemoveContainer" 
containerID="354d4e1b72a1530e0c66af0cb2a4ee896017daf25f188f1f009ef30c226bda25" Mar 17 00:32:58 crc kubenswrapper[4755]: I0317 00:32:58.665420 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:32:58 crc kubenswrapper[4755]: I0317 00:32:58.666604 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:33:28 crc kubenswrapper[4755]: I0317 00:33:28.665629 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:33:28 crc kubenswrapper[4755]: I0317 00:33:28.666261 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:33:58 crc kubenswrapper[4755]: I0317 00:33:58.665209 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:33:58 crc kubenswrapper[4755]: I0317 00:33:58.665905 4755 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:33:58 crc kubenswrapper[4755]: I0317 00:33:58.665973 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:33:58 crc kubenswrapper[4755]: I0317 00:33:58.666745 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"df05c9b2eac57c85d2adcda116412d2685fa9b7be3d9227fbab0e788267e2675"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 00:33:58 crc kubenswrapper[4755]: I0317 00:33:58.666874 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://df05c9b2eac57c85d2adcda116412d2685fa9b7be3d9227fbab0e788267e2675" gracePeriod=600 Mar 17 00:33:59 crc kubenswrapper[4755]: I0317 00:33:59.033679 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="df05c9b2eac57c85d2adcda116412d2685fa9b7be3d9227fbab0e788267e2675" exitCode=0 Mar 17 00:33:59 crc kubenswrapper[4755]: I0317 00:33:59.033784 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"df05c9b2eac57c85d2adcda116412d2685fa9b7be3d9227fbab0e788267e2675"} Mar 17 00:33:59 crc kubenswrapper[4755]: I0317 
00:33:59.034049 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"0cab6d0e05377e82717b632b632ef0344a29e598068da9d56d70ae0349c0c4d1"} Mar 17 00:33:59 crc kubenswrapper[4755]: I0317 00:33:59.034081 4755 scope.go:117] "RemoveContainer" containerID="3bdb3baf5ba6a3ef3e039e15bc705bbe9296b04b0aef97e3d166af9a7f44368e" Mar 17 00:34:00 crc kubenswrapper[4755]: I0317 00:34:00.151212 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561794-8fnz7"] Mar 17 00:34:00 crc kubenswrapper[4755]: E0317 00:34:00.151702 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="503c259e-cfa2-4b5d-a866-83a78794df20" containerName="oc" Mar 17 00:34:00 crc kubenswrapper[4755]: I0317 00:34:00.151736 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="503c259e-cfa2-4b5d-a866-83a78794df20" containerName="oc" Mar 17 00:34:00 crc kubenswrapper[4755]: I0317 00:34:00.152080 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="503c259e-cfa2-4b5d-a866-83a78794df20" containerName="oc" Mar 17 00:34:00 crc kubenswrapper[4755]: I0317 00:34:00.152944 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561794-8fnz7" Mar 17 00:34:00 crc kubenswrapper[4755]: I0317 00:34:00.155958 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 00:34:00 crc kubenswrapper[4755]: I0317 00:34:00.157738 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 00:34:00 crc kubenswrapper[4755]: I0317 00:34:00.159680 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 00:34:00 crc kubenswrapper[4755]: I0317 00:34:00.164899 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561794-8fnz7"] Mar 17 00:34:00 crc kubenswrapper[4755]: I0317 00:34:00.278936 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntz6v\" (UniqueName: \"kubernetes.io/projected/48a3a05f-68ae-4201-b5aa-051d129d70fd-kube-api-access-ntz6v\") pod \"auto-csr-approver-29561794-8fnz7\" (UID: \"48a3a05f-68ae-4201-b5aa-051d129d70fd\") " pod="openshift-infra/auto-csr-approver-29561794-8fnz7" Mar 17 00:34:00 crc kubenswrapper[4755]: I0317 00:34:00.380639 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntz6v\" (UniqueName: \"kubernetes.io/projected/48a3a05f-68ae-4201-b5aa-051d129d70fd-kube-api-access-ntz6v\") pod \"auto-csr-approver-29561794-8fnz7\" (UID: \"48a3a05f-68ae-4201-b5aa-051d129d70fd\") " pod="openshift-infra/auto-csr-approver-29561794-8fnz7" Mar 17 00:34:00 crc kubenswrapper[4755]: I0317 00:34:00.414329 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntz6v\" (UniqueName: \"kubernetes.io/projected/48a3a05f-68ae-4201-b5aa-051d129d70fd-kube-api-access-ntz6v\") pod \"auto-csr-approver-29561794-8fnz7\" (UID: \"48a3a05f-68ae-4201-b5aa-051d129d70fd\") " 
pod="openshift-infra/auto-csr-approver-29561794-8fnz7" Mar 17 00:34:00 crc kubenswrapper[4755]: I0317 00:34:00.473950 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561794-8fnz7" Mar 17 00:34:00 crc kubenswrapper[4755]: I0317 00:34:00.932726 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561794-8fnz7"] Mar 17 00:34:01 crc kubenswrapper[4755]: I0317 00:34:01.054925 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561794-8fnz7" event={"ID":"48a3a05f-68ae-4201-b5aa-051d129d70fd","Type":"ContainerStarted","Data":"cbd2df234c163ae6e091e86bc3a8c7056e02e1b24a9108dda8396f4b17a91c88"} Mar 17 00:34:03 crc kubenswrapper[4755]: I0317 00:34:03.075215 4755 generic.go:334] "Generic (PLEG): container finished" podID="48a3a05f-68ae-4201-b5aa-051d129d70fd" containerID="db55da9b29e5486aa78520af96ac7dda3f64cf4fd57482e95c39d81714c01ac9" exitCode=0 Mar 17 00:34:03 crc kubenswrapper[4755]: I0317 00:34:03.075351 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561794-8fnz7" event={"ID":"48a3a05f-68ae-4201-b5aa-051d129d70fd","Type":"ContainerDied","Data":"db55da9b29e5486aa78520af96ac7dda3f64cf4fd57482e95c39d81714c01ac9"} Mar 17 00:34:04 crc kubenswrapper[4755]: I0317 00:34:04.422187 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561794-8fnz7" Mar 17 00:34:04 crc kubenswrapper[4755]: I0317 00:34:04.539077 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntz6v\" (UniqueName: \"kubernetes.io/projected/48a3a05f-68ae-4201-b5aa-051d129d70fd-kube-api-access-ntz6v\") pod \"48a3a05f-68ae-4201-b5aa-051d129d70fd\" (UID: \"48a3a05f-68ae-4201-b5aa-051d129d70fd\") " Mar 17 00:34:04 crc kubenswrapper[4755]: I0317 00:34:04.546699 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48a3a05f-68ae-4201-b5aa-051d129d70fd-kube-api-access-ntz6v" (OuterVolumeSpecName: "kube-api-access-ntz6v") pod "48a3a05f-68ae-4201-b5aa-051d129d70fd" (UID: "48a3a05f-68ae-4201-b5aa-051d129d70fd"). InnerVolumeSpecName "kube-api-access-ntz6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:34:04 crc kubenswrapper[4755]: I0317 00:34:04.641670 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntz6v\" (UniqueName: \"kubernetes.io/projected/48a3a05f-68ae-4201-b5aa-051d129d70fd-kube-api-access-ntz6v\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:05 crc kubenswrapper[4755]: I0317 00:34:05.098191 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561794-8fnz7" event={"ID":"48a3a05f-68ae-4201-b5aa-051d129d70fd","Type":"ContainerDied","Data":"cbd2df234c163ae6e091e86bc3a8c7056e02e1b24a9108dda8396f4b17a91c88"} Mar 17 00:34:05 crc kubenswrapper[4755]: I0317 00:34:05.098253 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbd2df234c163ae6e091e86bc3a8c7056e02e1b24a9108dda8396f4b17a91c88" Mar 17 00:34:05 crc kubenswrapper[4755]: I0317 00:34:05.098263 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561794-8fnz7" Mar 17 00:34:05 crc kubenswrapper[4755]: I0317 00:34:05.488829 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561788-jl6wp"] Mar 17 00:34:05 crc kubenswrapper[4755]: I0317 00:34:05.493045 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561788-jl6wp"] Mar 17 00:34:06 crc kubenswrapper[4755]: I0317 00:34:06.257829 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="579fbdac-fb6c-4aaf-9c5d-a24494764bb0" path="/var/lib/kubelet/pods/579fbdac-fb6c-4aaf-9c5d-a24494764bb0/volumes" Mar 17 00:34:14 crc kubenswrapper[4755]: I0317 00:34:14.016051 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c"] Mar 17 00:34:14 crc kubenswrapper[4755]: E0317 00:34:14.016815 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a3a05f-68ae-4201-b5aa-051d129d70fd" containerName="oc" Mar 17 00:34:14 crc kubenswrapper[4755]: I0317 00:34:14.016830 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a3a05f-68ae-4201-b5aa-051d129d70fd" containerName="oc" Mar 17 00:34:14 crc kubenswrapper[4755]: I0317 00:34:14.016972 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="48a3a05f-68ae-4201-b5aa-051d129d70fd" containerName="oc" Mar 17 00:34:14 crc kubenswrapper[4755]: I0317 00:34:14.017974 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" Mar 17 00:34:14 crc kubenswrapper[4755]: I0317 00:34:14.019931 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 17 00:34:14 crc kubenswrapper[4755]: I0317 00:34:14.036778 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c"] Mar 17 00:34:14 crc kubenswrapper[4755]: I0317 00:34:14.166598 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6050c97-e228-485e-9b2e-e04588fff1aa-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c\" (UID: \"f6050c97-e228-485e-9b2e-e04588fff1aa\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" Mar 17 00:34:14 crc kubenswrapper[4755]: I0317 00:34:14.166704 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfds4\" (UniqueName: \"kubernetes.io/projected/f6050c97-e228-485e-9b2e-e04588fff1aa-kube-api-access-lfds4\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c\" (UID: \"f6050c97-e228-485e-9b2e-e04588fff1aa\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" Mar 17 00:34:14 crc kubenswrapper[4755]: I0317 00:34:14.166751 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6050c97-e228-485e-9b2e-e04588fff1aa-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c\" (UID: \"f6050c97-e228-485e-9b2e-e04588fff1aa\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" Mar 17 00:34:14 crc kubenswrapper[4755]: 
I0317 00:34:14.267656 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfds4\" (UniqueName: \"kubernetes.io/projected/f6050c97-e228-485e-9b2e-e04588fff1aa-kube-api-access-lfds4\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c\" (UID: \"f6050c97-e228-485e-9b2e-e04588fff1aa\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" Mar 17 00:34:14 crc kubenswrapper[4755]: I0317 00:34:14.267708 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6050c97-e228-485e-9b2e-e04588fff1aa-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c\" (UID: \"f6050c97-e228-485e-9b2e-e04588fff1aa\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" Mar 17 00:34:14 crc kubenswrapper[4755]: I0317 00:34:14.267733 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6050c97-e228-485e-9b2e-e04588fff1aa-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c\" (UID: \"f6050c97-e228-485e-9b2e-e04588fff1aa\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" Mar 17 00:34:14 crc kubenswrapper[4755]: I0317 00:34:14.268199 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6050c97-e228-485e-9b2e-e04588fff1aa-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c\" (UID: \"f6050c97-e228-485e-9b2e-e04588fff1aa\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" Mar 17 00:34:14 crc kubenswrapper[4755]: I0317 00:34:14.268325 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f6050c97-e228-485e-9b2e-e04588fff1aa-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c\" (UID: \"f6050c97-e228-485e-9b2e-e04588fff1aa\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" Mar 17 00:34:14 crc kubenswrapper[4755]: I0317 00:34:14.302524 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfds4\" (UniqueName: \"kubernetes.io/projected/f6050c97-e228-485e-9b2e-e04588fff1aa-kube-api-access-lfds4\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c\" (UID: \"f6050c97-e228-485e-9b2e-e04588fff1aa\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" Mar 17 00:34:14 crc kubenswrapper[4755]: I0317 00:34:14.333161 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" Mar 17 00:34:14 crc kubenswrapper[4755]: I0317 00:34:14.558717 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c"] Mar 17 00:34:15 crc kubenswrapper[4755]: I0317 00:34:15.181106 4755 generic.go:334] "Generic (PLEG): container finished" podID="f6050c97-e228-485e-9b2e-e04588fff1aa" containerID="3c308ae8e084c8e64a0c1a692b16d58384a42e3a1f33bc44517d9cd08cc75d92" exitCode=0 Mar 17 00:34:15 crc kubenswrapper[4755]: I0317 00:34:15.181182 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" event={"ID":"f6050c97-e228-485e-9b2e-e04588fff1aa","Type":"ContainerDied","Data":"3c308ae8e084c8e64a0c1a692b16d58384a42e3a1f33bc44517d9cd08cc75d92"} Mar 17 00:34:15 crc kubenswrapper[4755]: I0317 00:34:15.181230 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" event={"ID":"f6050c97-e228-485e-9b2e-e04588fff1aa","Type":"ContainerStarted","Data":"0cbb395007e7a2069232871b2eaf24742c07bf66dd05fb5df1a8991b45faca17"} Mar 17 00:34:17 crc kubenswrapper[4755]: I0317 00:34:17.195181 4755 generic.go:334] "Generic (PLEG): container finished" podID="f6050c97-e228-485e-9b2e-e04588fff1aa" containerID="a6c6d07fe2393811a0eab208c398b095b671c2a5ed246c46b1db3a95c8cb5e6d" exitCode=0 Mar 17 00:34:17 crc kubenswrapper[4755]: I0317 00:34:17.195260 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" event={"ID":"f6050c97-e228-485e-9b2e-e04588fff1aa","Type":"ContainerDied","Data":"a6c6d07fe2393811a0eab208c398b095b671c2a5ed246c46b1db3a95c8cb5e6d"} Mar 17 00:34:18 crc kubenswrapper[4755]: I0317 00:34:18.205841 4755 generic.go:334] "Generic (PLEG): container finished" podID="f6050c97-e228-485e-9b2e-e04588fff1aa" containerID="f4e60818cb86f2d3d827478b458ffebc89dd16d20dac3b78141bc8f46c0bc25c" exitCode=0 Mar 17 00:34:18 crc kubenswrapper[4755]: I0317 00:34:18.206026 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" event={"ID":"f6050c97-e228-485e-9b2e-e04588fff1aa","Type":"ContainerDied","Data":"f4e60818cb86f2d3d827478b458ffebc89dd16d20dac3b78141bc8f46c0bc25c"} Mar 17 00:34:19 crc kubenswrapper[4755]: I0317 00:34:19.482775 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" Mar 17 00:34:19 crc kubenswrapper[4755]: I0317 00:34:19.545683 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6050c97-e228-485e-9b2e-e04588fff1aa-util\") pod \"f6050c97-e228-485e-9b2e-e04588fff1aa\" (UID: \"f6050c97-e228-485e-9b2e-e04588fff1aa\") " Mar 17 00:34:19 crc kubenswrapper[4755]: I0317 00:34:19.545787 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6050c97-e228-485e-9b2e-e04588fff1aa-bundle\") pod \"f6050c97-e228-485e-9b2e-e04588fff1aa\" (UID: \"f6050c97-e228-485e-9b2e-e04588fff1aa\") " Mar 17 00:34:19 crc kubenswrapper[4755]: I0317 00:34:19.545844 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfds4\" (UniqueName: \"kubernetes.io/projected/f6050c97-e228-485e-9b2e-e04588fff1aa-kube-api-access-lfds4\") pod \"f6050c97-e228-485e-9b2e-e04588fff1aa\" (UID: \"f6050c97-e228-485e-9b2e-e04588fff1aa\") " Mar 17 00:34:19 crc kubenswrapper[4755]: I0317 00:34:19.551108 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6050c97-e228-485e-9b2e-e04588fff1aa-bundle" (OuterVolumeSpecName: "bundle") pod "f6050c97-e228-485e-9b2e-e04588fff1aa" (UID: "f6050c97-e228-485e-9b2e-e04588fff1aa"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:34:19 crc kubenswrapper[4755]: I0317 00:34:19.555351 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6050c97-e228-485e-9b2e-e04588fff1aa-kube-api-access-lfds4" (OuterVolumeSpecName: "kube-api-access-lfds4") pod "f6050c97-e228-485e-9b2e-e04588fff1aa" (UID: "f6050c97-e228-485e-9b2e-e04588fff1aa"). InnerVolumeSpecName "kube-api-access-lfds4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:34:19 crc kubenswrapper[4755]: I0317 00:34:19.563366 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6050c97-e228-485e-9b2e-e04588fff1aa-util" (OuterVolumeSpecName: "util") pod "f6050c97-e228-485e-9b2e-e04588fff1aa" (UID: "f6050c97-e228-485e-9b2e-e04588fff1aa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:34:19 crc kubenswrapper[4755]: I0317 00:34:19.647464 4755 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6050c97-e228-485e-9b2e-e04588fff1aa-util\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:19 crc kubenswrapper[4755]: I0317 00:34:19.647493 4755 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6050c97-e228-485e-9b2e-e04588fff1aa-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:19 crc kubenswrapper[4755]: I0317 00:34:19.647502 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfds4\" (UniqueName: \"kubernetes.io/projected/f6050c97-e228-485e-9b2e-e04588fff1aa-kube-api-access-lfds4\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:20 crc kubenswrapper[4755]: I0317 00:34:20.225310 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" event={"ID":"f6050c97-e228-485e-9b2e-e04588fff1aa","Type":"ContainerDied","Data":"0cbb395007e7a2069232871b2eaf24742c07bf66dd05fb5df1a8991b45faca17"} Mar 17 00:34:20 crc kubenswrapper[4755]: I0317 00:34:20.225499 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cbb395007e7a2069232871b2eaf24742c07bf66dd05fb5df1a8991b45faca17" Mar 17 00:34:20 crc kubenswrapper[4755]: I0317 00:34:20.225501 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c" Mar 17 00:34:22 crc kubenswrapper[4755]: I0317 00:34:22.811607 4755 scope.go:117] "RemoveContainer" containerID="b1ec5d6a4fdc24f25074bca55cbf7a7a4d1f47dd11b3a80d740f8f71cab41481" Mar 17 00:34:22 crc kubenswrapper[4755]: I0317 00:34:22.847721 4755 scope.go:117] "RemoveContainer" containerID="c77a2693c8b2f4db6f385019aa4755e0e81742d0ac28b0a839ffa1c32e9fc2b4" Mar 17 00:34:25 crc kubenswrapper[4755]: I0317 00:34:25.132331 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mvdzt"] Mar 17 00:34:25 crc kubenswrapper[4755]: I0317 00:34:25.132705 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="ovn-controller" containerID="cri-o://18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750" gracePeriod=30 Mar 17 00:34:25 crc kubenswrapper[4755]: I0317 00:34:25.133047 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="sbdb" containerID="cri-o://76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1" gracePeriod=30 Mar 17 00:34:25 crc kubenswrapper[4755]: I0317 00:34:25.133089 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="nbdb" containerID="cri-o://51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03" gracePeriod=30 Mar 17 00:34:25 crc kubenswrapper[4755]: I0317 00:34:25.133119 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="northd" 
containerID="cri-o://c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd" gracePeriod=30 Mar 17 00:34:25 crc kubenswrapper[4755]: I0317 00:34:25.133151 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f" gracePeriod=30 Mar 17 00:34:25 crc kubenswrapper[4755]: I0317 00:34:25.133179 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="kube-rbac-proxy-node" containerID="cri-o://e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c" gracePeriod=30 Mar 17 00:34:25 crc kubenswrapper[4755]: I0317 00:34:25.133205 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="ovn-acl-logging" containerID="cri-o://bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4" gracePeriod=30 Mar 17 00:34:25 crc kubenswrapper[4755]: I0317 00:34:25.197933 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="ovnkube-controller" containerID="cri-o://4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56" gracePeriod=30 Mar 17 00:34:25 crc kubenswrapper[4755]: I0317 00:34:25.260240 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j6qtr_de2167ca-ad7e-47ce-bf95-cebc396df145/kube-multus/0.log" Mar 17 00:34:25 crc kubenswrapper[4755]: I0317 00:34:25.260289 4755 generic.go:334] "Generic (PLEG): container finished" podID="de2167ca-ad7e-47ce-bf95-cebc396df145" 
containerID="ed1b171a612cb7fd08acee7f0e0e954a81b0f06f559944540278904d38764193" exitCode=2 Mar 17 00:34:25 crc kubenswrapper[4755]: I0317 00:34:25.260321 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j6qtr" event={"ID":"de2167ca-ad7e-47ce-bf95-cebc396df145","Type":"ContainerDied","Data":"ed1b171a612cb7fd08acee7f0e0e954a81b0f06f559944540278904d38764193"} Mar 17 00:34:25 crc kubenswrapper[4755]: I0317 00:34:25.261093 4755 scope.go:117] "RemoveContainer" containerID="ed1b171a612cb7fd08acee7f0e0e954a81b0f06f559944540278904d38764193" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.054655 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mvdzt_44d329be-573d-4143-97fb-d07ed343c898/ovn-acl-logging/0.log" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.058802 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mvdzt_44d329be-573d-4143-97fb-d07ed343c898/ovn-controller/0.log" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.062690 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137209 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w9dgl"] Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.137463 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="ovn-controller" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137479 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="ovn-controller" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.137491 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="kubecfg-setup" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137498 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="kubecfg-setup" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.137505 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6050c97-e228-485e-9b2e-e04588fff1aa" containerName="extract" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137512 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6050c97-e228-485e-9b2e-e04588fff1aa" containerName="extract" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.137522 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="kube-rbac-proxy-ovn-metrics" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137529 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="kube-rbac-proxy-ovn-metrics" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.137537 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="nbdb" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137543 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="nbdb" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.137556 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="kube-rbac-proxy-node" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137563 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="kube-rbac-proxy-node" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.137575 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="northd" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137582 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="northd" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.137595 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6050c97-e228-485e-9b2e-e04588fff1aa" containerName="util" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137601 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6050c97-e228-485e-9b2e-e04588fff1aa" containerName="util" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.137613 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="sbdb" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137620 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="sbdb" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.137629 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="ovn-acl-logging" Mar 17 
00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137636 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="ovn-acl-logging" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.137646 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6050c97-e228-485e-9b2e-e04588fff1aa" containerName="pull" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137653 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6050c97-e228-485e-9b2e-e04588fff1aa" containerName="pull" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.137663 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="ovnkube-controller" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137669 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="ovnkube-controller" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137774 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="sbdb" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137790 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="ovn-controller" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137800 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="ovn-acl-logging" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137810 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6050c97-e228-485e-9b2e-e04588fff1aa" containerName="extract" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137818 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="ovnkube-controller" Mar 17 00:34:26 crc 
kubenswrapper[4755]: I0317 00:34:26.137827 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="northd" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137835 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="kube-rbac-proxy-node" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137844 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="kube-rbac-proxy-ovn-metrics" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.137853 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d329be-573d-4143-97fb-d07ed343c898" containerName="nbdb" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.139607 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.152748 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-kubelet\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.152811 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-ovn\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.152839 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-etc-openvswitch\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: 
\"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.152862 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-log-socket\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.152882 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-cni-bin\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.152901 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-systemd-units\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.152925 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-var-lib-cni-networks-ovn-kubernetes\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.152962 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-openvswitch\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.152993 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-run-ovn-kubernetes\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153037 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-ovnkube-script-lib\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153059 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-slash\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153076 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-run-netns\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153095 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-var-lib-openvswitch\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153122 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44d329be-573d-4143-97fb-d07ed343c898-ovn-node-metrics-cert\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: 
\"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153145 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-ovnkube-config\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153162 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-node-log\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153178 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6cxf\" (UniqueName: \"kubernetes.io/projected/44d329be-573d-4143-97fb-d07ed343c898-kube-api-access-v6cxf\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153204 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-env-overrides\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153237 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-cni-netd\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153269 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-systemd\") pod \"44d329be-573d-4143-97fb-d07ed343c898\" (UID: \"44d329be-573d-4143-97fb-d07ed343c898\") " Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153582 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153612 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153647 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-log-socket" (OuterVolumeSpecName: "log-socket") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153643 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153651 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153703 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153724 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-slash" (OuterVolumeSpecName: "host-slash") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153745 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153765 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153767 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153782 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153799 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153805 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153824 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153827 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-node-log" (OuterVolumeSpecName: "node-log") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.153989 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.154007 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.159477 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d329be-573d-4143-97fb-d07ed343c898-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.160169 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d329be-573d-4143-97fb-d07ed343c898-kube-api-access-v6cxf" (OuterVolumeSpecName: "kube-api-access-v6cxf") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "kube-api-access-v6cxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.176261 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "44d329be-573d-4143-97fb-d07ed343c898" (UID: "44d329be-573d-4143-97fb-d07ed343c898"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254505 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-slash\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254543 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-etc-openvswitch\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254565 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/080519a0-e656-4098-9fdb-472f391e0bfb-ovnkube-script-lib\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254585 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-var-lib-openvswitch\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254633 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-node-log\") pod \"ovnkube-node-w9dgl\" (UID: 
\"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254683 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/080519a0-e656-4098-9fdb-472f391e0bfb-ovn-node-metrics-cert\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254700 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-log-socket\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254720 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/080519a0-e656-4098-9fdb-472f391e0bfb-ovnkube-config\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254738 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-systemd-units\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254754 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-cni-netd\") pod 
\"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254774 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-run-ovn\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254792 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-run-systemd\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254819 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-run-ovn-kubernetes\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254843 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-run-openvswitch\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254906 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9n95\" (UniqueName: 
\"kubernetes.io/projected/080519a0-e656-4098-9fdb-472f391e0bfb-kube-api-access-f9n95\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254935 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-run-netns\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254951 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-kubelet\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254968 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.254989 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-cni-bin\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255066 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/080519a0-e656-4098-9fdb-472f391e0bfb-env-overrides\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255121 4755 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-log-socket\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255144 4755 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255154 4755 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255163 4755 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255176 4755 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255189 4755 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc 
kubenswrapper[4755]: I0317 00:34:26.255201 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255213 4755 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-slash\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255224 4755 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255235 4755 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255243 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44d329be-573d-4143-97fb-d07ed343c898-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255251 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255259 4755 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-node-log\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255267 4755 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-v6cxf\" (UniqueName: \"kubernetes.io/projected/44d329be-573d-4143-97fb-d07ed343c898-kube-api-access-v6cxf\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255276 4755 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44d329be-573d-4143-97fb-d07ed343c898-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255283 4755 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255292 4755 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255299 4755 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255307 4755 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.255315 4755 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44d329be-573d-4143-97fb-d07ed343c898-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.267346 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j6qtr_de2167ca-ad7e-47ce-bf95-cebc396df145/kube-multus/0.log" Mar 17 00:34:26 crc kubenswrapper[4755]: 
I0317 00:34:26.267426 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j6qtr" event={"ID":"de2167ca-ad7e-47ce-bf95-cebc396df145","Type":"ContainerStarted","Data":"b3a202747a1b5771af1fb3cf9251531b8899dd6b2bcb10c4cd341d091aa7d729"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.270680 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mvdzt_44d329be-573d-4143-97fb-d07ed343c898/ovn-acl-logging/0.log" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.271230 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mvdzt_44d329be-573d-4143-97fb-d07ed343c898/ovn-controller/0.log" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.271653 4755 generic.go:334] "Generic (PLEG): container finished" podID="44d329be-573d-4143-97fb-d07ed343c898" containerID="4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56" exitCode=0 Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.271731 4755 generic.go:334] "Generic (PLEG): container finished" podID="44d329be-573d-4143-97fb-d07ed343c898" containerID="76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1" exitCode=0 Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.271794 4755 generic.go:334] "Generic (PLEG): container finished" podID="44d329be-573d-4143-97fb-d07ed343c898" containerID="51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03" exitCode=0 Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.271850 4755 generic.go:334] "Generic (PLEG): container finished" podID="44d329be-573d-4143-97fb-d07ed343c898" containerID="c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd" exitCode=0 Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.271907 4755 generic.go:334] "Generic (PLEG): container finished" podID="44d329be-573d-4143-97fb-d07ed343c898" containerID="37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f" 
exitCode=0 Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.271957 4755 generic.go:334] "Generic (PLEG): container finished" podID="44d329be-573d-4143-97fb-d07ed343c898" containerID="e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c" exitCode=0 Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272011 4755 generic.go:334] "Generic (PLEG): container finished" podID="44d329be-573d-4143-97fb-d07ed343c898" containerID="bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4" exitCode=143 Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272062 4755 generic.go:334] "Generic (PLEG): container finished" podID="44d329be-573d-4143-97fb-d07ed343c898" containerID="18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750" exitCode=143 Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.271738 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.271740 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerDied","Data":"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272284 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerDied","Data":"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272303 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerDied","Data":"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272333 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerDied","Data":"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272343 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerDied","Data":"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272353 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerDied","Data":"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272364 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272370 4755 scope.go:117] "RemoveContainer" containerID="4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272375 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272480 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272497 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" 
event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerDied","Data":"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272513 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272519 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272525 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272531 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272536 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272541 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272546 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272552 4755 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272557 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272563 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerDied","Data":"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272571 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272577 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272584 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272589 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272593 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f"} Mar 17 00:34:26 crc kubenswrapper[4755]: 
I0317 00:34:26.272598 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272603 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272609 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272614 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272622 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mvdzt" event={"ID":"44d329be-573d-4143-97fb-d07ed343c898","Type":"ContainerDied","Data":"298e6b83a9fd3017c1ae942a1bb9db3386c157608c11d666d65352005c431160"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272631 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272638 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272644 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272650 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272655 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272659 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272664 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272668 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.272673 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e"} Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.286791 4755 scope.go:117] "RemoveContainer" containerID="76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.313814 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mvdzt"] Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 
00:34:26.313920 4755 scope.go:117] "RemoveContainer" containerID="51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.317400 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mvdzt"] Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.341549 4755 scope.go:117] "RemoveContainer" containerID="c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.356543 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/080519a0-e656-4098-9fdb-472f391e0bfb-ovn-node-metrics-cert\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.356909 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-log-socket\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.356963 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/080519a0-e656-4098-9fdb-472f391e0bfb-ovnkube-config\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.356996 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-systemd-units\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357038 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-cni-netd\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357072 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-run-ovn\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357091 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-run-systemd\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357157 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-run-ovn-kubernetes\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357208 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-run-openvswitch\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc 
kubenswrapper[4755]: I0317 00:34:26.357248 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9n95\" (UniqueName: \"kubernetes.io/projected/080519a0-e656-4098-9fdb-472f391e0bfb-kube-api-access-f9n95\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357323 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-run-netns\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357367 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-kubelet\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357389 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357470 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-cni-bin\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: 
I0317 00:34:26.357529 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/080519a0-e656-4098-9fdb-472f391e0bfb-env-overrides\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357557 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-slash\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357567 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-run-systemd\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357614 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-etc-openvswitch\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357709 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-etc-openvswitch\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357739 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-slash\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357783 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-run-ovn-kubernetes\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357786 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-log-socket\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357852 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357855 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/080519a0-e656-4098-9fdb-472f391e0bfb-ovnkube-script-lib\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357903 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-systemd-units\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357927 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-run-netns\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357942 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-cni-netd\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.357955 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-kubelet\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.358081 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-run-ovn\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.358131 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-run-openvswitch\") pod \"ovnkube-node-w9dgl\" (UID: 
\"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.358246 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-host-cni-bin\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.358378 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-var-lib-openvswitch\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.358414 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-node-log\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.358427 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-var-lib-openvswitch\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.358464 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/080519a0-e656-4098-9fdb-472f391e0bfb-node-log\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 
crc kubenswrapper[4755]: I0317 00:34:26.359300 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/080519a0-e656-4098-9fdb-472f391e0bfb-ovnkube-script-lib\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.362769 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/080519a0-e656-4098-9fdb-472f391e0bfb-env-overrides\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.363297 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/080519a0-e656-4098-9fdb-472f391e0bfb-ovnkube-config\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.365818 4755 scope.go:117] "RemoveContainer" containerID="37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.380409 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/080519a0-e656-4098-9fdb-472f391e0bfb-ovn-node-metrics-cert\") pod \"ovnkube-node-w9dgl\" (UID: \"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.392589 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9n95\" (UniqueName: \"kubernetes.io/projected/080519a0-e656-4098-9fdb-472f391e0bfb-kube-api-access-f9n95\") pod \"ovnkube-node-w9dgl\" (UID: 
\"080519a0-e656-4098-9fdb-472f391e0bfb\") " pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.396667 4755 scope.go:117] "RemoveContainer" containerID="e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.428580 4755 scope.go:117] "RemoveContainer" containerID="bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.456715 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.475603 4755 scope.go:117] "RemoveContainer" containerID="18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.519593 4755 scope.go:117] "RemoveContainer" containerID="73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e" Mar 17 00:34:26 crc kubenswrapper[4755]: W0317 00:34:26.541878 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod080519a0_e656_4098_9fdb_472f391e0bfb.slice/crio-847c8f8e9a200f7376858de1c8b5d6409936f310c0ec87c131f0eee049d8e3bf WatchSource:0}: Error finding container 847c8f8e9a200f7376858de1c8b5d6409936f310c0ec87c131f0eee049d8e3bf: Status 404 returned error can't find the container with id 847c8f8e9a200f7376858de1c8b5d6409936f310c0ec87c131f0eee049d8e3bf Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.557542 4755 scope.go:117] "RemoveContainer" containerID="4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.557958 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56\": container with ID 
starting with 4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56 not found: ID does not exist" containerID="4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.558000 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56"} err="failed to get container status \"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56\": rpc error: code = NotFound desc = could not find container \"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56\": container with ID starting with 4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.558033 4755 scope.go:117] "RemoveContainer" containerID="76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.558411 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1\": container with ID starting with 76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1 not found: ID does not exist" containerID="76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.558458 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1"} err="failed to get container status \"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1\": rpc error: code = NotFound desc = could not find container \"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1\": container with ID starting with 76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1 not found: 
ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.558478 4755 scope.go:117] "RemoveContainer" containerID="51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.558768 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03\": container with ID starting with 51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03 not found: ID does not exist" containerID="51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.558816 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03"} err="failed to get container status \"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03\": rpc error: code = NotFound desc = could not find container \"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03\": container with ID starting with 51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.558851 4755 scope.go:117] "RemoveContainer" containerID="c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.559103 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd\": container with ID starting with c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd not found: ID does not exist" containerID="c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.559132 4755 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd"} err="failed to get container status \"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd\": rpc error: code = NotFound desc = could not find container \"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd\": container with ID starting with c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.559146 4755 scope.go:117] "RemoveContainer" containerID="37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.559317 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f\": container with ID starting with 37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f not found: ID does not exist" containerID="37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.559362 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f"} err="failed to get container status \"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f\": rpc error: code = NotFound desc = could not find container \"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f\": container with ID starting with 37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.559379 4755 scope.go:117] "RemoveContainer" containerID="e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.559560 4755 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c\": container with ID starting with e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c not found: ID does not exist" containerID="e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.559581 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c"} err="failed to get container status \"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c\": rpc error: code = NotFound desc = could not find container \"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c\": container with ID starting with e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.559598 4755 scope.go:117] "RemoveContainer" containerID="bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.559752 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4\": container with ID starting with bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4 not found: ID does not exist" containerID="bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.559773 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4"} err="failed to get container status \"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4\": rpc error: code = NotFound desc = could 
not find container \"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4\": container with ID starting with bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.559784 4755 scope.go:117] "RemoveContainer" containerID="18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.559938 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750\": container with ID starting with 18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750 not found: ID does not exist" containerID="18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.559959 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750"} err="failed to get container status \"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750\": rpc error: code = NotFound desc = could not find container \"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750\": container with ID starting with 18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.559970 4755 scope.go:117] "RemoveContainer" containerID="73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e" Mar 17 00:34:26 crc kubenswrapper[4755]: E0317 00:34:26.560123 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e\": container with ID starting with 73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e not found: 
ID does not exist" containerID="73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.560143 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e"} err="failed to get container status \"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e\": rpc error: code = NotFound desc = could not find container \"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e\": container with ID starting with 73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.560156 4755 scope.go:117] "RemoveContainer" containerID="4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.560305 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56"} err="failed to get container status \"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56\": rpc error: code = NotFound desc = could not find container \"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56\": container with ID starting with 4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.560323 4755 scope.go:117] "RemoveContainer" containerID="76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.560483 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1"} err="failed to get container status \"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1\": rpc error: code = 
NotFound desc = could not find container \"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1\": container with ID starting with 76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.560503 4755 scope.go:117] "RemoveContainer" containerID="51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.560689 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03"} err="failed to get container status \"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03\": rpc error: code = NotFound desc = could not find container \"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03\": container with ID starting with 51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.560728 4755 scope.go:117] "RemoveContainer" containerID="c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.560912 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd"} err="failed to get container status \"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd\": rpc error: code = NotFound desc = could not find container \"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd\": container with ID starting with c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.560931 4755 scope.go:117] "RemoveContainer" containerID="37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f" Mar 17 00:34:26 crc 
kubenswrapper[4755]: I0317 00:34:26.561148 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f"} err="failed to get container status \"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f\": rpc error: code = NotFound desc = could not find container \"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f\": container with ID starting with 37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.561171 4755 scope.go:117] "RemoveContainer" containerID="e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.561329 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c"} err="failed to get container status \"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c\": rpc error: code = NotFound desc = could not find container \"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c\": container with ID starting with e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.561348 4755 scope.go:117] "RemoveContainer" containerID="bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.561514 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4"} err="failed to get container status \"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4\": rpc error: code = NotFound desc = could not find container \"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4\": container 
with ID starting with bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.561532 4755 scope.go:117] "RemoveContainer" containerID="18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.561696 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750"} err="failed to get container status \"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750\": rpc error: code = NotFound desc = could not find container \"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750\": container with ID starting with 18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.561719 4755 scope.go:117] "RemoveContainer" containerID="73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.561867 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e"} err="failed to get container status \"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e\": rpc error: code = NotFound desc = could not find container \"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e\": container with ID starting with 73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.561884 4755 scope.go:117] "RemoveContainer" containerID="4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.562032 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56"} err="failed to get container status \"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56\": rpc error: code = NotFound desc = could not find container \"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56\": container with ID starting with 4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.562049 4755 scope.go:117] "RemoveContainer" containerID="76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.562191 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1"} err="failed to get container status \"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1\": rpc error: code = NotFound desc = could not find container \"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1\": container with ID starting with 76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.562210 4755 scope.go:117] "RemoveContainer" containerID="51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.562348 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03"} err="failed to get container status \"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03\": rpc error: code = NotFound desc = could not find container \"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03\": container with ID starting with 51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03 not found: ID does not 
exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.562365 4755 scope.go:117] "RemoveContainer" containerID="c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.562530 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd"} err="failed to get container status \"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd\": rpc error: code = NotFound desc = could not find container \"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd\": container with ID starting with c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.562549 4755 scope.go:117] "RemoveContainer" containerID="37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.562709 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f"} err="failed to get container status \"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f\": rpc error: code = NotFound desc = could not find container \"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f\": container with ID starting with 37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.562726 4755 scope.go:117] "RemoveContainer" containerID="e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.562870 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c"} err="failed to get container status 
\"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c\": rpc error: code = NotFound desc = could not find container \"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c\": container with ID starting with e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.562889 4755 scope.go:117] "RemoveContainer" containerID="bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.563059 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4"} err="failed to get container status \"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4\": rpc error: code = NotFound desc = could not find container \"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4\": container with ID starting with bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.563082 4755 scope.go:117] "RemoveContainer" containerID="18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.563244 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750"} err="failed to get container status \"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750\": rpc error: code = NotFound desc = could not find container \"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750\": container with ID starting with 18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.563262 4755 scope.go:117] "RemoveContainer" 
containerID="73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.563465 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e"} err="failed to get container status \"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e\": rpc error: code = NotFound desc = could not find container \"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e\": container with ID starting with 73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.563489 4755 scope.go:117] "RemoveContainer" containerID="4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.563655 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56"} err="failed to get container status \"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56\": rpc error: code = NotFound desc = could not find container \"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56\": container with ID starting with 4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.563673 4755 scope.go:117] "RemoveContainer" containerID="76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.563818 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1"} err="failed to get container status \"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1\": rpc error: code = NotFound desc = could 
not find container \"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1\": container with ID starting with 76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.563837 4755 scope.go:117] "RemoveContainer" containerID="51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.563999 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03"} err="failed to get container status \"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03\": rpc error: code = NotFound desc = could not find container \"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03\": container with ID starting with 51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.564017 4755 scope.go:117] "RemoveContainer" containerID="c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.564167 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd"} err="failed to get container status \"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd\": rpc error: code = NotFound desc = could not find container \"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd\": container with ID starting with c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.564184 4755 scope.go:117] "RemoveContainer" containerID="37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 
00:34:26.564355 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f"} err="failed to get container status \"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f\": rpc error: code = NotFound desc = could not find container \"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f\": container with ID starting with 37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.564375 4755 scope.go:117] "RemoveContainer" containerID="e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.564586 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c"} err="failed to get container status \"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c\": rpc error: code = NotFound desc = could not find container \"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c\": container with ID starting with e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.564608 4755 scope.go:117] "RemoveContainer" containerID="bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.564863 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4"} err="failed to get container status \"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4\": rpc error: code = NotFound desc = could not find container \"bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4\": container with ID starting with 
bb19273d39dd64a616839e9acf7b15da0a734b3b671209f5a8790dd16a3c1ec4 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.564894 4755 scope.go:117] "RemoveContainer" containerID="18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.571965 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750"} err="failed to get container status \"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750\": rpc error: code = NotFound desc = could not find container \"18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750\": container with ID starting with 18baa736c1dccd0e46648190b9e00d3f0a19c55a28102a8d1da283b94c083750 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.572004 4755 scope.go:117] "RemoveContainer" containerID="73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.572279 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e"} err="failed to get container status \"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e\": rpc error: code = NotFound desc = could not find container \"73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e\": container with ID starting with 73a22814aacce4f614995e398fe27fb1839a44ab0a5cf8dcd017ca894cd6db1e not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.572324 4755 scope.go:117] "RemoveContainer" containerID="4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.572562 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56"} err="failed to get container status \"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56\": rpc error: code = NotFound desc = could not find container \"4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56\": container with ID starting with 4279f62241f615525a3d6f12fd552dae2b0247503daf45242e6a273fe7501c56 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.572581 4755 scope.go:117] "RemoveContainer" containerID="76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.572738 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1"} err="failed to get container status \"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1\": rpc error: code = NotFound desc = could not find container \"76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1\": container with ID starting with 76fef97df1e83a39db7a94e2e3028842fdfe5b87ffa0ccdcb49eaa6650df72f1 not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.572757 4755 scope.go:117] "RemoveContainer" containerID="51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.572919 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03"} err="failed to get container status \"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03\": rpc error: code = NotFound desc = could not find container \"51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03\": container with ID starting with 51f5cb9375e59a4bed9340e28f948f0022ceb7a6b0d0221e91956acce010ff03 not found: ID does not 
exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.572937 4755 scope.go:117] "RemoveContainer" containerID="c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.573111 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd"} err="failed to get container status \"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd\": rpc error: code = NotFound desc = could not find container \"c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd\": container with ID starting with c9fc25e0e8af7c1ba6eb96bb96262f6e1eed580593b775f640ace2a2358c73fd not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.573134 4755 scope.go:117] "RemoveContainer" containerID="37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.573312 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f"} err="failed to get container status \"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f\": rpc error: code = NotFound desc = could not find container \"37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f\": container with ID starting with 37e995cb3ba0e24bf317bf7269ad1db28b69875b64bba96d263d26d628a54a7f not found: ID does not exist" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.573331 4755 scope.go:117] "RemoveContainer" containerID="e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c" Mar 17 00:34:26 crc kubenswrapper[4755]: I0317 00:34:26.573506 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c"} err="failed to get container status 
\"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c\": rpc error: code = NotFound desc = could not find container \"e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c\": container with ID starting with e6e2d8038510ddbc87750a2b624466ae545b625ffe6f42552cd3fc6c42bef39c not found: ID does not exist" Mar 17 00:34:27 crc kubenswrapper[4755]: I0317 00:34:27.279832 4755 generic.go:334] "Generic (PLEG): container finished" podID="080519a0-e656-4098-9fdb-472f391e0bfb" containerID="6d663e2f083cb94c27fd7a78dd17caf9a868b81735b6ec9e4046aabd96be6fbb" exitCode=0 Mar 17 00:34:27 crc kubenswrapper[4755]: I0317 00:34:27.279898 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" event={"ID":"080519a0-e656-4098-9fdb-472f391e0bfb","Type":"ContainerDied","Data":"6d663e2f083cb94c27fd7a78dd17caf9a868b81735b6ec9e4046aabd96be6fbb"} Mar 17 00:34:27 crc kubenswrapper[4755]: I0317 00:34:27.279929 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" event={"ID":"080519a0-e656-4098-9fdb-472f391e0bfb","Type":"ContainerStarted","Data":"847c8f8e9a200f7376858de1c8b5d6409936f310c0ec87c131f0eee049d8e3bf"} Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.263745 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d329be-573d-4143-97fb-d07ed343c898" path="/var/lib/kubelet/pods/44d329be-573d-4143-97fb-d07ed343c898/volumes" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.291779 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" event={"ID":"080519a0-e656-4098-9fdb-472f391e0bfb","Type":"ContainerStarted","Data":"4178e525cd5c492d55fc061c3b059cf1202a281dd822421059629ee149f50720"} Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.291826 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" 
event={"ID":"080519a0-e656-4098-9fdb-472f391e0bfb","Type":"ContainerStarted","Data":"7f891ac599ceda94636e0aac874626569b163fc8e88185b675c20250e0a07898"} Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.291840 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" event={"ID":"080519a0-e656-4098-9fdb-472f391e0bfb","Type":"ContainerStarted","Data":"4083fbc2176fef4ddb24e78a5bcded62f93b90664c698c69e4e74d96a298e59d"} Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.291851 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" event={"ID":"080519a0-e656-4098-9fdb-472f391e0bfb","Type":"ContainerStarted","Data":"1524623d1640c025e5c6fb09f472c225ba7a7b59c54f9a03427e7242940637f1"} Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.291861 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" event={"ID":"080519a0-e656-4098-9fdb-472f391e0bfb","Type":"ContainerStarted","Data":"4a7ad807442c01ad65e41f4aab46db6227d13eb9349960838aba751efc7d0aab"} Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.291871 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" event={"ID":"080519a0-e656-4098-9fdb-472f391e0bfb","Type":"ContainerStarted","Data":"2b8df542bababc1abdb1a9da0a671a539f1c10a90d104373fdb411be1ee6932d"} Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.393776 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr"] Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.394386 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.396311 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-rglzn" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.396879 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.397225 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.495961 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgdj2\" (UniqueName: \"kubernetes.io/projected/26acf5e2-72ee-4d4c-b25b-9d641f0a42df-kube-api-access-vgdj2\") pod \"obo-prometheus-operator-68bc856cb9-bvjlr\" (UID: \"26acf5e2-72ee-4d4c-b25b-9d641f0a42df\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.517959 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk"] Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.518873 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.520991 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.521490 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-jkdc5" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.528997 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs"] Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.529785 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.596955 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdj2\" (UniqueName: \"kubernetes.io/projected/26acf5e2-72ee-4d4c-b25b-9d641f0a42df-kube-api-access-vgdj2\") pod \"obo-prometheus-operator-68bc856cb9-bvjlr\" (UID: \"26acf5e2-72ee-4d4c-b25b-9d641f0a42df\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.622277 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgdj2\" (UniqueName: \"kubernetes.io/projected/26acf5e2-72ee-4d4c-b25b-9d641f0a42df-kube-api-access-vgdj2\") pod \"obo-prometheus-operator-68bc856cb9-bvjlr\" (UID: \"26acf5e2-72ee-4d4c-b25b-9d641f0a42df\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.627823 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/observability-operator-59bdc8b94-4w5t6"] Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.628446 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.631216 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-9kvlg" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.636250 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.697747 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d5530d5-e196-42d5-b0b9-c089b13d97a8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-645f745c74-5xmqs\" (UID: \"9d5530d5-e196-42d5-b0b9-c089b13d97a8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.697828 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-645f745c74-l8tqk\" (UID: \"a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.697869 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d5530d5-e196-42d5-b0b9-c089b13d97a8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-645f745c74-5xmqs\" (UID: \"9d5530d5-e196-42d5-b0b9-c089b13d97a8\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.697913 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-645f745c74-l8tqk\" (UID: \"a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.708601 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.723390 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-czwsf"] Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.724075 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.728655 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-lfvct" Mar 17 00:34:28 crc kubenswrapper[4755]: E0317 00:34:28.740620 4755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-bvjlr_openshift-operators_26acf5e2-72ee-4d4c-b25b-9d641f0a42df_0(4d772f9b05091b6bf1bf65ea4bb3589465ae93ec84092c25f08fd6f03eea9013): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 17 00:34:28 crc kubenswrapper[4755]: E0317 00:34:28.740698 4755 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-bvjlr_openshift-operators_26acf5e2-72ee-4d4c-b25b-9d641f0a42df_0(4d772f9b05091b6bf1bf65ea4bb3589465ae93ec84092c25f08fd6f03eea9013): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" Mar 17 00:34:28 crc kubenswrapper[4755]: E0317 00:34:28.740724 4755 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-bvjlr_openshift-operators_26acf5e2-72ee-4d4c-b25b-9d641f0a42df_0(4d772f9b05091b6bf1bf65ea4bb3589465ae93ec84092c25f08fd6f03eea9013): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" Mar 17 00:34:28 crc kubenswrapper[4755]: E0317 00:34:28.740774 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-bvjlr_openshift-operators(26acf5e2-72ee-4d4c-b25b-9d641f0a42df)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-bvjlr_openshift-operators(26acf5e2-72ee-4d4c-b25b-9d641f0a42df)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-bvjlr_openshift-operators_26acf5e2-72ee-4d4c-b25b-9d641f0a42df_0(4d772f9b05091b6bf1bf65ea4bb3589465ae93ec84092c25f08fd6f03eea9013): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" podUID="26acf5e2-72ee-4d4c-b25b-9d641f0a42df" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.799428 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-645f745c74-l8tqk\" (UID: \"a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.799564 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d5530d5-e196-42d5-b0b9-c089b13d97a8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-645f745c74-5xmqs\" (UID: \"9d5530d5-e196-42d5-b0b9-c089b13d97a8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.799589 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2lxx\" (UniqueName: \"kubernetes.io/projected/24b6289e-88b6-4958-9ce1-539cecddbd1f-kube-api-access-r2lxx\") pod \"observability-operator-59bdc8b94-4w5t6\" (UID: \"24b6289e-88b6-4958-9ce1-539cecddbd1f\") " pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.800084 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-645f745c74-l8tqk\" (UID: \"a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" Mar 17 00:34:28 crc 
kubenswrapper[4755]: I0317 00:34:28.800133 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/24b6289e-88b6-4958-9ce1-539cecddbd1f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-4w5t6\" (UID: \"24b6289e-88b6-4958-9ce1-539cecddbd1f\") " pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.800154 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d5530d5-e196-42d5-b0b9-c089b13d97a8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-645f745c74-5xmqs\" (UID: \"9d5530d5-e196-42d5-b0b9-c089b13d97a8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.802070 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d5530d5-e196-42d5-b0b9-c089b13d97a8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-645f745c74-5xmqs\" (UID: \"9d5530d5-e196-42d5-b0b9-c089b13d97a8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.802401 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d5530d5-e196-42d5-b0b9-c089b13d97a8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-645f745c74-5xmqs\" (UID: \"9d5530d5-e196-42d5-b0b9-c089b13d97a8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.802767 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-645f745c74-l8tqk\" (UID: \"a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.812337 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-645f745c74-l8tqk\" (UID: \"a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.833552 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.852431 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" Mar 17 00:34:28 crc kubenswrapper[4755]: E0317 00:34:28.857484 4755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-645f745c74-l8tqk_openshift-operators_a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1_0(395eb8ad9c21ed3e2fc7f2668a7e9fde841badf54c78b70de70aed8cd29c058a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 17 00:34:28 crc kubenswrapper[4755]: E0317 00:34:28.857548 4755 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-645f745c74-l8tqk_openshift-operators_a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1_0(395eb8ad9c21ed3e2fc7f2668a7e9fde841badf54c78b70de70aed8cd29c058a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" Mar 17 00:34:28 crc kubenswrapper[4755]: E0317 00:34:28.857579 4755 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-645f745c74-l8tqk_openshift-operators_a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1_0(395eb8ad9c21ed3e2fc7f2668a7e9fde841badf54c78b70de70aed8cd29c058a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" Mar 17 00:34:28 crc kubenswrapper[4755]: E0317 00:34:28.857629 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-645f745c74-l8tqk_openshift-operators(a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-645f745c74-l8tqk_openshift-operators(a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-645f745c74-l8tqk_openshift-operators_a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1_0(395eb8ad9c21ed3e2fc7f2668a7e9fde841badf54c78b70de70aed8cd29c058a): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" podUID="a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.919666 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/24b6289e-88b6-4958-9ce1-539cecddbd1f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-4w5t6\" (UID: \"24b6289e-88b6-4958-9ce1-539cecddbd1f\") " pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.919744 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/70c01555-4d7f-426f-a9a5-fd21462252dc-openshift-service-ca\") pod \"perses-operator-5bf474d74f-czwsf\" (UID: \"70c01555-4d7f-426f-a9a5-fd21462252dc\") " pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.919806 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nktcx\" (UniqueName: \"kubernetes.io/projected/70c01555-4d7f-426f-a9a5-fd21462252dc-kube-api-access-nktcx\") pod \"perses-operator-5bf474d74f-czwsf\" (UID: \"70c01555-4d7f-426f-a9a5-fd21462252dc\") " pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.919866 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2lxx\" (UniqueName: \"kubernetes.io/projected/24b6289e-88b6-4958-9ce1-539cecddbd1f-kube-api-access-r2lxx\") pod \"observability-operator-59bdc8b94-4w5t6\" (UID: \"24b6289e-88b6-4958-9ce1-539cecddbd1f\") " pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.923146 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/24b6289e-88b6-4958-9ce1-539cecddbd1f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-4w5t6\" (UID: \"24b6289e-88b6-4958-9ce1-539cecddbd1f\") " pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:34:28 crc kubenswrapper[4755]: E0317 00:34:28.925770 4755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-645f745c74-5xmqs_openshift-operators_9d5530d5-e196-42d5-b0b9-c089b13d97a8_0(2ee9df2cdd5a06180d0253ab1df90820cbee00e99224a66fdf557a2c971b9b0c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 00:34:28 crc kubenswrapper[4755]: E0317 00:34:28.925819 4755 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-645f745c74-5xmqs_openshift-operators_9d5530d5-e196-42d5-b0b9-c089b13d97a8_0(2ee9df2cdd5a06180d0253ab1df90820cbee00e99224a66fdf557a2c971b9b0c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" Mar 17 00:34:28 crc kubenswrapper[4755]: E0317 00:34:28.925837 4755 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-645f745c74-5xmqs_openshift-operators_9d5530d5-e196-42d5-b0b9-c089b13d97a8_0(2ee9df2cdd5a06180d0253ab1df90820cbee00e99224a66fdf557a2c971b9b0c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" Mar 17 00:34:28 crc kubenswrapper[4755]: E0317 00:34:28.925906 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-645f745c74-5xmqs_openshift-operators(9d5530d5-e196-42d5-b0b9-c089b13d97a8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-645f745c74-5xmqs_openshift-operators(9d5530d5-e196-42d5-b0b9-c089b13d97a8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-645f745c74-5xmqs_openshift-operators_9d5530d5-e196-42d5-b0b9-c089b13d97a8_0(2ee9df2cdd5a06180d0253ab1df90820cbee00e99224a66fdf557a2c971b9b0c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" podUID="9d5530d5-e196-42d5-b0b9-c089b13d97a8" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.945134 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2lxx\" (UniqueName: \"kubernetes.io/projected/24b6289e-88b6-4958-9ce1-539cecddbd1f-kube-api-access-r2lxx\") pod \"observability-operator-59bdc8b94-4w5t6\" (UID: \"24b6289e-88b6-4958-9ce1-539cecddbd1f\") " pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:34:28 crc kubenswrapper[4755]: I0317 00:34:28.958840 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:34:28 crc kubenswrapper[4755]: E0317 00:34:28.982486 4755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-4w5t6_openshift-operators_24b6289e-88b6-4958-9ce1-539cecddbd1f_0(40cacbc66d683fcba10f88b56aa6d85ba8a1497b77ce035b89a49e14559d6725): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 00:34:28 crc kubenswrapper[4755]: E0317 00:34:28.982569 4755 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-4w5t6_openshift-operators_24b6289e-88b6-4958-9ce1-539cecddbd1f_0(40cacbc66d683fcba10f88b56aa6d85ba8a1497b77ce035b89a49e14559d6725): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:34:28 crc kubenswrapper[4755]: E0317 00:34:28.982594 4755 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-4w5t6_openshift-operators_24b6289e-88b6-4958-9ce1-539cecddbd1f_0(40cacbc66d683fcba10f88b56aa6d85ba8a1497b77ce035b89a49e14559d6725): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:34:28 crc kubenswrapper[4755]: E0317 00:34:28.982642 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-4w5t6_openshift-operators(24b6289e-88b6-4958-9ce1-539cecddbd1f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-4w5t6_openshift-operators(24b6289e-88b6-4958-9ce1-539cecddbd1f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-4w5t6_openshift-operators_24b6289e-88b6-4958-9ce1-539cecddbd1f_0(40cacbc66d683fcba10f88b56aa6d85ba8a1497b77ce035b89a49e14559d6725): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" podUID="24b6289e-88b6-4958-9ce1-539cecddbd1f" Mar 17 00:34:29 crc kubenswrapper[4755]: I0317 00:34:29.021180 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/70c01555-4d7f-426f-a9a5-fd21462252dc-openshift-service-ca\") pod \"perses-operator-5bf474d74f-czwsf\" (UID: \"70c01555-4d7f-426f-a9a5-fd21462252dc\") " pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:34:29 crc kubenswrapper[4755]: I0317 00:34:29.021302 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nktcx\" (UniqueName: \"kubernetes.io/projected/70c01555-4d7f-426f-a9a5-fd21462252dc-kube-api-access-nktcx\") pod \"perses-operator-5bf474d74f-czwsf\" (UID: \"70c01555-4d7f-426f-a9a5-fd21462252dc\") " pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:34:29 crc kubenswrapper[4755]: I0317 00:34:29.022871 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/70c01555-4d7f-426f-a9a5-fd21462252dc-openshift-service-ca\") pod \"perses-operator-5bf474d74f-czwsf\" (UID: \"70c01555-4d7f-426f-a9a5-fd21462252dc\") " pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:34:29 crc kubenswrapper[4755]: I0317 00:34:29.038078 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nktcx\" (UniqueName: \"kubernetes.io/projected/70c01555-4d7f-426f-a9a5-fd21462252dc-kube-api-access-nktcx\") pod \"perses-operator-5bf474d74f-czwsf\" (UID: \"70c01555-4d7f-426f-a9a5-fd21462252dc\") " pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:34:29 crc kubenswrapper[4755]: I0317 00:34:29.079238 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:34:29 crc kubenswrapper[4755]: E0317 00:34:29.105163 4755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-czwsf_openshift-operators_70c01555-4d7f-426f-a9a5-fd21462252dc_0(30550d3d3bc23a743e28a6b6ddf82e541fa402f929afa666414051ccedeb8cf0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 00:34:29 crc kubenswrapper[4755]: E0317 00:34:29.105250 4755 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-czwsf_openshift-operators_70c01555-4d7f-426f-a9a5-fd21462252dc_0(30550d3d3bc23a743e28a6b6ddf82e541fa402f929afa666414051ccedeb8cf0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:34:29 crc kubenswrapper[4755]: E0317 00:34:29.105331 4755 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-czwsf_openshift-operators_70c01555-4d7f-426f-a9a5-fd21462252dc_0(30550d3d3bc23a743e28a6b6ddf82e541fa402f929afa666414051ccedeb8cf0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:34:29 crc kubenswrapper[4755]: E0317 00:34:29.105404 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-czwsf_openshift-operators(70c01555-4d7f-426f-a9a5-fd21462252dc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-czwsf_openshift-operators(70c01555-4d7f-426f-a9a5-fd21462252dc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-czwsf_openshift-operators_70c01555-4d7f-426f-a9a5-fd21462252dc_0(30550d3d3bc23a743e28a6b6ddf82e541fa402f929afa666414051ccedeb8cf0): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-czwsf" podUID="70c01555-4d7f-426f-a9a5-fd21462252dc" Mar 17 00:34:30 crc kubenswrapper[4755]: I0317 00:34:30.305108 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" event={"ID":"080519a0-e656-4098-9fdb-472f391e0bfb","Type":"ContainerStarted","Data":"c722e187bd96f528eadd5bc9da116a15a51a8ec594b300a12b1ca6d2a57a5fe9"} Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.328272 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" event={"ID":"080519a0-e656-4098-9fdb-472f391e0bfb","Type":"ContainerStarted","Data":"a3904c5388768985dfd1d0355d52c13135b9496ff3930ca6a59871704ae5e4f9"} Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.328946 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.328970 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.328985 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.337565 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-czwsf"] Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.337705 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.338143 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.351199 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr"] Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.351345 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.351863 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.370926 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-4w5t6"] Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.371056 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.371479 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.379364 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs"] Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.379768 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.380249 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.384510 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk"] Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.384628 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.385007 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.387667 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" podStartSLOduration=7.387654737 podStartE2EDuration="7.387654737s" podCreationTimestamp="2026-03-17 00:34:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:34:33.381815765 +0000 UTC m=+748.141268048" watchObservedRunningTime="2026-03-17 00:34:33.387654737 +0000 UTC m=+748.147107020" Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.392609 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:33 crc kubenswrapper[4755]: I0317 00:34:33.412916 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.442387 4755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_obo-prometheus-operator-68bc856cb9-bvjlr_openshift-operators_26acf5e2-72ee-4d4c-b25b-9d641f0a42df_0(c796bded63733b9800bf8b0d54b9751c7a4e5dd82ac92109d06923841efb6a43): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.442484 4755 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-bvjlr_openshift-operators_26acf5e2-72ee-4d4c-b25b-9d641f0a42df_0(c796bded63733b9800bf8b0d54b9751c7a4e5dd82ac92109d06923841efb6a43): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.442512 4755 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-bvjlr_openshift-operators_26acf5e2-72ee-4d4c-b25b-9d641f0a42df_0(c796bded63733b9800bf8b0d54b9751c7a4e5dd82ac92109d06923841efb6a43): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.442614 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-bvjlr_openshift-operators(26acf5e2-72ee-4d4c-b25b-9d641f0a42df)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-bvjlr_openshift-operators(26acf5e2-72ee-4d4c-b25b-9d641f0a42df)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-bvjlr_openshift-operators_26acf5e2-72ee-4d4c-b25b-9d641f0a42df_0(c796bded63733b9800bf8b0d54b9751c7a4e5dd82ac92109d06923841efb6a43): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" podUID="26acf5e2-72ee-4d4c-b25b-9d641f0a42df" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.473021 4755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-czwsf_openshift-operators_70c01555-4d7f-426f-a9a5-fd21462252dc_0(eab204faae914640dcabac1ecc267d5c36a3b5b8eb148373fd4cb8854c8dc9d0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.473077 4755 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-czwsf_openshift-operators_70c01555-4d7f-426f-a9a5-fd21462252dc_0(eab204faae914640dcabac1ecc267d5c36a3b5b8eb148373fd4cb8854c8dc9d0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.473098 4755 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-czwsf_openshift-operators_70c01555-4d7f-426f-a9a5-fd21462252dc_0(eab204faae914640dcabac1ecc267d5c36a3b5b8eb148373fd4cb8854c8dc9d0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.473140 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-czwsf_openshift-operators(70c01555-4d7f-426f-a9a5-fd21462252dc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-czwsf_openshift-operators(70c01555-4d7f-426f-a9a5-fd21462252dc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-czwsf_openshift-operators_70c01555-4d7f-426f-a9a5-fd21462252dc_0(eab204faae914640dcabac1ecc267d5c36a3b5b8eb148373fd4cb8854c8dc9d0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-czwsf" podUID="70c01555-4d7f-426f-a9a5-fd21462252dc" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.481940 4755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-4w5t6_openshift-operators_24b6289e-88b6-4958-9ce1-539cecddbd1f_0(60e65d2ff3ac574c877407ca4ef0be5e41604d92ba8efda81eb6f42d1fa45c32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.482005 4755 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-4w5t6_openshift-operators_24b6289e-88b6-4958-9ce1-539cecddbd1f_0(60e65d2ff3ac574c877407ca4ef0be5e41604d92ba8efda81eb6f42d1fa45c32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.482026 4755 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-4w5t6_openshift-operators_24b6289e-88b6-4958-9ce1-539cecddbd1f_0(60e65d2ff3ac574c877407ca4ef0be5e41604d92ba8efda81eb6f42d1fa45c32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.482078 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-4w5t6_openshift-operators(24b6289e-88b6-4958-9ce1-539cecddbd1f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-4w5t6_openshift-operators(24b6289e-88b6-4958-9ce1-539cecddbd1f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-4w5t6_openshift-operators_24b6289e-88b6-4958-9ce1-539cecddbd1f_0(60e65d2ff3ac574c877407ca4ef0be5e41604d92ba8efda81eb6f42d1fa45c32): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" podUID="24b6289e-88b6-4958-9ce1-539cecddbd1f" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.493295 4755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-645f745c74-5xmqs_openshift-operators_9d5530d5-e196-42d5-b0b9-c089b13d97a8_0(725d181c93d2462e88d807522c75245434634c33136910c7f2f2ee722bb31662): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.493426 4755 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-645f745c74-5xmqs_openshift-operators_9d5530d5-e196-42d5-b0b9-c089b13d97a8_0(725d181c93d2462e88d807522c75245434634c33136910c7f2f2ee722bb31662): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.493562 4755 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-645f745c74-5xmqs_openshift-operators_9d5530d5-e196-42d5-b0b9-c089b13d97a8_0(725d181c93d2462e88d807522c75245434634c33136910c7f2f2ee722bb31662): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.493677 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-645f745c74-5xmqs_openshift-operators(9d5530d5-e196-42d5-b0b9-c089b13d97a8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-645f745c74-5xmqs_openshift-operators(9d5530d5-e196-42d5-b0b9-c089b13d97a8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-645f745c74-5xmqs_openshift-operators_9d5530d5-e196-42d5-b0b9-c089b13d97a8_0(725d181c93d2462e88d807522c75245434634c33136910c7f2f2ee722bb31662): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" podUID="9d5530d5-e196-42d5-b0b9-c089b13d97a8" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.499652 4755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-645f745c74-l8tqk_openshift-operators_a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1_0(f09d880d3ff527e383cca0a56c01720502665ac4917d5b0a1640eee8c3fdb920): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.499845 4755 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-645f745c74-l8tqk_openshift-operators_a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1_0(f09d880d3ff527e383cca0a56c01720502665ac4917d5b0a1640eee8c3fdb920): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.499918 4755 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-645f745c74-l8tqk_openshift-operators_a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1_0(f09d880d3ff527e383cca0a56c01720502665ac4917d5b0a1640eee8c3fdb920): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" Mar 17 00:34:33 crc kubenswrapper[4755]: E0317 00:34:33.500020 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-645f745c74-l8tqk_openshift-operators(a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-645f745c74-l8tqk_openshift-operators(a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-645f745c74-l8tqk_openshift-operators_a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1_0(f09d880d3ff527e383cca0a56c01720502665ac4917d5b0a1640eee8c3fdb920): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" podUID="a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1" Mar 17 00:34:45 crc kubenswrapper[4755]: I0317 00:34:45.248009 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" Mar 17 00:34:45 crc kubenswrapper[4755]: I0317 00:34:45.249040 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" Mar 17 00:34:45 crc kubenswrapper[4755]: I0317 00:34:45.521410 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs"] Mar 17 00:34:45 crc kubenswrapper[4755]: W0317 00:34:45.526654 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d5530d5_e196_42d5_b0b9_c089b13d97a8.slice/crio-6b8fb859d1e62c8585aa8cb9d7c3376e3dbdfd517cac46c2a59c351b97e4252f WatchSource:0}: Error finding container 6b8fb859d1e62c8585aa8cb9d7c3376e3dbdfd517cac46c2a59c351b97e4252f: Status 404 returned error can't find the container with id 6b8fb859d1e62c8585aa8cb9d7c3376e3dbdfd517cac46c2a59c351b97e4252f Mar 17 00:34:46 crc kubenswrapper[4755]: I0317 00:34:46.247246 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" Mar 17 00:34:46 crc kubenswrapper[4755]: I0317 00:34:46.252915 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" Mar 17 00:34:46 crc kubenswrapper[4755]: I0317 00:34:46.406660 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" event={"ID":"9d5530d5-e196-42d5-b0b9-c089b13d97a8","Type":"ContainerStarted","Data":"6b8fb859d1e62c8585aa8cb9d7c3376e3dbdfd517cac46c2a59c351b97e4252f"} Mar 17 00:34:46 crc kubenswrapper[4755]: I0317 00:34:46.579921 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr"] Mar 17 00:34:46 crc kubenswrapper[4755]: W0317 00:34:46.585990 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26acf5e2_72ee_4d4c_b25b_9d641f0a42df.slice/crio-d0259fb4a2a05d2d81c6867d08d5471510eab743bbe929dc1c8af53d30d6974c WatchSource:0}: Error finding container d0259fb4a2a05d2d81c6867d08d5471510eab743bbe929dc1c8af53d30d6974c: Status 404 returned error can't find the container with id d0259fb4a2a05d2d81c6867d08d5471510eab743bbe929dc1c8af53d30d6974c Mar 17 00:34:47 crc kubenswrapper[4755]: I0317 00:34:47.248400 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" Mar 17 00:34:47 crc kubenswrapper[4755]: I0317 00:34:47.249064 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:34:47 crc kubenswrapper[4755]: I0317 00:34:47.249179 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" Mar 17 00:34:47 crc kubenswrapper[4755]: I0317 00:34:47.249327 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:34:47 crc kubenswrapper[4755]: I0317 00:34:47.417905 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" event={"ID":"26acf5e2-72ee-4d4c-b25b-9d641f0a42df","Type":"ContainerStarted","Data":"d0259fb4a2a05d2d81c6867d08d5471510eab743bbe929dc1c8af53d30d6974c"} Mar 17 00:34:47 crc kubenswrapper[4755]: I0317 00:34:47.510948 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-4w5t6"] Mar 17 00:34:47 crc kubenswrapper[4755]: W0317 00:34:47.526658 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24b6289e_88b6_4958_9ce1_539cecddbd1f.slice/crio-605721d155cd4b813c7fefd052b6906e1b8885e564c910ed98802a0a4c937228 WatchSource:0}: Error finding container 605721d155cd4b813c7fefd052b6906e1b8885e564c910ed98802a0a4c937228: Status 404 returned error can't find the container with id 605721d155cd4b813c7fefd052b6906e1b8885e564c910ed98802a0a4c937228 Mar 17 00:34:47 crc kubenswrapper[4755]: I0317 00:34:47.577861 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk"] Mar 17 00:34:48 crc kubenswrapper[4755]: I0317 00:34:48.431352 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" event={"ID":"24b6289e-88b6-4958-9ce1-539cecddbd1f","Type":"ContainerStarted","Data":"605721d155cd4b813c7fefd052b6906e1b8885e564c910ed98802a0a4c937228"} Mar 17 00:34:49 crc kubenswrapper[4755]: I0317 00:34:49.248341 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:34:49 crc kubenswrapper[4755]: I0317 00:34:49.248694 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:34:49 crc kubenswrapper[4755]: W0317 00:34:49.340840 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda24f3f6d_abc7_4fc8_b0c1_609a0fdb55c1.slice/crio-a6560ad97afb7868eb05d1e8744aac4c261dc0dfdc0d70a01c8a2bc53455ece4 WatchSource:0}: Error finding container a6560ad97afb7868eb05d1e8744aac4c261dc0dfdc0d70a01c8a2bc53455ece4: Status 404 returned error can't find the container with id a6560ad97afb7868eb05d1e8744aac4c261dc0dfdc0d70a01c8a2bc53455ece4 Mar 17 00:34:49 crc kubenswrapper[4755]: I0317 00:34:49.436966 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" event={"ID":"a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1","Type":"ContainerStarted","Data":"a6560ad97afb7868eb05d1e8744aac4c261dc0dfdc0d70a01c8a2bc53455ece4"} Mar 17 00:34:51 crc kubenswrapper[4755]: I0317 00:34:51.164785 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-czwsf"] Mar 17 00:34:51 crc kubenswrapper[4755]: W0317 00:34:51.173169 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70c01555_4d7f_426f_a9a5_fd21462252dc.slice/crio-f54a7cc6bc50679bf747824478ddbaaaad0d0b13333269be79a1554a6cdf8344 WatchSource:0}: Error finding container f54a7cc6bc50679bf747824478ddbaaaad0d0b13333269be79a1554a6cdf8344: Status 404 returned error can't find the container with id f54a7cc6bc50679bf747824478ddbaaaad0d0b13333269be79a1554a6cdf8344 Mar 17 00:34:51 crc kubenswrapper[4755]: I0317 00:34:51.453873 4755 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" event={"ID":"26acf5e2-72ee-4d4c-b25b-9d641f0a42df","Type":"ContainerStarted","Data":"995866ee91107fa55deba3eeefefaf97a9946215d93b7fe2a92766b848e64dee"} Mar 17 00:34:51 crc kubenswrapper[4755]: I0317 00:34:51.458377 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" event={"ID":"a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1","Type":"ContainerStarted","Data":"5f25ccfc13d1aeb9089dfef4f94bdd598a05d9461eb856596cf94c6f290fda08"} Mar 17 00:34:51 crc kubenswrapper[4755]: I0317 00:34:51.460287 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" event={"ID":"9d5530d5-e196-42d5-b0b9-c089b13d97a8","Type":"ContainerStarted","Data":"b478db02ec62e5cfa22105b605c33109d7e5c6f640858b20c44d31b069ad1b0f"} Mar 17 00:34:51 crc kubenswrapper[4755]: I0317 00:34:51.462204 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-czwsf" event={"ID":"70c01555-4d7f-426f-a9a5-fd21462252dc","Type":"ContainerStarted","Data":"f54a7cc6bc50679bf747824478ddbaaaad0d0b13333269be79a1554a6cdf8344"} Mar 17 00:34:51 crc kubenswrapper[4755]: I0317 00:34:51.477415 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-bvjlr" podStartSLOduration=19.110936159 podStartE2EDuration="23.477401097s" podCreationTimestamp="2026-03-17 00:34:28 +0000 UTC" firstStartedPulling="2026-03-17 00:34:46.588937893 +0000 UTC m=+761.348390176" lastFinishedPulling="2026-03-17 00:34:50.955402831 +0000 UTC m=+765.714855114" observedRunningTime="2026-03-17 00:34:51.476533063 +0000 UTC m=+766.235985376" watchObservedRunningTime="2026-03-17 00:34:51.477401097 +0000 UTC m=+766.236853380" Mar 17 00:34:51 crc kubenswrapper[4755]: I0317 00:34:51.504694 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-5xmqs" podStartSLOduration=18.078376354 podStartE2EDuration="23.504673935s" podCreationTimestamp="2026-03-17 00:34:28 +0000 UTC" firstStartedPulling="2026-03-17 00:34:45.52872096 +0000 UTC m=+760.288173243" lastFinishedPulling="2026-03-17 00:34:50.955018541 +0000 UTC m=+765.714470824" observedRunningTime="2026-03-17 00:34:51.501083166 +0000 UTC m=+766.260535459" watchObservedRunningTime="2026-03-17 00:34:51.504673935 +0000 UTC m=+766.264126228" Mar 17 00:34:51 crc kubenswrapper[4755]: I0317 00:34:51.565428 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-645f745c74-l8tqk" podStartSLOduration=21.951746171 podStartE2EDuration="23.565404156s" podCreationTimestamp="2026-03-17 00:34:28 +0000 UTC" firstStartedPulling="2026-03-17 00:34:49.343118924 +0000 UTC m=+764.102571207" lastFinishedPulling="2026-03-17 00:34:50.956776909 +0000 UTC m=+765.716229192" observedRunningTime="2026-03-17 00:34:51.562815234 +0000 UTC m=+766.322267537" watchObservedRunningTime="2026-03-17 00:34:51.565404156 +0000 UTC m=+766.324856459" Mar 17 00:34:56 crc kubenswrapper[4755]: I0317 00:34:56.482813 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w9dgl" Mar 17 00:34:56 crc kubenswrapper[4755]: I0317 00:34:56.498326 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" event={"ID":"24b6289e-88b6-4958-9ce1-539cecddbd1f","Type":"ContainerStarted","Data":"88b4164c09a0e6f7559bdfdc6c8b16d5be097dfef9c2cab40212c9ae31ca42b7"} Mar 17 00:34:56 crc kubenswrapper[4755]: I0317 00:34:56.498574 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:34:56 crc 
kubenswrapper[4755]: I0317 00:34:56.499455 4755 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-4w5t6 container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.21:8081/healthz\": dial tcp 10.217.0.21:8081: connect: connection refused" start-of-body= Mar 17 00:34:56 crc kubenswrapper[4755]: I0317 00:34:56.499490 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" podUID="24b6289e-88b6-4958-9ce1-539cecddbd1f" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.21:8081/healthz\": dial tcp 10.217.0.21:8081: connect: connection refused" Mar 17 00:34:56 crc kubenswrapper[4755]: I0317 00:34:56.500060 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-czwsf" event={"ID":"70c01555-4d7f-426f-a9a5-fd21462252dc","Type":"ContainerStarted","Data":"ea2f624e93557f9f35e3005b6d595cd8fe40b85d24530096e35ffaaacdf9a180"} Mar 17 00:34:56 crc kubenswrapper[4755]: I0317 00:34:56.500409 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:34:56 crc kubenswrapper[4755]: I0317 00:34:56.551417 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-czwsf" podStartSLOduration=23.657115895 podStartE2EDuration="28.551400741s" podCreationTimestamp="2026-03-17 00:34:28 +0000 UTC" firstStartedPulling="2026-03-17 00:34:51.192518965 +0000 UTC m=+765.951971248" lastFinishedPulling="2026-03-17 00:34:56.086803811 +0000 UTC m=+770.846256094" observedRunningTime="2026-03-17 00:34:56.550896886 +0000 UTC m=+771.310349169" watchObservedRunningTime="2026-03-17 00:34:56.551400741 +0000 UTC m=+771.310853024" Mar 17 00:34:56 crc kubenswrapper[4755]: I0317 00:34:56.568089 4755 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" podStartSLOduration=19.956482524 podStartE2EDuration="28.568065605s" podCreationTimestamp="2026-03-17 00:34:28 +0000 UTC" firstStartedPulling="2026-03-17 00:34:47.532546724 +0000 UTC m=+762.291999007" lastFinishedPulling="2026-03-17 00:34:56.144129795 +0000 UTC m=+770.903582088" observedRunningTime="2026-03-17 00:34:56.566167291 +0000 UTC m=+771.325619574" watchObservedRunningTime="2026-03-17 00:34:56.568065605 +0000 UTC m=+771.327517888" Mar 17 00:34:57 crc kubenswrapper[4755]: I0317 00:34:57.507575 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-4w5t6" Mar 17 00:35:01 crc kubenswrapper[4755]: I0317 00:35:01.747831 4755 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 17 00:35:07 crc kubenswrapper[4755]: I0317 00:35:07.884408 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-676tx"] Mar 17 00:35:07 crc kubenswrapper[4755]: I0317 00:35:07.890982 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-676tx" Mar 17 00:35:07 crc kubenswrapper[4755]: I0317 00:35:07.895644 4755 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-n2p2k" Mar 17 00:35:07 crc kubenswrapper[4755]: I0317 00:35:07.895900 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 17 00:35:07 crc kubenswrapper[4755]: I0317 00:35:07.896135 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 17 00:35:07 crc kubenswrapper[4755]: I0317 00:35:07.898704 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-676tx"] Mar 17 00:35:07 crc kubenswrapper[4755]: I0317 00:35:07.909841 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-7fzpr"] Mar 17 00:35:07 crc kubenswrapper[4755]: I0317 00:35:07.911041 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7fzpr" Mar 17 00:35:07 crc kubenswrapper[4755]: I0317 00:35:07.913386 4755 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-ks548" Mar 17 00:35:07 crc kubenswrapper[4755]: I0317 00:35:07.928007 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7fzpr"] Mar 17 00:35:07 crc kubenswrapper[4755]: I0317 00:35:07.942484 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vtlkz"] Mar 17 00:35:07 crc kubenswrapper[4755]: I0317 00:35:07.943225 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-vtlkz" Mar 17 00:35:07 crc kubenswrapper[4755]: I0317 00:35:07.944675 4755 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-m2rv8" Mar 17 00:35:07 crc kubenswrapper[4755]: I0317 00:35:07.953002 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vtlkz"] Mar 17 00:35:08 crc kubenswrapper[4755]: I0317 00:35:08.068131 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckf4l\" (UniqueName: \"kubernetes.io/projected/6725f6d6-96db-4e53-b4b7-b14e32c3160d-kube-api-access-ckf4l\") pod \"cert-manager-webhook-687f57d79b-vtlkz\" (UID: \"6725f6d6-96db-4e53-b4b7-b14e32c3160d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vtlkz" Mar 17 00:35:08 crc kubenswrapper[4755]: I0317 00:35:08.068285 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbg6m\" (UniqueName: \"kubernetes.io/projected/1d430575-aa06-4c37-8262-d01a1d1766b7-kube-api-access-zbg6m\") pod \"cert-manager-858654f9db-7fzpr\" (UID: \"1d430575-aa06-4c37-8262-d01a1d1766b7\") " pod="cert-manager/cert-manager-858654f9db-7fzpr" Mar 17 00:35:08 crc kubenswrapper[4755]: I0317 00:35:08.068362 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cshh2\" (UniqueName: \"kubernetes.io/projected/2d8af759-0406-4291-b488-291e4db0f5ff-kube-api-access-cshh2\") pod \"cert-manager-cainjector-cf98fcc89-676tx\" (UID: \"2d8af759-0406-4291-b488-291e4db0f5ff\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-676tx" Mar 17 00:35:08 crc kubenswrapper[4755]: I0317 00:35:08.169302 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cshh2\" (UniqueName: 
\"kubernetes.io/projected/2d8af759-0406-4291-b488-291e4db0f5ff-kube-api-access-cshh2\") pod \"cert-manager-cainjector-cf98fcc89-676tx\" (UID: \"2d8af759-0406-4291-b488-291e4db0f5ff\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-676tx" Mar 17 00:35:08 crc kubenswrapper[4755]: I0317 00:35:08.169470 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckf4l\" (UniqueName: \"kubernetes.io/projected/6725f6d6-96db-4e53-b4b7-b14e32c3160d-kube-api-access-ckf4l\") pod \"cert-manager-webhook-687f57d79b-vtlkz\" (UID: \"6725f6d6-96db-4e53-b4b7-b14e32c3160d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vtlkz" Mar 17 00:35:08 crc kubenswrapper[4755]: I0317 00:35:08.169537 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbg6m\" (UniqueName: \"kubernetes.io/projected/1d430575-aa06-4c37-8262-d01a1d1766b7-kube-api-access-zbg6m\") pod \"cert-manager-858654f9db-7fzpr\" (UID: \"1d430575-aa06-4c37-8262-d01a1d1766b7\") " pod="cert-manager/cert-manager-858654f9db-7fzpr" Mar 17 00:35:08 crc kubenswrapper[4755]: I0317 00:35:08.195969 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbg6m\" (UniqueName: \"kubernetes.io/projected/1d430575-aa06-4c37-8262-d01a1d1766b7-kube-api-access-zbg6m\") pod \"cert-manager-858654f9db-7fzpr\" (UID: \"1d430575-aa06-4c37-8262-d01a1d1766b7\") " pod="cert-manager/cert-manager-858654f9db-7fzpr" Mar 17 00:35:08 crc kubenswrapper[4755]: I0317 00:35:08.199381 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckf4l\" (UniqueName: \"kubernetes.io/projected/6725f6d6-96db-4e53-b4b7-b14e32c3160d-kube-api-access-ckf4l\") pod \"cert-manager-webhook-687f57d79b-vtlkz\" (UID: \"6725f6d6-96db-4e53-b4b7-b14e32c3160d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vtlkz" Mar 17 00:35:08 crc kubenswrapper[4755]: I0317 00:35:08.203381 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cshh2\" (UniqueName: \"kubernetes.io/projected/2d8af759-0406-4291-b488-291e4db0f5ff-kube-api-access-cshh2\") pod \"cert-manager-cainjector-cf98fcc89-676tx\" (UID: \"2d8af759-0406-4291-b488-291e4db0f5ff\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-676tx" Mar 17 00:35:08 crc kubenswrapper[4755]: I0317 00:35:08.215330 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-676tx" Mar 17 00:35:08 crc kubenswrapper[4755]: I0317 00:35:08.231507 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7fzpr" Mar 17 00:35:08 crc kubenswrapper[4755]: I0317 00:35:08.255903 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-vtlkz" Mar 17 00:35:08 crc kubenswrapper[4755]: I0317 00:35:08.636274 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vtlkz"] Mar 17 00:35:08 crc kubenswrapper[4755]: W0317 00:35:08.638187 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6725f6d6_96db_4e53_b4b7_b14e32c3160d.slice/crio-e326af6a0797f43459646d4c7380a21370d69dad6d35a8b94055286bd2efeb44 WatchSource:0}: Error finding container e326af6a0797f43459646d4c7380a21370d69dad6d35a8b94055286bd2efeb44: Status 404 returned error can't find the container with id e326af6a0797f43459646d4c7380a21370d69dad6d35a8b94055286bd2efeb44 Mar 17 00:35:08 crc kubenswrapper[4755]: I0317 00:35:08.640379 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 00:35:08 crc kubenswrapper[4755]: I0317 00:35:08.708264 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-676tx"] Mar 17 00:35:08 crc 
kubenswrapper[4755]: W0317 00:35:08.709073 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d8af759_0406_4291_b488_291e4db0f5ff.slice/crio-3cb283ecb0609bfb3f799ceb174102adb82f0a7676affcac8a25e777a56795a1 WatchSource:0}: Error finding container 3cb283ecb0609bfb3f799ceb174102adb82f0a7676affcac8a25e777a56795a1: Status 404 returned error can't find the container with id 3cb283ecb0609bfb3f799ceb174102adb82f0a7676affcac8a25e777a56795a1 Mar 17 00:35:08 crc kubenswrapper[4755]: I0317 00:35:08.727275 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7fzpr"] Mar 17 00:35:09 crc kubenswrapper[4755]: I0317 00:35:09.082938 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-czwsf" Mar 17 00:35:09 crc kubenswrapper[4755]: I0317 00:35:09.594919 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-vtlkz" event={"ID":"6725f6d6-96db-4e53-b4b7-b14e32c3160d","Type":"ContainerStarted","Data":"e326af6a0797f43459646d4c7380a21370d69dad6d35a8b94055286bd2efeb44"} Mar 17 00:35:09 crc kubenswrapper[4755]: I0317 00:35:09.596006 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-7fzpr" event={"ID":"1d430575-aa06-4c37-8262-d01a1d1766b7","Type":"ContainerStarted","Data":"04c3cccc5a0c3e8af1d1dd08ff5676217b52a595f8f2642a3174ebe232b6ec84"} Mar 17 00:35:09 crc kubenswrapper[4755]: I0317 00:35:09.596999 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-676tx" event={"ID":"2d8af759-0406-4291-b488-291e4db0f5ff","Type":"ContainerStarted","Data":"3cb283ecb0609bfb3f799ceb174102adb82f0a7676affcac8a25e777a56795a1"} Mar 17 00:35:10 crc kubenswrapper[4755]: I0317 00:35:10.239133 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-8hwdb"] Mar 17 00:35:10 crc kubenswrapper[4755]: I0317 00:35:10.241675 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hwdb" Mar 17 00:35:10 crc kubenswrapper[4755]: I0317 00:35:10.256697 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hwdb"] Mar 17 00:35:10 crc kubenswrapper[4755]: I0317 00:35:10.408936 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc6br\" (UniqueName: \"kubernetes.io/projected/458a45de-5936-4300-8c92-5f54a3c6188a-kube-api-access-vc6br\") pod \"redhat-operators-8hwdb\" (UID: \"458a45de-5936-4300-8c92-5f54a3c6188a\") " pod="openshift-marketplace/redhat-operators-8hwdb" Mar 17 00:35:10 crc kubenswrapper[4755]: I0317 00:35:10.408984 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/458a45de-5936-4300-8c92-5f54a3c6188a-catalog-content\") pod \"redhat-operators-8hwdb\" (UID: \"458a45de-5936-4300-8c92-5f54a3c6188a\") " pod="openshift-marketplace/redhat-operators-8hwdb" Mar 17 00:35:10 crc kubenswrapper[4755]: I0317 00:35:10.409005 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/458a45de-5936-4300-8c92-5f54a3c6188a-utilities\") pod \"redhat-operators-8hwdb\" (UID: \"458a45de-5936-4300-8c92-5f54a3c6188a\") " pod="openshift-marketplace/redhat-operators-8hwdb" Mar 17 00:35:10 crc kubenswrapper[4755]: I0317 00:35:10.510378 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc6br\" (UniqueName: \"kubernetes.io/projected/458a45de-5936-4300-8c92-5f54a3c6188a-kube-api-access-vc6br\") pod \"redhat-operators-8hwdb\" (UID: \"458a45de-5936-4300-8c92-5f54a3c6188a\") " 
pod="openshift-marketplace/redhat-operators-8hwdb" Mar 17 00:35:10 crc kubenswrapper[4755]: I0317 00:35:10.510429 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/458a45de-5936-4300-8c92-5f54a3c6188a-catalog-content\") pod \"redhat-operators-8hwdb\" (UID: \"458a45de-5936-4300-8c92-5f54a3c6188a\") " pod="openshift-marketplace/redhat-operators-8hwdb" Mar 17 00:35:10 crc kubenswrapper[4755]: I0317 00:35:10.510468 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/458a45de-5936-4300-8c92-5f54a3c6188a-utilities\") pod \"redhat-operators-8hwdb\" (UID: \"458a45de-5936-4300-8c92-5f54a3c6188a\") " pod="openshift-marketplace/redhat-operators-8hwdb" Mar 17 00:35:10 crc kubenswrapper[4755]: I0317 00:35:10.510878 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/458a45de-5936-4300-8c92-5f54a3c6188a-utilities\") pod \"redhat-operators-8hwdb\" (UID: \"458a45de-5936-4300-8c92-5f54a3c6188a\") " pod="openshift-marketplace/redhat-operators-8hwdb" Mar 17 00:35:10 crc kubenswrapper[4755]: I0317 00:35:10.510977 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/458a45de-5936-4300-8c92-5f54a3c6188a-catalog-content\") pod \"redhat-operators-8hwdb\" (UID: \"458a45de-5936-4300-8c92-5f54a3c6188a\") " pod="openshift-marketplace/redhat-operators-8hwdb" Mar 17 00:35:10 crc kubenswrapper[4755]: I0317 00:35:10.544246 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc6br\" (UniqueName: \"kubernetes.io/projected/458a45de-5936-4300-8c92-5f54a3c6188a-kube-api-access-vc6br\") pod \"redhat-operators-8hwdb\" (UID: \"458a45de-5936-4300-8c92-5f54a3c6188a\") " pod="openshift-marketplace/redhat-operators-8hwdb" Mar 17 00:35:10 crc 
kubenswrapper[4755]: I0317 00:35:10.567356 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hwdb" Mar 17 00:35:11 crc kubenswrapper[4755]: I0317 00:35:11.086296 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hwdb"] Mar 17 00:35:11 crc kubenswrapper[4755]: I0317 00:35:11.629932 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hwdb" event={"ID":"458a45de-5936-4300-8c92-5f54a3c6188a","Type":"ContainerStarted","Data":"8ba9dfacdc6aaafe2ad0b5d05603fa25d6002d638531d1f3cc4135273c1be2c8"} Mar 17 00:35:13 crc kubenswrapper[4755]: I0317 00:35:13.651194 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-vtlkz" event={"ID":"6725f6d6-96db-4e53-b4b7-b14e32c3160d","Type":"ContainerStarted","Data":"aa01df466f86adf6b4f74736254a3302794534b00fdb4e1de4e8cfe8ecfdfa4c"} Mar 17 00:35:13 crc kubenswrapper[4755]: I0317 00:35:13.651476 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-vtlkz" Mar 17 00:35:13 crc kubenswrapper[4755]: I0317 00:35:13.654429 4755 generic.go:334] "Generic (PLEG): container finished" podID="458a45de-5936-4300-8c92-5f54a3c6188a" containerID="409381ddbbd6181b911f7eb7f85a7680c899b93806ed8292a9abfb4ec69d2193" exitCode=0 Mar 17 00:35:13 crc kubenswrapper[4755]: I0317 00:35:13.654636 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hwdb" event={"ID":"458a45de-5936-4300-8c92-5f54a3c6188a","Type":"ContainerDied","Data":"409381ddbbd6181b911f7eb7f85a7680c899b93806ed8292a9abfb4ec69d2193"} Mar 17 00:35:13 crc kubenswrapper[4755]: I0317 00:35:13.659467 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-676tx" 
event={"ID":"2d8af759-0406-4291-b488-291e4db0f5ff","Type":"ContainerStarted","Data":"f698a880eb0178d91c7edfb1e3b16262e7c43c71a0637f189a66ccb29a85ff2a"} Mar 17 00:35:13 crc kubenswrapper[4755]: I0317 00:35:13.691563 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-vtlkz" podStartSLOduration=2.394960744 podStartE2EDuration="6.691545957s" podCreationTimestamp="2026-03-17 00:35:07 +0000 UTC" firstStartedPulling="2026-03-17 00:35:08.640084598 +0000 UTC m=+783.399536881" lastFinishedPulling="2026-03-17 00:35:12.936669811 +0000 UTC m=+787.696122094" observedRunningTime="2026-03-17 00:35:13.671660239 +0000 UTC m=+788.431112522" watchObservedRunningTime="2026-03-17 00:35:13.691545957 +0000 UTC m=+788.450998240" Mar 17 00:35:13 crc kubenswrapper[4755]: I0317 00:35:13.723760 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-676tx" podStartSLOduration=2.4952071670000002 podStartE2EDuration="6.723720962s" podCreationTimestamp="2026-03-17 00:35:07 +0000 UTC" firstStartedPulling="2026-03-17 00:35:08.711507096 +0000 UTC m=+783.470959379" lastFinishedPulling="2026-03-17 00:35:12.940020891 +0000 UTC m=+787.699473174" observedRunningTime="2026-03-17 00:35:13.718255427 +0000 UTC m=+788.477707710" watchObservedRunningTime="2026-03-17 00:35:13.723720962 +0000 UTC m=+788.483173255" Mar 17 00:35:15 crc kubenswrapper[4755]: I0317 00:35:15.675410 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-7fzpr" event={"ID":"1d430575-aa06-4c37-8262-d01a1d1766b7","Type":"ContainerStarted","Data":"3a73aac64b2964944f7398c3b7e995cfa34cf3ffea2d275e19f707d21ae04f64"} Mar 17 00:35:15 crc kubenswrapper[4755]: I0317 00:35:15.677560 4755 generic.go:334] "Generic (PLEG): container finished" podID="458a45de-5936-4300-8c92-5f54a3c6188a" containerID="1b166249258fdf0c364d31d5d5e90d12435a0083f1e2f2f492a292ebb27274ea" 
exitCode=0 Mar 17 00:35:15 crc kubenswrapper[4755]: I0317 00:35:15.677591 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hwdb" event={"ID":"458a45de-5936-4300-8c92-5f54a3c6188a","Type":"ContainerDied","Data":"1b166249258fdf0c364d31d5d5e90d12435a0083f1e2f2f492a292ebb27274ea"} Mar 17 00:35:15 crc kubenswrapper[4755]: I0317 00:35:15.698182 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-7fzpr" podStartSLOduration=2.894872505 podStartE2EDuration="8.698151179s" podCreationTimestamp="2026-03-17 00:35:07 +0000 UTC" firstStartedPulling="2026-03-17 00:35:08.735951795 +0000 UTC m=+783.495404078" lastFinishedPulling="2026-03-17 00:35:14.539230459 +0000 UTC m=+789.298682752" observedRunningTime="2026-03-17 00:35:15.693355012 +0000 UTC m=+790.452807295" watchObservedRunningTime="2026-03-17 00:35:15.698151179 +0000 UTC m=+790.457603472" Mar 17 00:35:16 crc kubenswrapper[4755]: I0317 00:35:16.688967 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hwdb" event={"ID":"458a45de-5936-4300-8c92-5f54a3c6188a","Type":"ContainerStarted","Data":"579c9f3098c953c86f2d77348fb57cff065660fc0e291980c06a19bb53f4cde5"} Mar 17 00:35:16 crc kubenswrapper[4755]: I0317 00:35:16.713432 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8hwdb" podStartSLOduration=4.011843917 podStartE2EDuration="6.713417174s" podCreationTimestamp="2026-03-17 00:35:10 +0000 UTC" firstStartedPulling="2026-03-17 00:35:13.658406797 +0000 UTC m=+788.417859080" lastFinishedPulling="2026-03-17 00:35:16.359980044 +0000 UTC m=+791.119432337" observedRunningTime="2026-03-17 00:35:16.712822678 +0000 UTC m=+791.472274971" watchObservedRunningTime="2026-03-17 00:35:16.713417174 +0000 UTC m=+791.472869457" Mar 17 00:35:18 crc kubenswrapper[4755]: I0317 00:35:18.268869 4755 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-vtlkz" Mar 17 00:35:20 crc kubenswrapper[4755]: I0317 00:35:20.567952 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8hwdb" Mar 17 00:35:20 crc kubenswrapper[4755]: I0317 00:35:20.569683 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8hwdb" Mar 17 00:35:21 crc kubenswrapper[4755]: I0317 00:35:21.635336 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8hwdb" podUID="458a45de-5936-4300-8c92-5f54a3c6188a" containerName="registry-server" probeResult="failure" output=< Mar 17 00:35:21 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 00:35:21 crc kubenswrapper[4755]: > Mar 17 00:35:30 crc kubenswrapper[4755]: I0317 00:35:30.641782 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8hwdb" Mar 17 00:35:30 crc kubenswrapper[4755]: I0317 00:35:30.713818 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8hwdb" Mar 17 00:35:30 crc kubenswrapper[4755]: I0317 00:35:30.886259 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8hwdb"] Mar 17 00:35:31 crc kubenswrapper[4755]: I0317 00:35:31.788239 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8hwdb" podUID="458a45de-5936-4300-8c92-5f54a3c6188a" containerName="registry-server" containerID="cri-o://579c9f3098c953c86f2d77348fb57cff065660fc0e291980c06a19bb53f4cde5" gracePeriod=2 Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.212291 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8hwdb" Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.399688 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/458a45de-5936-4300-8c92-5f54a3c6188a-utilities\") pod \"458a45de-5936-4300-8c92-5f54a3c6188a\" (UID: \"458a45de-5936-4300-8c92-5f54a3c6188a\") " Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.399893 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc6br\" (UniqueName: \"kubernetes.io/projected/458a45de-5936-4300-8c92-5f54a3c6188a-kube-api-access-vc6br\") pod \"458a45de-5936-4300-8c92-5f54a3c6188a\" (UID: \"458a45de-5936-4300-8c92-5f54a3c6188a\") " Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.400077 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/458a45de-5936-4300-8c92-5f54a3c6188a-catalog-content\") pod \"458a45de-5936-4300-8c92-5f54a3c6188a\" (UID: \"458a45de-5936-4300-8c92-5f54a3c6188a\") " Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.401225 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/458a45de-5936-4300-8c92-5f54a3c6188a-utilities" (OuterVolumeSpecName: "utilities") pod "458a45de-5936-4300-8c92-5f54a3c6188a" (UID: "458a45de-5936-4300-8c92-5f54a3c6188a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.408616 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/458a45de-5936-4300-8c92-5f54a3c6188a-kube-api-access-vc6br" (OuterVolumeSpecName: "kube-api-access-vc6br") pod "458a45de-5936-4300-8c92-5f54a3c6188a" (UID: "458a45de-5936-4300-8c92-5f54a3c6188a"). InnerVolumeSpecName "kube-api-access-vc6br". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.501616 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/458a45de-5936-4300-8c92-5f54a3c6188a-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.501662 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc6br\" (UniqueName: \"kubernetes.io/projected/458a45de-5936-4300-8c92-5f54a3c6188a-kube-api-access-vc6br\") on node \"crc\" DevicePath \"\"" Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.580312 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/458a45de-5936-4300-8c92-5f54a3c6188a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "458a45de-5936-4300-8c92-5f54a3c6188a" (UID: "458a45de-5936-4300-8c92-5f54a3c6188a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.603291 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/458a45de-5936-4300-8c92-5f54a3c6188a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.800842 4755 generic.go:334] "Generic (PLEG): container finished" podID="458a45de-5936-4300-8c92-5f54a3c6188a" containerID="579c9f3098c953c86f2d77348fb57cff065660fc0e291980c06a19bb53f4cde5" exitCode=0 Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.800893 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hwdb" event={"ID":"458a45de-5936-4300-8c92-5f54a3c6188a","Type":"ContainerDied","Data":"579c9f3098c953c86f2d77348fb57cff065660fc0e291980c06a19bb53f4cde5"} Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.800938 4755 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-8hwdb" event={"ID":"458a45de-5936-4300-8c92-5f54a3c6188a","Type":"ContainerDied","Data":"8ba9dfacdc6aaafe2ad0b5d05603fa25d6002d638531d1f3cc4135273c1be2c8"} Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.800960 4755 scope.go:117] "RemoveContainer" containerID="579c9f3098c953c86f2d77348fb57cff065660fc0e291980c06a19bb53f4cde5" Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.800988 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hwdb" Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.828066 4755 scope.go:117] "RemoveContainer" containerID="1b166249258fdf0c364d31d5d5e90d12435a0083f1e2f2f492a292ebb27274ea" Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.849672 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8hwdb"] Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.873495 4755 scope.go:117] "RemoveContainer" containerID="409381ddbbd6181b911f7eb7f85a7680c899b93806ed8292a9abfb4ec69d2193" Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.879173 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8hwdb"] Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.895778 4755 scope.go:117] "RemoveContainer" containerID="579c9f3098c953c86f2d77348fb57cff065660fc0e291980c06a19bb53f4cde5" Mar 17 00:35:32 crc kubenswrapper[4755]: E0317 00:35:32.896456 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"579c9f3098c953c86f2d77348fb57cff065660fc0e291980c06a19bb53f4cde5\": container with ID starting with 579c9f3098c953c86f2d77348fb57cff065660fc0e291980c06a19bb53f4cde5 not found: ID does not exist" containerID="579c9f3098c953c86f2d77348fb57cff065660fc0e291980c06a19bb53f4cde5" Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.896514 4755 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"579c9f3098c953c86f2d77348fb57cff065660fc0e291980c06a19bb53f4cde5"} err="failed to get container status \"579c9f3098c953c86f2d77348fb57cff065660fc0e291980c06a19bb53f4cde5\": rpc error: code = NotFound desc = could not find container \"579c9f3098c953c86f2d77348fb57cff065660fc0e291980c06a19bb53f4cde5\": container with ID starting with 579c9f3098c953c86f2d77348fb57cff065660fc0e291980c06a19bb53f4cde5 not found: ID does not exist" Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.896553 4755 scope.go:117] "RemoveContainer" containerID="1b166249258fdf0c364d31d5d5e90d12435a0083f1e2f2f492a292ebb27274ea" Mar 17 00:35:32 crc kubenswrapper[4755]: E0317 00:35:32.897057 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b166249258fdf0c364d31d5d5e90d12435a0083f1e2f2f492a292ebb27274ea\": container with ID starting with 1b166249258fdf0c364d31d5d5e90d12435a0083f1e2f2f492a292ebb27274ea not found: ID does not exist" containerID="1b166249258fdf0c364d31d5d5e90d12435a0083f1e2f2f492a292ebb27274ea" Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.897106 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b166249258fdf0c364d31d5d5e90d12435a0083f1e2f2f492a292ebb27274ea"} err="failed to get container status \"1b166249258fdf0c364d31d5d5e90d12435a0083f1e2f2f492a292ebb27274ea\": rpc error: code = NotFound desc = could not find container \"1b166249258fdf0c364d31d5d5e90d12435a0083f1e2f2f492a292ebb27274ea\": container with ID starting with 1b166249258fdf0c364d31d5d5e90d12435a0083f1e2f2f492a292ebb27274ea not found: ID does not exist" Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.897138 4755 scope.go:117] "RemoveContainer" containerID="409381ddbbd6181b911f7eb7f85a7680c899b93806ed8292a9abfb4ec69d2193" Mar 17 00:35:32 crc kubenswrapper[4755]: E0317 
00:35:32.897844 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"409381ddbbd6181b911f7eb7f85a7680c899b93806ed8292a9abfb4ec69d2193\": container with ID starting with 409381ddbbd6181b911f7eb7f85a7680c899b93806ed8292a9abfb4ec69d2193 not found: ID does not exist" containerID="409381ddbbd6181b911f7eb7f85a7680c899b93806ed8292a9abfb4ec69d2193" Mar 17 00:35:32 crc kubenswrapper[4755]: I0317 00:35:32.897915 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409381ddbbd6181b911f7eb7f85a7680c899b93806ed8292a9abfb4ec69d2193"} err="failed to get container status \"409381ddbbd6181b911f7eb7f85a7680c899b93806ed8292a9abfb4ec69d2193\": rpc error: code = NotFound desc = could not find container \"409381ddbbd6181b911f7eb7f85a7680c899b93806ed8292a9abfb4ec69d2193\": container with ID starting with 409381ddbbd6181b911f7eb7f85a7680c899b93806ed8292a9abfb4ec69d2193 not found: ID does not exist" Mar 17 00:35:34 crc kubenswrapper[4755]: I0317 00:35:34.262832 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="458a45de-5936-4300-8c92-5f54a3c6188a" path="/var/lib/kubelet/pods/458a45de-5936-4300-8c92-5f54a3c6188a/volumes" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.123928 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt"] Mar 17 00:35:43 crc kubenswrapper[4755]: E0317 00:35:43.124840 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="458a45de-5936-4300-8c92-5f54a3c6188a" containerName="registry-server" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.124859 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="458a45de-5936-4300-8c92-5f54a3c6188a" containerName="registry-server" Mar 17 00:35:43 crc kubenswrapper[4755]: E0317 00:35:43.124873 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="458a45de-5936-4300-8c92-5f54a3c6188a" containerName="extract-content" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.124884 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="458a45de-5936-4300-8c92-5f54a3c6188a" containerName="extract-content" Mar 17 00:35:43 crc kubenswrapper[4755]: E0317 00:35:43.124912 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="458a45de-5936-4300-8c92-5f54a3c6188a" containerName="extract-utilities" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.124922 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="458a45de-5936-4300-8c92-5f54a3c6188a" containerName="extract-utilities" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.125071 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="458a45de-5936-4300-8c92-5f54a3c6188a" containerName="registry-server" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.126101 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.128777 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.132824 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt"] Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.248009 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwrsf\" (UniqueName: \"kubernetes.io/projected/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-kube-api-access-hwrsf\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt\" (UID: \"fb6ad00e-5b17-4cb6-898f-278fc16e8f31\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" 
Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.248105 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt\" (UID: \"fb6ad00e-5b17-4cb6-898f-278fc16e8f31\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.248146 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt\" (UID: \"fb6ad00e-5b17-4cb6-898f-278fc16e8f31\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.349592 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwrsf\" (UniqueName: \"kubernetes.io/projected/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-kube-api-access-hwrsf\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt\" (UID: \"fb6ad00e-5b17-4cb6-898f-278fc16e8f31\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.350314 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt\" (UID: \"fb6ad00e-5b17-4cb6-898f-278fc16e8f31\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.351993 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt\" (UID: \"fb6ad00e-5b17-4cb6-898f-278fc16e8f31\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.350903 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt\" (UID: \"fb6ad00e-5b17-4cb6-898f-278fc16e8f31\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.352409 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt\" (UID: \"fb6ad00e-5b17-4cb6-898f-278fc16e8f31\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.374010 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwrsf\" (UniqueName: \"kubernetes.io/projected/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-kube-api-access-hwrsf\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt\" (UID: \"fb6ad00e-5b17-4cb6-898f-278fc16e8f31\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.444648 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.509002 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd"] Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.510371 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.525048 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd"] Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.655115 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44b52216-5e23-4dd1-8a3a-32973449c58c-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd\" (UID: \"44b52216-5e23-4dd1-8a3a-32973449c58c\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.655541 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44b52216-5e23-4dd1-8a3a-32973449c58c-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd\" (UID: \"44b52216-5e23-4dd1-8a3a-32973449c58c\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.655573 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvhv8\" (UniqueName: 
\"kubernetes.io/projected/44b52216-5e23-4dd1-8a3a-32973449c58c-kube-api-access-xvhv8\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd\" (UID: \"44b52216-5e23-4dd1-8a3a-32973449c58c\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.718226 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt"] Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.756720 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44b52216-5e23-4dd1-8a3a-32973449c58c-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd\" (UID: \"44b52216-5e23-4dd1-8a3a-32973449c58c\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.756779 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44b52216-5e23-4dd1-8a3a-32973449c58c-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd\" (UID: \"44b52216-5e23-4dd1-8a3a-32973449c58c\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.756814 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvhv8\" (UniqueName: \"kubernetes.io/projected/44b52216-5e23-4dd1-8a3a-32973449c58c-kube-api-access-xvhv8\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd\" (UID: \"44b52216-5e23-4dd1-8a3a-32973449c58c\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.757296 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44b52216-5e23-4dd1-8a3a-32973449c58c-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd\" (UID: \"44b52216-5e23-4dd1-8a3a-32973449c58c\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.757332 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44b52216-5e23-4dd1-8a3a-32973449c58c-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd\" (UID: \"44b52216-5e23-4dd1-8a3a-32973449c58c\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.775249 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvhv8\" (UniqueName: \"kubernetes.io/projected/44b52216-5e23-4dd1-8a3a-32973449c58c-kube-api-access-xvhv8\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd\" (UID: \"44b52216-5e23-4dd1-8a3a-32973449c58c\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.823765 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.888189 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" event={"ID":"fb6ad00e-5b17-4cb6-898f-278fc16e8f31","Type":"ContainerStarted","Data":"837374e0800fbfeabc43d0f994d603b2967e488fd25e35cf53022c246c2a247e"} Mar 17 00:35:43 crc kubenswrapper[4755]: I0317 00:35:43.888233 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" event={"ID":"fb6ad00e-5b17-4cb6-898f-278fc16e8f31","Type":"ContainerStarted","Data":"16566f3a9187c7942a9bbf5999b7b7bc4400ba7b7bd14b32c893c505b733c94c"} Mar 17 00:35:44 crc kubenswrapper[4755]: I0317 00:35:44.232537 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd"] Mar 17 00:35:44 crc kubenswrapper[4755]: W0317 00:35:44.237599 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44b52216_5e23_4dd1_8a3a_32973449c58c.slice/crio-dbbb40c7bdcd75cbf80ac02a7e94d87ea986a42408089ebaff6b2a52f7a61d04 WatchSource:0}: Error finding container dbbb40c7bdcd75cbf80ac02a7e94d87ea986a42408089ebaff6b2a52f7a61d04: Status 404 returned error can't find the container with id dbbb40c7bdcd75cbf80ac02a7e94d87ea986a42408089ebaff6b2a52f7a61d04 Mar 17 00:35:44 crc kubenswrapper[4755]: I0317 00:35:44.896404 4755 generic.go:334] "Generic (PLEG): container finished" podID="fb6ad00e-5b17-4cb6-898f-278fc16e8f31" containerID="837374e0800fbfeabc43d0f994d603b2967e488fd25e35cf53022c246c2a247e" exitCode=0 Mar 17 00:35:44 crc kubenswrapper[4755]: I0317 00:35:44.896494 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" event={"ID":"fb6ad00e-5b17-4cb6-898f-278fc16e8f31","Type":"ContainerDied","Data":"837374e0800fbfeabc43d0f994d603b2967e488fd25e35cf53022c246c2a247e"} Mar 17 00:35:44 crc kubenswrapper[4755]: I0317 00:35:44.900577 4755 generic.go:334] "Generic (PLEG): container finished" podID="44b52216-5e23-4dd1-8a3a-32973449c58c" containerID="4f73501533052c64d6e88a1d96a9b2959c7ade766d9ce299ff23b561912f4225" exitCode=0 Mar 17 00:35:44 crc kubenswrapper[4755]: I0317 00:35:44.900626 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" event={"ID":"44b52216-5e23-4dd1-8a3a-32973449c58c","Type":"ContainerDied","Data":"4f73501533052c64d6e88a1d96a9b2959c7ade766d9ce299ff23b561912f4225"} Mar 17 00:35:44 crc kubenswrapper[4755]: I0317 00:35:44.900659 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" event={"ID":"44b52216-5e23-4dd1-8a3a-32973449c58c","Type":"ContainerStarted","Data":"dbbb40c7bdcd75cbf80ac02a7e94d87ea986a42408089ebaff6b2a52f7a61d04"} Mar 17 00:35:46 crc kubenswrapper[4755]: I0317 00:35:46.915540 4755 generic.go:334] "Generic (PLEG): container finished" podID="fb6ad00e-5b17-4cb6-898f-278fc16e8f31" containerID="7498c5109a697f42fadbab50d68aba94e11e3735024fa9db68ab30e904cc31c2" exitCode=0 Mar 17 00:35:46 crc kubenswrapper[4755]: I0317 00:35:46.915768 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" event={"ID":"fb6ad00e-5b17-4cb6-898f-278fc16e8f31","Type":"ContainerDied","Data":"7498c5109a697f42fadbab50d68aba94e11e3735024fa9db68ab30e904cc31c2"} Mar 17 00:35:46 crc kubenswrapper[4755]: I0317 00:35:46.921742 4755 generic.go:334] "Generic (PLEG): container finished" podID="44b52216-5e23-4dd1-8a3a-32973449c58c" 
containerID="bd6980f8417e9cb251b73a36e4290281a7301bbb9b0ca8659fedd7ec40590e57" exitCode=0 Mar 17 00:35:46 crc kubenswrapper[4755]: I0317 00:35:46.922002 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" event={"ID":"44b52216-5e23-4dd1-8a3a-32973449c58c","Type":"ContainerDied","Data":"bd6980f8417e9cb251b73a36e4290281a7301bbb9b0ca8659fedd7ec40590e57"} Mar 17 00:35:47 crc kubenswrapper[4755]: I0317 00:35:47.935641 4755 generic.go:334] "Generic (PLEG): container finished" podID="fb6ad00e-5b17-4cb6-898f-278fc16e8f31" containerID="14c44a93c63bd83c1869655d2b6fce039f0f4a1287893f22c4fb453d582ed0ea" exitCode=0 Mar 17 00:35:47 crc kubenswrapper[4755]: I0317 00:35:47.935688 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" event={"ID":"fb6ad00e-5b17-4cb6-898f-278fc16e8f31","Type":"ContainerDied","Data":"14c44a93c63bd83c1869655d2b6fce039f0f4a1287893f22c4fb453d582ed0ea"} Mar 17 00:35:47 crc kubenswrapper[4755]: I0317 00:35:47.939315 4755 generic.go:334] "Generic (PLEG): container finished" podID="44b52216-5e23-4dd1-8a3a-32973449c58c" containerID="9b5a3beca2e2768679ddeff992e34c364f7075774691a8f1b246282e67a20b98" exitCode=0 Mar 17 00:35:47 crc kubenswrapper[4755]: I0317 00:35:47.939391 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" event={"ID":"44b52216-5e23-4dd1-8a3a-32973449c58c","Type":"ContainerDied","Data":"9b5a3beca2e2768679ddeff992e34c364f7075774691a8f1b246282e67a20b98"} Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.268625 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.273041 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.449456 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-util\") pod \"fb6ad00e-5b17-4cb6-898f-278fc16e8f31\" (UID: \"fb6ad00e-5b17-4cb6-898f-278fc16e8f31\") " Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.449522 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwrsf\" (UniqueName: \"kubernetes.io/projected/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-kube-api-access-hwrsf\") pod \"fb6ad00e-5b17-4cb6-898f-278fc16e8f31\" (UID: \"fb6ad00e-5b17-4cb6-898f-278fc16e8f31\") " Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.449619 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44b52216-5e23-4dd1-8a3a-32973449c58c-util\") pod \"44b52216-5e23-4dd1-8a3a-32973449c58c\" (UID: \"44b52216-5e23-4dd1-8a3a-32973449c58c\") " Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.449647 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44b52216-5e23-4dd1-8a3a-32973449c58c-bundle\") pod \"44b52216-5e23-4dd1-8a3a-32973449c58c\" (UID: \"44b52216-5e23-4dd1-8a3a-32973449c58c\") " Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.449711 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvhv8\" (UniqueName: 
\"kubernetes.io/projected/44b52216-5e23-4dd1-8a3a-32973449c58c-kube-api-access-xvhv8\") pod \"44b52216-5e23-4dd1-8a3a-32973449c58c\" (UID: \"44b52216-5e23-4dd1-8a3a-32973449c58c\") " Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.449755 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-bundle\") pod \"fb6ad00e-5b17-4cb6-898f-278fc16e8f31\" (UID: \"fb6ad00e-5b17-4cb6-898f-278fc16e8f31\") " Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.450925 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-bundle" (OuterVolumeSpecName: "bundle") pod "fb6ad00e-5b17-4cb6-898f-278fc16e8f31" (UID: "fb6ad00e-5b17-4cb6-898f-278fc16e8f31"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.451794 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44b52216-5e23-4dd1-8a3a-32973449c58c-bundle" (OuterVolumeSpecName: "bundle") pod "44b52216-5e23-4dd1-8a3a-32973449c58c" (UID: "44b52216-5e23-4dd1-8a3a-32973449c58c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.457143 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-kube-api-access-hwrsf" (OuterVolumeSpecName: "kube-api-access-hwrsf") pod "fb6ad00e-5b17-4cb6-898f-278fc16e8f31" (UID: "fb6ad00e-5b17-4cb6-898f-278fc16e8f31"). InnerVolumeSpecName "kube-api-access-hwrsf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.463844 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44b52216-5e23-4dd1-8a3a-32973449c58c-kube-api-access-xvhv8" (OuterVolumeSpecName: "kube-api-access-xvhv8") pod "44b52216-5e23-4dd1-8a3a-32973449c58c" (UID: "44b52216-5e23-4dd1-8a3a-32973449c58c"). InnerVolumeSpecName "kube-api-access-xvhv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.472920 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44b52216-5e23-4dd1-8a3a-32973449c58c-util" (OuterVolumeSpecName: "util") pod "44b52216-5e23-4dd1-8a3a-32973449c58c" (UID: "44b52216-5e23-4dd1-8a3a-32973449c58c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.529166 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-util" (OuterVolumeSpecName: "util") pod "fb6ad00e-5b17-4cb6-898f-278fc16e8f31" (UID: "fb6ad00e-5b17-4cb6-898f-278fc16e8f31"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.551142 4755 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44b52216-5e23-4dd1-8a3a-32973449c58c-util\") on node \"crc\" DevicePath \"\"" Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.551184 4755 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44b52216-5e23-4dd1-8a3a-32973449c58c-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.551197 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvhv8\" (UniqueName: \"kubernetes.io/projected/44b52216-5e23-4dd1-8a3a-32973449c58c-kube-api-access-xvhv8\") on node \"crc\" DevicePath \"\"" Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.551209 4755 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.551221 4755 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-util\") on node \"crc\" DevicePath \"\"" Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.551232 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwrsf\" (UniqueName: \"kubernetes.io/projected/fb6ad00e-5b17-4cb6-898f-278fc16e8f31-kube-api-access-hwrsf\") on node \"crc\" DevicePath \"\"" Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.954192 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" event={"ID":"fb6ad00e-5b17-4cb6-898f-278fc16e8f31","Type":"ContainerDied","Data":"16566f3a9187c7942a9bbf5999b7b7bc4400ba7b7bd14b32c893c505b733c94c"} Mar 17 00:35:49 crc 
kubenswrapper[4755]: I0317 00:35:49.954237 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16566f3a9187c7942a9bbf5999b7b7bc4400ba7b7bd14b32c893c505b733c94c" Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.954265 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt" Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.956844 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" event={"ID":"44b52216-5e23-4dd1-8a3a-32973449c58c","Type":"ContainerDied","Data":"dbbb40c7bdcd75cbf80ac02a7e94d87ea986a42408089ebaff6b2a52f7a61d04"} Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.956872 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbbb40c7bdcd75cbf80ac02a7e94d87ea986a42408089ebaff6b2a52f7a61d04" Mar 17 00:35:49 crc kubenswrapper[4755]: I0317 00:35:49.956941 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd" Mar 17 00:35:58 crc kubenswrapper[4755]: I0317 00:35:58.665330 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:35:58 crc kubenswrapper[4755]: I0317 00:35:58.665785 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.159587 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w"] Mar 17 00:35:59 crc kubenswrapper[4755]: E0317 00:35:59.159838 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b52216-5e23-4dd1-8a3a-32973449c58c" containerName="util" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.159856 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b52216-5e23-4dd1-8a3a-32973449c58c" containerName="util" Mar 17 00:35:59 crc kubenswrapper[4755]: E0317 00:35:59.159865 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b52216-5e23-4dd1-8a3a-32973449c58c" containerName="pull" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.159870 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b52216-5e23-4dd1-8a3a-32973449c58c" containerName="pull" Mar 17 00:35:59 crc kubenswrapper[4755]: E0317 00:35:59.159885 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b52216-5e23-4dd1-8a3a-32973449c58c" 
containerName="extract" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.159890 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b52216-5e23-4dd1-8a3a-32973449c58c" containerName="extract" Mar 17 00:35:59 crc kubenswrapper[4755]: E0317 00:35:59.159900 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb6ad00e-5b17-4cb6-898f-278fc16e8f31" containerName="pull" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.159905 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6ad00e-5b17-4cb6-898f-278fc16e8f31" containerName="pull" Mar 17 00:35:59 crc kubenswrapper[4755]: E0317 00:35:59.159918 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb6ad00e-5b17-4cb6-898f-278fc16e8f31" containerName="util" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.159924 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6ad00e-5b17-4cb6-898f-278fc16e8f31" containerName="util" Mar 17 00:35:59 crc kubenswrapper[4755]: E0317 00:35:59.159932 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb6ad00e-5b17-4cb6-898f-278fc16e8f31" containerName="extract" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.159937 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6ad00e-5b17-4cb6-898f-278fc16e8f31" containerName="extract" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.160034 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb6ad00e-5b17-4cb6-898f-278fc16e8f31" containerName="extract" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.160048 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="44b52216-5e23-4dd1-8a3a-32973449c58c" containerName="extract" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.160595 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.166276 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.166338 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.166392 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-2fmc7" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.166409 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.166597 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.166864 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.176705 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w"] Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.239675 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7c87b9bff5-zjj4w\" (UID: \"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:35:59 crc 
kubenswrapper[4755]: I0317 00:35:59.239770 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a-manager-config\") pod \"loki-operator-controller-manager-7c87b9bff5-zjj4w\" (UID: \"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.239819 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwhm2\" (UniqueName: \"kubernetes.io/projected/9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a-kube-api-access-xwhm2\") pod \"loki-operator-controller-manager-7c87b9bff5-zjj4w\" (UID: \"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.239877 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a-apiservice-cert\") pod \"loki-operator-controller-manager-7c87b9bff5-zjj4w\" (UID: \"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.239893 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a-webhook-cert\") pod \"loki-operator-controller-manager-7c87b9bff5-zjj4w\" (UID: \"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.340917 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"manager-config\" (UniqueName: \"kubernetes.io/configmap/9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a-manager-config\") pod \"loki-operator-controller-manager-7c87b9bff5-zjj4w\" (UID: \"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.340964 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwhm2\" (UniqueName: \"kubernetes.io/projected/9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a-kube-api-access-xwhm2\") pod \"loki-operator-controller-manager-7c87b9bff5-zjj4w\" (UID: \"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.340994 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a-apiservice-cert\") pod \"loki-operator-controller-manager-7c87b9bff5-zjj4w\" (UID: \"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.341010 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a-webhook-cert\") pod \"loki-operator-controller-manager-7c87b9bff5-zjj4w\" (UID: \"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.341078 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7c87b9bff5-zjj4w\" (UID: 
\"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.342688 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a-manager-config\") pod \"loki-operator-controller-manager-7c87b9bff5-zjj4w\" (UID: \"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.350671 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a-apiservice-cert\") pod \"loki-operator-controller-manager-7c87b9bff5-zjj4w\" (UID: \"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.350737 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7c87b9bff5-zjj4w\" (UID: \"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.351252 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a-webhook-cert\") pod \"loki-operator-controller-manager-7c87b9bff5-zjj4w\" (UID: \"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.369376 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xwhm2\" (UniqueName: \"kubernetes.io/projected/9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a-kube-api-access-xwhm2\") pod \"loki-operator-controller-manager-7c87b9bff5-zjj4w\" (UID: \"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.476544 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:35:59 crc kubenswrapper[4755]: I0317 00:35:59.686723 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w"] Mar 17 00:36:00 crc kubenswrapper[4755]: I0317 00:36:00.034760 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" event={"ID":"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a","Type":"ContainerStarted","Data":"4e7d4f144ba9946057af86e15dc2532f66261c0cf12a4c529af388d8d048e5d2"} Mar 17 00:36:00 crc kubenswrapper[4755]: I0317 00:36:00.127886 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561796-jpjbh"] Mar 17 00:36:00 crc kubenswrapper[4755]: I0317 00:36:00.129357 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561796-jpjbh" Mar 17 00:36:00 crc kubenswrapper[4755]: I0317 00:36:00.132700 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 00:36:00 crc kubenswrapper[4755]: I0317 00:36:00.132754 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 00:36:00 crc kubenswrapper[4755]: I0317 00:36:00.135998 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 00:36:00 crc kubenswrapper[4755]: I0317 00:36:00.140142 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561796-jpjbh"] Mar 17 00:36:00 crc kubenswrapper[4755]: I0317 00:36:00.251376 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flbvf\" (UniqueName: \"kubernetes.io/projected/31c22c06-348a-44e8-9c8b-0e995aa82739-kube-api-access-flbvf\") pod \"auto-csr-approver-29561796-jpjbh\" (UID: \"31c22c06-348a-44e8-9c8b-0e995aa82739\") " pod="openshift-infra/auto-csr-approver-29561796-jpjbh" Mar 17 00:36:00 crc kubenswrapper[4755]: I0317 00:36:00.353038 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flbvf\" (UniqueName: \"kubernetes.io/projected/31c22c06-348a-44e8-9c8b-0e995aa82739-kube-api-access-flbvf\") pod \"auto-csr-approver-29561796-jpjbh\" (UID: \"31c22c06-348a-44e8-9c8b-0e995aa82739\") " pod="openshift-infra/auto-csr-approver-29561796-jpjbh" Mar 17 00:36:00 crc kubenswrapper[4755]: I0317 00:36:00.373987 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flbvf\" (UniqueName: \"kubernetes.io/projected/31c22c06-348a-44e8-9c8b-0e995aa82739-kube-api-access-flbvf\") pod \"auto-csr-approver-29561796-jpjbh\" (UID: \"31c22c06-348a-44e8-9c8b-0e995aa82739\") " 
pod="openshift-infra/auto-csr-approver-29561796-jpjbh" Mar 17 00:36:00 crc kubenswrapper[4755]: I0317 00:36:00.466484 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561796-jpjbh" Mar 17 00:36:00 crc kubenswrapper[4755]: I0317 00:36:00.695729 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561796-jpjbh"] Mar 17 00:36:00 crc kubenswrapper[4755]: W0317 00:36:00.700702 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31c22c06_348a_44e8_9c8b_0e995aa82739.slice/crio-d0bed42c833121b642eb0fc40b00370dd52e253ef9ddbb6b097a85de89d4319a WatchSource:0}: Error finding container d0bed42c833121b642eb0fc40b00370dd52e253ef9ddbb6b097a85de89d4319a: Status 404 returned error can't find the container with id d0bed42c833121b642eb0fc40b00370dd52e253ef9ddbb6b097a85de89d4319a Mar 17 00:36:01 crc kubenswrapper[4755]: I0317 00:36:01.048221 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561796-jpjbh" event={"ID":"31c22c06-348a-44e8-9c8b-0e995aa82739","Type":"ContainerStarted","Data":"d0bed42c833121b642eb0fc40b00370dd52e253ef9ddbb6b097a85de89d4319a"} Mar 17 00:36:02 crc kubenswrapper[4755]: I0317 00:36:02.055034 4755 generic.go:334] "Generic (PLEG): container finished" podID="31c22c06-348a-44e8-9c8b-0e995aa82739" containerID="1005d371d254a562b58cfa0a91ef63c86197b144c9a8107e560d9391cbffdf93" exitCode=0 Mar 17 00:36:02 crc kubenswrapper[4755]: I0317 00:36:02.055292 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561796-jpjbh" event={"ID":"31c22c06-348a-44e8-9c8b-0e995aa82739","Type":"ContainerDied","Data":"1005d371d254a562b58cfa0a91ef63c86197b144c9a8107e560d9391cbffdf93"} Mar 17 00:36:03 crc kubenswrapper[4755]: I0317 00:36:03.629497 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561796-jpjbh" Mar 17 00:36:03 crc kubenswrapper[4755]: I0317 00:36:03.792654 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flbvf\" (UniqueName: \"kubernetes.io/projected/31c22c06-348a-44e8-9c8b-0e995aa82739-kube-api-access-flbvf\") pod \"31c22c06-348a-44e8-9c8b-0e995aa82739\" (UID: \"31c22c06-348a-44e8-9c8b-0e995aa82739\") " Mar 17 00:36:03 crc kubenswrapper[4755]: I0317 00:36:03.798229 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31c22c06-348a-44e8-9c8b-0e995aa82739-kube-api-access-flbvf" (OuterVolumeSpecName: "kube-api-access-flbvf") pod "31c22c06-348a-44e8-9c8b-0e995aa82739" (UID: "31c22c06-348a-44e8-9c8b-0e995aa82739"). InnerVolumeSpecName "kube-api-access-flbvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:36:03 crc kubenswrapper[4755]: I0317 00:36:03.893966 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flbvf\" (UniqueName: \"kubernetes.io/projected/31c22c06-348a-44e8-9c8b-0e995aa82739-kube-api-access-flbvf\") on node \"crc\" DevicePath \"\"" Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.031841 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-vb7pt"] Mar 17 00:36:04 crc kubenswrapper[4755]: E0317 00:36:04.032154 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31c22c06-348a-44e8-9c8b-0e995aa82739" containerName="oc" Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.032178 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="31c22c06-348a-44e8-9c8b-0e995aa82739" containerName="oc" Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.033189 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="31c22c06-348a-44e8-9c8b-0e995aa82739" containerName="oc" Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.033993 
4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-vb7pt" Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.036056 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.036697 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.037059 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-kgw4s" Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.046952 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-vb7pt"] Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.071570 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561796-jpjbh" event={"ID":"31c22c06-348a-44e8-9c8b-0e995aa82739","Type":"ContainerDied","Data":"d0bed42c833121b642eb0fc40b00370dd52e253ef9ddbb6b097a85de89d4319a"} Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.071618 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0bed42c833121b642eb0fc40b00370dd52e253ef9ddbb6b097a85de89d4319a" Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.071691 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561796-jpjbh" Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.095693 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f792c\" (UniqueName: \"kubernetes.io/projected/c40a3b37-e723-4031-be06-728785655b37-kube-api-access-f792c\") pod \"cluster-logging-operator-66689c4bbf-vb7pt\" (UID: \"c40a3b37-e723-4031-be06-728785655b37\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-vb7pt" Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.196513 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f792c\" (UniqueName: \"kubernetes.io/projected/c40a3b37-e723-4031-be06-728785655b37-kube-api-access-f792c\") pod \"cluster-logging-operator-66689c4bbf-vb7pt\" (UID: \"c40a3b37-e723-4031-be06-728785655b37\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-vb7pt" Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.235843 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f792c\" (UniqueName: \"kubernetes.io/projected/c40a3b37-e723-4031-be06-728785655b37-kube-api-access-f792c\") pod \"cluster-logging-operator-66689c4bbf-vb7pt\" (UID: \"c40a3b37-e723-4031-be06-728785655b37\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-vb7pt" Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.367918 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-vb7pt" Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.706623 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-vb7pt"] Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.726122 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561790-lhmb8"] Mar 17 00:36:04 crc kubenswrapper[4755]: I0317 00:36:04.731487 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561790-lhmb8"] Mar 17 00:36:04 crc kubenswrapper[4755]: W0317 00:36:04.742591 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc40a3b37_e723_4031_be06_728785655b37.slice/crio-614e6a5b40a79f7e75e8945415e5548b76f7dbdb024b3bcdbe3fff09136338db WatchSource:0}: Error finding container 614e6a5b40a79f7e75e8945415e5548b76f7dbdb024b3bcdbe3fff09136338db: Status 404 returned error can't find the container with id 614e6a5b40a79f7e75e8945415e5548b76f7dbdb024b3bcdbe3fff09136338db Mar 17 00:36:05 crc kubenswrapper[4755]: I0317 00:36:05.077461 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" event={"ID":"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a","Type":"ContainerStarted","Data":"2da05703be5d0eff75ee011712eb326c25f330eca419b750d179e0c899635d11"} Mar 17 00:36:05 crc kubenswrapper[4755]: I0317 00:36:05.078686 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-vb7pt" event={"ID":"c40a3b37-e723-4031-be06-728785655b37","Type":"ContainerStarted","Data":"614e6a5b40a79f7e75e8945415e5548b76f7dbdb024b3bcdbe3fff09136338db"} Mar 17 00:36:06 crc kubenswrapper[4755]: I0317 00:36:06.259596 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fadb568f-9e01-4202-ba32-afd95b7b1328" path="/var/lib/kubelet/pods/fadb568f-9e01-4202-ba32-afd95b7b1328/volumes" Mar 17 00:36:07 crc kubenswrapper[4755]: I0317 00:36:07.864423 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gsw84"] Mar 17 00:36:07 crc kubenswrapper[4755]: I0317 00:36:07.866292 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsw84" Mar 17 00:36:07 crc kubenswrapper[4755]: I0317 00:36:07.900758 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsw84"] Mar 17 00:36:07 crc kubenswrapper[4755]: I0317 00:36:07.942658 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-utilities\") pod \"redhat-marketplace-gsw84\" (UID: \"8eb2c1ca-d726-45ce-bae8-97198fb3e43a\") " pod="openshift-marketplace/redhat-marketplace-gsw84" Mar 17 00:36:07 crc kubenswrapper[4755]: I0317 00:36:07.942737 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dnvb\" (UniqueName: \"kubernetes.io/projected/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-kube-api-access-2dnvb\") pod \"redhat-marketplace-gsw84\" (UID: \"8eb2c1ca-d726-45ce-bae8-97198fb3e43a\") " pod="openshift-marketplace/redhat-marketplace-gsw84" Mar 17 00:36:07 crc kubenswrapper[4755]: I0317 00:36:07.942818 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-catalog-content\") pod \"redhat-marketplace-gsw84\" (UID: \"8eb2c1ca-d726-45ce-bae8-97198fb3e43a\") " pod="openshift-marketplace/redhat-marketplace-gsw84" Mar 17 00:36:08 crc kubenswrapper[4755]: I0317 00:36:08.044099 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2dnvb\" (UniqueName: \"kubernetes.io/projected/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-kube-api-access-2dnvb\") pod \"redhat-marketplace-gsw84\" (UID: \"8eb2c1ca-d726-45ce-bae8-97198fb3e43a\") " pod="openshift-marketplace/redhat-marketplace-gsw84" Mar 17 00:36:08 crc kubenswrapper[4755]: I0317 00:36:08.044174 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-catalog-content\") pod \"redhat-marketplace-gsw84\" (UID: \"8eb2c1ca-d726-45ce-bae8-97198fb3e43a\") " pod="openshift-marketplace/redhat-marketplace-gsw84" Mar 17 00:36:08 crc kubenswrapper[4755]: I0317 00:36:08.044213 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-utilities\") pod \"redhat-marketplace-gsw84\" (UID: \"8eb2c1ca-d726-45ce-bae8-97198fb3e43a\") " pod="openshift-marketplace/redhat-marketplace-gsw84" Mar 17 00:36:08 crc kubenswrapper[4755]: I0317 00:36:08.044828 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-utilities\") pod \"redhat-marketplace-gsw84\" (UID: \"8eb2c1ca-d726-45ce-bae8-97198fb3e43a\") " pod="openshift-marketplace/redhat-marketplace-gsw84" Mar 17 00:36:08 crc kubenswrapper[4755]: I0317 00:36:08.044851 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-catalog-content\") pod \"redhat-marketplace-gsw84\" (UID: \"8eb2c1ca-d726-45ce-bae8-97198fb3e43a\") " pod="openshift-marketplace/redhat-marketplace-gsw84" Mar 17 00:36:08 crc kubenswrapper[4755]: I0317 00:36:08.070258 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2dnvb\" (UniqueName: \"kubernetes.io/projected/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-kube-api-access-2dnvb\") pod \"redhat-marketplace-gsw84\" (UID: \"8eb2c1ca-d726-45ce-bae8-97198fb3e43a\") " pod="openshift-marketplace/redhat-marketplace-gsw84" Mar 17 00:36:08 crc kubenswrapper[4755]: I0317 00:36:08.238164 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsw84" Mar 17 00:36:14 crc kubenswrapper[4755]: I0317 00:36:14.390892 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsw84"] Mar 17 00:36:15 crc kubenswrapper[4755]: I0317 00:36:15.153648 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-vb7pt" event={"ID":"c40a3b37-e723-4031-be06-728785655b37","Type":"ContainerStarted","Data":"44c8e1a9274e5af01ac61aeb4823b80909e05acb4d483051339d5598509dfaea"} Mar 17 00:36:15 crc kubenswrapper[4755]: I0317 00:36:15.155687 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" event={"ID":"9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a","Type":"ContainerStarted","Data":"5dc2c71b9c046f83448066d081aa63e5e9821b546931fa64e3594b534473e547"} Mar 17 00:36:15 crc kubenswrapper[4755]: I0317 00:36:15.155909 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:36:15 crc kubenswrapper[4755]: I0317 00:36:15.157805 4755 generic.go:334] "Generic (PLEG): container finished" podID="8eb2c1ca-d726-45ce-bae8-97198fb3e43a" containerID="09ebe436776009c68205d0a8835e463d3d16480f970a492e836fdac0dd8a7425" exitCode=0 Mar 17 00:36:15 crc kubenswrapper[4755]: I0317 00:36:15.157859 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsw84" 
event={"ID":"8eb2c1ca-d726-45ce-bae8-97198fb3e43a","Type":"ContainerDied","Data":"09ebe436776009c68205d0a8835e463d3d16480f970a492e836fdac0dd8a7425"} Mar 17 00:36:15 crc kubenswrapper[4755]: I0317 00:36:15.157890 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsw84" event={"ID":"8eb2c1ca-d726-45ce-bae8-97198fb3e43a","Type":"ContainerStarted","Data":"1c52355a1c71c8988d422c6d588390cd9a470a2945622e56524ef7eb9863b1b0"} Mar 17 00:36:15 crc kubenswrapper[4755]: I0317 00:36:15.158681 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" Mar 17 00:36:15 crc kubenswrapper[4755]: I0317 00:36:15.196014 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-66689c4bbf-vb7pt" podStartSLOduration=1.700087671 podStartE2EDuration="11.195994951s" podCreationTimestamp="2026-03-17 00:36:04 +0000 UTC" firstStartedPulling="2026-03-17 00:36:04.744914492 +0000 UTC m=+839.504366775" lastFinishedPulling="2026-03-17 00:36:14.240821772 +0000 UTC m=+849.000274055" observedRunningTime="2026-03-17 00:36:15.192071036 +0000 UTC m=+849.951523319" watchObservedRunningTime="2026-03-17 00:36:15.195994951 +0000 UTC m=+849.955447244" Mar 17 00:36:15 crc kubenswrapper[4755]: I0317 00:36:15.263657 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-7c87b9bff5-zjj4w" podStartSLOduration=1.688939652 podStartE2EDuration="16.263640598s" podCreationTimestamp="2026-03-17 00:35:59 +0000 UTC" firstStartedPulling="2026-03-17 00:35:59.698058215 +0000 UTC m=+834.457510498" lastFinishedPulling="2026-03-17 00:36:14.272759161 +0000 UTC m=+849.032211444" observedRunningTime="2026-03-17 00:36:15.260767841 +0000 UTC m=+850.020220124" watchObservedRunningTime="2026-03-17 00:36:15.263640598 +0000 UTC m=+850.023092881" 
Mar 17 00:36:16 crc kubenswrapper[4755]: I0317 00:36:16.168322 4755 generic.go:334] "Generic (PLEG): container finished" podID="8eb2c1ca-d726-45ce-bae8-97198fb3e43a" containerID="b1dc81105fd2dbf4c8aaf771a633bc78769793ad6c35f8b0d3aa8411b7d1bd24" exitCode=0 Mar 17 00:36:16 crc kubenswrapper[4755]: I0317 00:36:16.168542 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsw84" event={"ID":"8eb2c1ca-d726-45ce-bae8-97198fb3e43a","Type":"ContainerDied","Data":"b1dc81105fd2dbf4c8aaf771a633bc78769793ad6c35f8b0d3aa8411b7d1bd24"} Mar 17 00:36:17 crc kubenswrapper[4755]: I0317 00:36:17.175787 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsw84" event={"ID":"8eb2c1ca-d726-45ce-bae8-97198fb3e43a","Type":"ContainerStarted","Data":"9f8f30e636876c6f33f1c8f9ebaccbee890a1744ada39a69765371f2a965d45e"} Mar 17 00:36:17 crc kubenswrapper[4755]: I0317 00:36:17.192211 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gsw84" podStartSLOduration=8.777909311 podStartE2EDuration="10.192197416s" podCreationTimestamp="2026-03-17 00:36:07 +0000 UTC" firstStartedPulling="2026-03-17 00:36:15.159764208 +0000 UTC m=+849.919216481" lastFinishedPulling="2026-03-17 00:36:16.574052303 +0000 UTC m=+851.333504586" observedRunningTime="2026-03-17 00:36:17.190777348 +0000 UTC m=+851.950229641" watchObservedRunningTime="2026-03-17 00:36:17.192197416 +0000 UTC m=+851.951649699" Mar 17 00:36:18 crc kubenswrapper[4755]: I0317 00:36:18.239549 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gsw84" Mar 17 00:36:18 crc kubenswrapper[4755]: I0317 00:36:18.241572 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gsw84" Mar 17 00:36:19 crc kubenswrapper[4755]: I0317 00:36:19.101992 4755 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["minio-dev/minio"] Mar 17 00:36:19 crc kubenswrapper[4755]: I0317 00:36:19.102784 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Mar 17 00:36:19 crc kubenswrapper[4755]: I0317 00:36:19.105031 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Mar 17 00:36:19 crc kubenswrapper[4755]: I0317 00:36:19.105169 4755 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-6z7x8" Mar 17 00:36:19 crc kubenswrapper[4755]: I0317 00:36:19.105529 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Mar 17 00:36:19 crc kubenswrapper[4755]: I0317 00:36:19.117796 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 17 00:36:19 crc kubenswrapper[4755]: I0317 00:36:19.192587 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb6v5\" (UniqueName: \"kubernetes.io/projected/0e950ea2-cafa-4d54-a4b7-136fe7f137d0-kube-api-access-zb6v5\") pod \"minio\" (UID: \"0e950ea2-cafa-4d54-a4b7-136fe7f137d0\") " pod="minio-dev/minio" Mar 17 00:36:19 crc kubenswrapper[4755]: I0317 00:36:19.192672 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2eb60fb9-2b2a-4965-801c-b55e4ff0d886\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2eb60fb9-2b2a-4965-801c-b55e4ff0d886\") pod \"minio\" (UID: \"0e950ea2-cafa-4d54-a4b7-136fe7f137d0\") " pod="minio-dev/minio" Mar 17 00:36:19 crc kubenswrapper[4755]: I0317 00:36:19.293531 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2eb60fb9-2b2a-4965-801c-b55e4ff0d886\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2eb60fb9-2b2a-4965-801c-b55e4ff0d886\") pod \"minio\" (UID: \"0e950ea2-cafa-4d54-a4b7-136fe7f137d0\") 
" pod="minio-dev/minio" Mar 17 00:36:19 crc kubenswrapper[4755]: I0317 00:36:19.294364 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb6v5\" (UniqueName: \"kubernetes.io/projected/0e950ea2-cafa-4d54-a4b7-136fe7f137d0-kube-api-access-zb6v5\") pod \"minio\" (UID: \"0e950ea2-cafa-4d54-a4b7-136fe7f137d0\") " pod="minio-dev/minio" Mar 17 00:36:19 crc kubenswrapper[4755]: I0317 00:36:19.296790 4755 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 17 00:36:19 crc kubenswrapper[4755]: I0317 00:36:19.296835 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2eb60fb9-2b2a-4965-801c-b55e4ff0d886\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2eb60fb9-2b2a-4965-801c-b55e4ff0d886\") pod \"minio\" (UID: \"0e950ea2-cafa-4d54-a4b7-136fe7f137d0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/05fc5cc39507ab501e86da568e2e982e29cf48fa869e52611a794566d0b81923/globalmount\"" pod="minio-dev/minio" Mar 17 00:36:19 crc kubenswrapper[4755]: I0317 00:36:19.297549 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-gsw84" podUID="8eb2c1ca-d726-45ce-bae8-97198fb3e43a" containerName="registry-server" probeResult="failure" output=< Mar 17 00:36:19 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 00:36:19 crc kubenswrapper[4755]: > Mar 17 00:36:19 crc kubenswrapper[4755]: I0317 00:36:19.310630 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb6v5\" (UniqueName: \"kubernetes.io/projected/0e950ea2-cafa-4d54-a4b7-136fe7f137d0-kube-api-access-zb6v5\") pod \"minio\" (UID: \"0e950ea2-cafa-4d54-a4b7-136fe7f137d0\") " pod="minio-dev/minio" Mar 17 00:36:19 crc kubenswrapper[4755]: I0317 00:36:19.321997 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pvc-2eb60fb9-2b2a-4965-801c-b55e4ff0d886\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2eb60fb9-2b2a-4965-801c-b55e4ff0d886\") pod \"minio\" (UID: \"0e950ea2-cafa-4d54-a4b7-136fe7f137d0\") " pod="minio-dev/minio" Mar 17 00:36:19 crc kubenswrapper[4755]: I0317 00:36:19.481365 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Mar 17 00:36:19 crc kubenswrapper[4755]: I0317 00:36:19.737536 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 17 00:36:20 crc kubenswrapper[4755]: I0317 00:36:20.211230 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"0e950ea2-cafa-4d54-a4b7-136fe7f137d0","Type":"ContainerStarted","Data":"9c313fe62d8c665f8e846aa6be868e04fcec84e638f74fb7d4b6f3a44771690e"} Mar 17 00:36:22 crc kubenswrapper[4755]: I0317 00:36:22.910750 4755 scope.go:117] "RemoveContainer" containerID="138b1e1cf02623e0bee47e61a2c1af0eef5fe4c9f421b4a1256fb2c7d5e16b4c" Mar 17 00:36:23 crc kubenswrapper[4755]: I0317 00:36:23.232779 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"0e950ea2-cafa-4d54-a4b7-136fe7f137d0","Type":"ContainerStarted","Data":"416aab41477d333788c5ac55e2abfd53c5641a085730256d256545a4ac4fb0dc"} Mar 17 00:36:23 crc kubenswrapper[4755]: I0317 00:36:23.255508 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.023422688 podStartE2EDuration="7.25548476s" podCreationTimestamp="2026-03-17 00:36:16 +0000 UTC" firstStartedPulling="2026-03-17 00:36:19.74877888 +0000 UTC m=+854.508231183" lastFinishedPulling="2026-03-17 00:36:22.980840962 +0000 UTC m=+857.740293255" observedRunningTime="2026-03-17 00:36:23.25120252 +0000 UTC m=+858.010654813" watchObservedRunningTime="2026-03-17 00:36:23.25548476 +0000 UTC m=+858.014937053" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 
00:36:27.487908 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-crzs4"] Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.489684 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.494921 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-2vcvg" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.495160 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.495326 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.495455 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.496044 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.529479 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-crzs4"] Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.643185 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/c413d841-c2b9-4757-bbe4-ebd965553d29-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-crzs4\" (UID: \"c413d841-c2b9-4757-bbe4-ebd965553d29\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.643285 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c413d841-c2b9-4757-bbe4-ebd965553d29-config\") pod \"logging-loki-distributor-9c6b6d984-crzs4\" (UID: \"c413d841-c2b9-4757-bbe4-ebd965553d29\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.643326 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7ln2\" (UniqueName: \"kubernetes.io/projected/c413d841-c2b9-4757-bbe4-ebd965553d29-kube-api-access-k7ln2\") pod \"logging-loki-distributor-9c6b6d984-crzs4\" (UID: \"c413d841-c2b9-4757-bbe4-ebd965553d29\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.643348 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/c413d841-c2b9-4757-bbe4-ebd965553d29-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-crzs4\" (UID: \"c413d841-c2b9-4757-bbe4-ebd965553d29\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.643364 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c413d841-c2b9-4757-bbe4-ebd965553d29-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-crzs4\" (UID: \"c413d841-c2b9-4757-bbe4-ebd965553d29\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.644954 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s"] Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.646006 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.649342 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.650011 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.650181 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.661235 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s"] Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.720864 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86"] Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.721562 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.726201 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.726304 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.731796 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86"] Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.746259 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/85ac7711-fd0b-4598-93dd-6c591a532bac-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.746313 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c413d841-c2b9-4757-bbe4-ebd965553d29-config\") pod \"logging-loki-distributor-9c6b6d984-crzs4\" (UID: \"c413d841-c2b9-4757-bbe4-ebd965553d29\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.746336 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/85ac7711-fd0b-4598-93dd-6c591a532bac-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc 
kubenswrapper[4755]: I0317 00:36:27.746363 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ac7711-fd0b-4598-93dd-6c591a532bac-config\") pod \"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.746377 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nhfr\" (UniqueName: \"kubernetes.io/projected/85ac7711-fd0b-4598-93dd-6c591a532bac-kube-api-access-2nhfr\") pod \"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.746398 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7ln2\" (UniqueName: \"kubernetes.io/projected/c413d841-c2b9-4757-bbe4-ebd965553d29-kube-api-access-k7ln2\") pod \"logging-loki-distributor-9c6b6d984-crzs4\" (UID: \"c413d841-c2b9-4757-bbe4-ebd965553d29\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.746420 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/c413d841-c2b9-4757-bbe4-ebd965553d29-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-crzs4\" (UID: \"c413d841-c2b9-4757-bbe4-ebd965553d29\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.746447 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c413d841-c2b9-4757-bbe4-ebd965553d29-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-crzs4\" (UID: \"c413d841-c2b9-4757-bbe4-ebd965553d29\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.746462 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/85ac7711-fd0b-4598-93dd-6c591a532bac-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.746485 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85ac7711-fd0b-4598-93dd-6c591a532bac-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.746500 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/c413d841-c2b9-4757-bbe4-ebd965553d29-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-crzs4\" (UID: \"c413d841-c2b9-4757-bbe4-ebd965553d29\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.748199 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c413d841-c2b9-4757-bbe4-ebd965553d29-config\") pod \"logging-loki-distributor-9c6b6d984-crzs4\" (UID: \"c413d841-c2b9-4757-bbe4-ebd965553d29\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" Mar 17 00:36:27 
crc kubenswrapper[4755]: I0317 00:36:27.748210 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c413d841-c2b9-4757-bbe4-ebd965553d29-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-crzs4\" (UID: \"c413d841-c2b9-4757-bbe4-ebd965553d29\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.755679 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/c413d841-c2b9-4757-bbe4-ebd965553d29-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-crzs4\" (UID: \"c413d841-c2b9-4757-bbe4-ebd965553d29\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.758073 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/c413d841-c2b9-4757-bbe4-ebd965553d29-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-crzs4\" (UID: \"c413d841-c2b9-4757-bbe4-ebd965553d29\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.807657 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7ln2\" (UniqueName: \"kubernetes.io/projected/c413d841-c2b9-4757-bbe4-ebd965553d29-kube-api-access-k7ln2\") pod \"logging-loki-distributor-9c6b6d984-crzs4\" (UID: \"c413d841-c2b9-4757-bbe4-ebd965553d29\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.847137 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/85ac7711-fd0b-4598-93dd-6c591a532bac-logging-loki-querier-grpc\") pod 
\"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.847206 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/85ac7711-fd0b-4598-93dd-6c591a532bac-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.847237 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/296e81ca-7bf1-44f2-b1a8-bfb13a563134-config\") pod \"logging-loki-query-frontend-ff66c4dc9-95f86\" (UID: \"296e81ca-7bf1-44f2-b1a8-bfb13a563134\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.847261 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ac7711-fd0b-4598-93dd-6c591a532bac-config\") pod \"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.847278 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nhfr\" (UniqueName: \"kubernetes.io/projected/85ac7711-fd0b-4598-93dd-6c591a532bac-kube-api-access-2nhfr\") pod \"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.847305 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/85ac7711-fd0b-4598-93dd-6c591a532bac-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.847323 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/296e81ca-7bf1-44f2-b1a8-bfb13a563134-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-95f86\" (UID: \"296e81ca-7bf1-44f2-b1a8-bfb13a563134\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.847345 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/296e81ca-7bf1-44f2-b1a8-bfb13a563134-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-95f86\" (UID: \"296e81ca-7bf1-44f2-b1a8-bfb13a563134\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.847362 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/296e81ca-7bf1-44f2-b1a8-bfb13a563134-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-95f86\" (UID: \"296e81ca-7bf1-44f2-b1a8-bfb13a563134\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.847384 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85ac7711-fd0b-4598-93dd-6c591a532bac-logging-loki-ca-bundle\") pod 
\"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.847421 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpgv5\" (UniqueName: \"kubernetes.io/projected/296e81ca-7bf1-44f2-b1a8-bfb13a563134-kube-api-access-jpgv5\") pod \"logging-loki-query-frontend-ff66c4dc9-95f86\" (UID: \"296e81ca-7bf1-44f2-b1a8-bfb13a563134\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.850867 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/85ac7711-fd0b-4598-93dd-6c591a532bac-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.854689 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85ac7711-fd0b-4598-93dd-6c591a532bac-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.857346 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ac7711-fd0b-4598-93dd-6c591a532bac-config\") pod \"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.867087 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/85ac7711-fd0b-4598-93dd-6c591a532bac-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.870072 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/85ac7711-fd0b-4598-93dd-6c591a532bac-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.913177 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b"] Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.933906 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b"] Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.934016 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.939880 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.940111 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.940344 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.940481 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.940679 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-v5jb2" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.941303 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.949523 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpgv5\" (UniqueName: \"kubernetes.io/projected/296e81ca-7bf1-44f2-b1a8-bfb13a563134-kube-api-access-jpgv5\") pod \"logging-loki-query-frontend-ff66c4dc9-95f86\" (UID: \"296e81ca-7bf1-44f2-b1a8-bfb13a563134\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.949593 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/296e81ca-7bf1-44f2-b1a8-bfb13a563134-config\") pod \"logging-loki-query-frontend-ff66c4dc9-95f86\" (UID: \"296e81ca-7bf1-44f2-b1a8-bfb13a563134\") " 
pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.949646 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/296e81ca-7bf1-44f2-b1a8-bfb13a563134-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-95f86\" (UID: \"296e81ca-7bf1-44f2-b1a8-bfb13a563134\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.949670 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/296e81ca-7bf1-44f2-b1a8-bfb13a563134-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-95f86\" (UID: \"296e81ca-7bf1-44f2-b1a8-bfb13a563134\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.949691 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/296e81ca-7bf1-44f2-b1a8-bfb13a563134-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-95f86\" (UID: \"296e81ca-7bf1-44f2-b1a8-bfb13a563134\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.976413 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/296e81ca-7bf1-44f2-b1a8-bfb13a563134-config\") pod \"logging-loki-query-frontend-ff66c4dc9-95f86\" (UID: \"296e81ca-7bf1-44f2-b1a8-bfb13a563134\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.976685 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nhfr\" 
(UniqueName: \"kubernetes.io/projected/85ac7711-fd0b-4598-93dd-6c591a532bac-kube-api-access-2nhfr\") pod \"logging-loki-querier-6dcbdf8bb8-bh66s\" (UID: \"85ac7711-fd0b-4598-93dd-6c591a532bac\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.976800 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48"] Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.977912 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.981586 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/296e81ca-7bf1-44f2-b1a8-bfb13a563134-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-95f86\" (UID: \"296e81ca-7bf1-44f2-b1a8-bfb13a563134\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.989707 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48"] Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.991920 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/296e81ca-7bf1-44f2-b1a8-bfb13a563134-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-95f86\" (UID: \"296e81ca-7bf1-44f2-b1a8-bfb13a563134\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" Mar 17 00:36:27 crc kubenswrapper[4755]: I0317 00:36:27.994009 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/296e81ca-7bf1-44f2-b1a8-bfb13a563134-logging-loki-query-frontend-grpc\") pod 
\"logging-loki-query-frontend-ff66c4dc9-95f86\" (UID: \"296e81ca-7bf1-44f2-b1a8-bfb13a563134\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.003093 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpgv5\" (UniqueName: \"kubernetes.io/projected/296e81ca-7bf1-44f2-b1a8-bfb13a563134-kube-api-access-jpgv5\") pod \"logging-loki-query-frontend-ff66c4dc9-95f86\" (UID: \"296e81ca-7bf1-44f2-b1a8-bfb13a563134\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.040721 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.076296 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/c1fe9206-5f28-4707-b175-12ba0fadb400-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.076571 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk7v7\" (UniqueName: \"kubernetes.io/projected/ad7de1cc-717a-4e3e-81f7-43c677c2db13-kube-api-access-pk7v7\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.076723 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/c1fe9206-5f28-4707-b175-12ba0fadb400-tenants\") pod 
\"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.076821 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/ad7de1cc-717a-4e3e-81f7-43c677c2db13-tls-secret\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.076920 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/c1fe9206-5f28-4707-b175-12ba0fadb400-tls-secret\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.077061 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/ad7de1cc-717a-4e3e-81f7-43c677c2db13-rbac\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.077115 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1fe9206-5f28-4707-b175-12ba0fadb400-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.077154 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/ad7de1cc-717a-4e3e-81f7-43c677c2db13-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.077195 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad7de1cc-717a-4e3e-81f7-43c677c2db13-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.077218 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1fe9206-5f28-4707-b175-12ba0fadb400-logging-loki-ca-bundle\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.077251 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2xcp\" (UniqueName: \"kubernetes.io/projected/c1fe9206-5f28-4707-b175-12ba0fadb400-kube-api-access-m2xcp\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.077287 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: 
\"kubernetes.io/configmap/ad7de1cc-717a-4e3e-81f7-43c677c2db13-lokistack-gateway\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.077332 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/c1fe9206-5f28-4707-b175-12ba0fadb400-lokistack-gateway\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.077348 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/ad7de1cc-717a-4e3e-81f7-43c677c2db13-tenants\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.077400 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/c1fe9206-5f28-4707-b175-12ba0fadb400-rbac\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.077415 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad7de1cc-717a-4e3e-81f7-43c677c2db13-logging-loki-ca-bundle\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 
00:36:28.107408 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.178507 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2xcp\" (UniqueName: \"kubernetes.io/projected/c1fe9206-5f28-4707-b175-12ba0fadb400-kube-api-access-m2xcp\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.178545 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/ad7de1cc-717a-4e3e-81f7-43c677c2db13-lokistack-gateway\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.178575 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/c1fe9206-5f28-4707-b175-12ba0fadb400-lokistack-gateway\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.179448 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/ad7de1cc-717a-4e3e-81f7-43c677c2db13-lokistack-gateway\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.179505 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" 
(UniqueName: \"kubernetes.io/secret/ad7de1cc-717a-4e3e-81f7-43c677c2db13-tenants\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.179534 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/c1fe9206-5f28-4707-b175-12ba0fadb400-rbac\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.179550 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad7de1cc-717a-4e3e-81f7-43c677c2db13-logging-loki-ca-bundle\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.179579 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/c1fe9206-5f28-4707-b175-12ba0fadb400-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.179596 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk7v7\" (UniqueName: \"kubernetes.io/projected/ad7de1cc-717a-4e3e-81f7-43c677c2db13-kube-api-access-pk7v7\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.179618 
4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/c1fe9206-5f28-4707-b175-12ba0fadb400-tenants\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.179634 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/ad7de1cc-717a-4e3e-81f7-43c677c2db13-tls-secret\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.179655 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/c1fe9206-5f28-4707-b175-12ba0fadb400-tls-secret\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.179674 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/ad7de1cc-717a-4e3e-81f7-43c677c2db13-rbac\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.179691 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1fe9206-5f28-4707-b175-12ba0fadb400-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc 
kubenswrapper[4755]: I0317 00:36:28.179713 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/ad7de1cc-717a-4e3e-81f7-43c677c2db13-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.179736 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad7de1cc-717a-4e3e-81f7-43c677c2db13-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.179753 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1fe9206-5f28-4707-b175-12ba0fadb400-logging-loki-ca-bundle\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.179959 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/c1fe9206-5f28-4707-b175-12ba0fadb400-lokistack-gateway\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: E0317 00:36:28.180055 4755 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Mar 17 00:36:28 crc kubenswrapper[4755]: E0317 00:36:28.180101 4755 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad7de1cc-717a-4e3e-81f7-43c677c2db13-tls-secret podName:ad7de1cc-717a-4e3e-81f7-43c677c2db13 nodeName:}" failed. No retries permitted until 2026-03-17 00:36:28.680083414 +0000 UTC m=+863.439535697 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/ad7de1cc-717a-4e3e-81f7-43c677c2db13-tls-secret") pod "logging-loki-gateway-cdf4b6b4d-f2g7b" (UID: "ad7de1cc-717a-4e3e-81f7-43c677c2db13") : secret "logging-loki-gateway-http" not found Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.180325 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1fe9206-5f28-4707-b175-12ba0fadb400-logging-loki-ca-bundle\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.180801 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1fe9206-5f28-4707-b175-12ba0fadb400-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: E0317 00:36:28.180890 4755 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Mar 17 00:36:28 crc kubenswrapper[4755]: E0317 00:36:28.180940 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1fe9206-5f28-4707-b175-12ba0fadb400-tls-secret podName:c1fe9206-5f28-4707-b175-12ba0fadb400 nodeName:}" failed. 
No retries permitted until 2026-03-17 00:36:28.680922378 +0000 UTC m=+863.440374721 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/c1fe9206-5f28-4707-b175-12ba0fadb400-tls-secret") pod "logging-loki-gateway-cdf4b6b4d-jfm48" (UID: "c1fe9206-5f28-4707-b175-12ba0fadb400") : secret "logging-loki-gateway-http" not found Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.181401 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad7de1cc-717a-4e3e-81f7-43c677c2db13-logging-loki-ca-bundle\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.181736 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/ad7de1cc-717a-4e3e-81f7-43c677c2db13-rbac\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.182880 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/c1fe9206-5f28-4707-b175-12ba0fadb400-rbac\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.184749 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/ad7de1cc-717a-4e3e-81f7-43c677c2db13-tenants\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc 
kubenswrapper[4755]: I0317 00:36:28.185179 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/c1fe9206-5f28-4707-b175-12ba0fadb400-tenants\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.186144 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/c1fe9206-5f28-4707-b175-12ba0fadb400-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.187370 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/ad7de1cc-717a-4e3e-81f7-43c677c2db13-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.189711 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad7de1cc-717a-4e3e-81f7-43c677c2db13-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.200013 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2xcp\" (UniqueName: \"kubernetes.io/projected/c1fe9206-5f28-4707-b175-12ba0fadb400-kube-api-access-m2xcp\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: 
\"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.213126 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk7v7\" (UniqueName: \"kubernetes.io/projected/ad7de1cc-717a-4e3e-81f7-43c677c2db13-kube-api-access-pk7v7\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.267928 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.284799 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gsw84" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.349019 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gsw84" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.390878 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-crzs4"] Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.499788 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86"] Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.665654 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.665737 4755 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.692301 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/ad7de1cc-717a-4e3e-81f7-43c677c2db13-tls-secret\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.692342 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/c1fe9206-5f28-4707-b175-12ba0fadb400-tls-secret\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.696578 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/c1fe9206-5f28-4707-b175-12ba0fadb400-tls-secret\") pod \"logging-loki-gateway-cdf4b6b4d-jfm48\" (UID: \"c1fe9206-5f28-4707-b175-12ba0fadb400\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.698382 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/ad7de1cc-717a-4e3e-81f7-43c677c2db13-tls-secret\") pod \"logging-loki-gateway-cdf4b6b4d-f2g7b\" (UID: \"ad7de1cc-717a-4e3e-81f7-43c677c2db13\") " pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.701832 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-logging/logging-loki-compactor-0"] Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.702751 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.705527 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.705690 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.711405 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.718940 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.720121 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.722829 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.723056 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.746334 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.757263 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s"] Mar 17 00:36:28 crc kubenswrapper[4755]: W0317 00:36:28.771230 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85ac7711_fd0b_4598_93dd_6c591a532bac.slice/crio-75d5ef8822f6ceb07e6eb5065fc0d24bbf73ec7cb3789452c1b7361dde9be297 WatchSource:0}: Error finding container 75d5ef8822f6ceb07e6eb5065fc0d24bbf73ec7cb3789452c1b7361dde9be297: Status 404 returned error can't find the container with id 75d5ef8822f6ceb07e6eb5065fc0d24bbf73ec7cb3789452c1b7361dde9be297 Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.795516 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.796582 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.799200 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.799639 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.812898 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.894301 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.894368 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-788b6146-999e-4274-9813-aca621fda47b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-788b6146-999e-4274-9813-aca621fda47b\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.894414 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66514db5-2205-445e-b424-b55fb9910be3-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 
00:36:28.894620 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/c516523b-4c3b-4083-a8f5-18c9061c7032-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.894701 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.894848 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-config\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.894891 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-32ca32d2-2771-485d-84c4-773e6fccb533\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32ca32d2-2771-485d-84c4-773e6fccb533\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.894982 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvtx9\" (UniqueName: \"kubernetes.io/projected/c516523b-4c3b-4083-a8f5-18c9061c7032-kube-api-access-bvtx9\") pod \"logging-loki-ingester-0\" (UID: 
\"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.895070 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7abb538b-c7f4-4c89-a944-727ca2f3d599\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7abb538b-c7f4-4c89-a944-727ca2f3d599\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.895156 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/66514db5-2205-445e-b424-b55fb9910be3-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.895203 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/66514db5-2205-445e-b424-b55fb9910be3-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.895337 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/c516523b-4c3b-4083-a8f5-18c9061c7032-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.895388 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/66514db5-2205-445e-b424-b55fb9910be3-config\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.895466 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/66514db5-2205-445e-b424-b55fb9910be3-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.895502 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctkfg\" (UniqueName: \"kubernetes.io/projected/66514db5-2205-445e-b424-b55fb9910be3-kube-api-access-ctkfg\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.895534 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c516523b-4c3b-4083-a8f5-18c9061c7032-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.895634 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.895686 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qdhd\" (UniqueName: \"kubernetes.io/projected/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-kube-api-access-6qdhd\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.895720 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c516523b-4c3b-4083-a8f5-18c9061c7032-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.895757 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.895785 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c516523b-4c3b-4083-a8f5-18c9061c7032-config\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.895830 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-391a8706-cdcd-4180-b8fa-cda542784ce6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-391a8706-cdcd-4180-b8fa-cda542784ce6\") pod \"logging-loki-compactor-0\" (UID: 
\"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.897937 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.913856 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.996819 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7abb538b-c7f4-4c89-a944-727ca2f3d599\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7abb538b-c7f4-4c89-a944-727ca2f3d599\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.997139 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/66514db5-2205-445e-b424-b55fb9910be3-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.997160 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/66514db5-2205-445e-b424-b55fb9910be3-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.997784 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: 
\"kubernetes.io/secret/c516523b-4c3b-4083-a8f5-18c9061c7032-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.997828 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66514db5-2205-445e-b424-b55fb9910be3-config\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.997853 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/66514db5-2205-445e-b424-b55fb9910be3-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.998716 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66514db5-2205-445e-b424-b55fb9910be3-config\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.998794 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctkfg\" (UniqueName: \"kubernetes.io/projected/66514db5-2205-445e-b424-b55fb9910be3-kube-api-access-ctkfg\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.998813 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: 
\"kubernetes.io/secret/c516523b-4c3b-4083-a8f5-18c9061c7032-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.999185 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.999209 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qdhd\" (UniqueName: \"kubernetes.io/projected/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-kube-api-access-6qdhd\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:28 crc kubenswrapper[4755]: I0317 00:36:28.999249 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c516523b-4c3b-4083-a8f5-18c9061c7032-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.000410 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c516523b-4c3b-4083-a8f5-18c9061c7032-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:28.999273 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.000489 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c516523b-4c3b-4083-a8f5-18c9061c7032-config\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.000551 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-391a8706-cdcd-4180-b8fa-cda542784ce6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-391a8706-cdcd-4180-b8fa-cda542784ce6\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.000571 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.000596 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-788b6146-999e-4274-9813-aca621fda47b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-788b6146-999e-4274-9813-aca621fda47b\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 
00:36:29.000675 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66514db5-2205-445e-b424-b55fb9910be3-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.000697 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/c516523b-4c3b-4083-a8f5-18c9061c7032-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.000775 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.000856 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-config\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.000915 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-32ca32d2-2771-485d-84c4-773e6fccb533\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32ca32d2-2771-485d-84c4-773e6fccb533\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:29 crc 
kubenswrapper[4755]: I0317 00:36:29.000932 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvtx9\" (UniqueName: \"kubernetes.io/projected/c516523b-4c3b-4083-a8f5-18c9061c7032-kube-api-access-bvtx9\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.002275 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.003043 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66514db5-2205-445e-b424-b55fb9910be3-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.003179 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c516523b-4c3b-4083-a8f5-18c9061c7032-config\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.004008 4755 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.004027 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7abb538b-c7f4-4c89-a944-727ca2f3d599\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7abb538b-c7f4-4c89-a944-727ca2f3d599\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/84c269fb6e5e74e8d53ffa64a6e263a25d1312f2bb74b2d14e8f86e686ceb856/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.004832 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-config\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.004979 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/c516523b-4c3b-4083-a8f5-18c9061c7032-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.005062 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.006057 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: 
\"kubernetes.io/secret/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.010375 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/66514db5-2205-445e-b424-b55fb9910be3-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.010838 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/66514db5-2205-445e-b424-b55fb9910be3-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.011037 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/66514db5-2205-445e-b424-b55fb9910be3-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.011325 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.011663 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: 
\"kubernetes.io/secret/c516523b-4c3b-4083-a8f5-18c9061c7032-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.011869 4755 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.011888 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-391a8706-cdcd-4180-b8fa-cda542784ce6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-391a8706-cdcd-4180-b8fa-cda542784ce6\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4fd592df2d7ea65d9bc01c3c3c221bd4cfeceea4e776d73388efcddc387ce1cf/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.012290 4755 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.012314 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-32ca32d2-2771-485d-84c4-773e6fccb533\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32ca32d2-2771-485d-84c4-773e6fccb533\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d7a02b383dcf924c38c10f8b5ef4b97f00bc5ad172f27a54ba33d3f27739bf88/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.012721 4755 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.012744 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-788b6146-999e-4274-9813-aca621fda47b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-788b6146-999e-4274-9813-aca621fda47b\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a0e53d1f79723d8ec6bb9c7539f296db86a3b411e323c4210c8ddc613671e00b/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.016729 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c516523b-4c3b-4083-a8f5-18c9061c7032-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.017631 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvtx9\" (UniqueName: \"kubernetes.io/projected/c516523b-4c3b-4083-a8f5-18c9061c7032-kube-api-access-bvtx9\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.019946 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctkfg\" (UniqueName: \"kubernetes.io/projected/66514db5-2205-445e-b424-b55fb9910be3-kube-api-access-ctkfg\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.042534 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qdhd\" (UniqueName: 
\"kubernetes.io/projected/93847e12-81c9-4ae4-8090-e7df4bd5f9a7-kube-api-access-6qdhd\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.051126 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-788b6146-999e-4274-9813-aca621fda47b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-788b6146-999e-4274-9813-aca621fda47b\") pod \"logging-loki-index-gateway-0\" (UID: \"93847e12-81c9-4ae4-8090-e7df4bd5f9a7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.055932 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7abb538b-c7f4-4c89-a944-727ca2f3d599\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7abb538b-c7f4-4c89-a944-727ca2f3d599\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.065641 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-391a8706-cdcd-4180-b8fa-cda542784ce6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-391a8706-cdcd-4180-b8fa-cda542784ce6\") pod \"logging-loki-compactor-0\" (UID: \"66514db5-2205-445e-b424-b55fb9910be3\") " pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.066201 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-32ca32d2-2771-485d-84c4-773e6fccb533\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32ca32d2-2771-485d-84c4-773e6fccb533\") pod \"logging-loki-ingester-0\" (UID: \"c516523b-4c3b-4083-a8f5-18c9061c7032\") " pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.079193 4755 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0"
Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.132213 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0"
Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.220737 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b"]
Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.282699 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48"]
Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.319854 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" event={"ID":"ad7de1cc-717a-4e3e-81f7-43c677c2db13","Type":"ContainerStarted","Data":"62d12a89752c7ec08b6b58277fc55a7611374d2e326dfcc49110ca8b611d6e61"}
Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.321107 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" event={"ID":"85ac7711-fd0b-4598-93dd-6c591a532bac","Type":"ContainerStarted","Data":"75d5ef8822f6ceb07e6eb5065fc0d24bbf73ec7cb3789452c1b7361dde9be297"}
Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.322270 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" event={"ID":"296e81ca-7bf1-44f2-b1a8-bfb13a563134","Type":"ContainerStarted","Data":"b473aed20a8d80af8037b1dfa0962adcb36981d29ab8e9f0e356d1c988615df4"}
Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.323415 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" event={"ID":"c413d841-c2b9-4757-bbe4-ebd965553d29","Type":"ContainerStarted","Data":"aa2ffc1394c8343a552fdfcb064a4f701d216288d09cd79a497c7d98ebdb5ccb"}
Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.325097 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0"
Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.325464 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" event={"ID":"c1fe9206-5f28-4707-b175-12ba0fadb400","Type":"ContainerStarted","Data":"1666aad1e20c13e24b7d4d41f79a28fbb1780a4da3d06b6524d7909e57c8e3fc"}
Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.409178 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"]
Mar 17 00:36:29 crc kubenswrapper[4755]: W0317 00:36:29.421808 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93847e12_81c9_4ae4_8090_e7df4bd5f9a7.slice/crio-23f4160a04e57688552bcb988d4f484e9d9f6357da3775e2e2f11b1053d2aa2c WatchSource:0}: Error finding container 23f4160a04e57688552bcb988d4f484e9d9f6357da3775e2e2f11b1053d2aa2c: Status 404 returned error can't find the container with id 23f4160a04e57688552bcb988d4f484e9d9f6357da3775e2e2f11b1053d2aa2c
Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.540591 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"]
Mar 17 00:36:29 crc kubenswrapper[4755]: W0317 00:36:29.541604 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66514db5_2205_445e_b424_b55fb9910be3.slice/crio-f345af6f855f852394582ac5d014eef822d06a46d6266824f8640b7e30983832 WatchSource:0}: Error finding container f345af6f855f852394582ac5d014eef822d06a46d6266824f8640b7e30983832: Status 404 returned error can't find the container with id f345af6f855f852394582ac5d014eef822d06a46d6266824f8640b7e30983832
Mar 17 00:36:29 crc kubenswrapper[4755]: I0317 00:36:29.544567 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"]
Mar 17 00:36:30 crc kubenswrapper[4755]: I0317 00:36:30.333973 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"66514db5-2205-445e-b424-b55fb9910be3","Type":"ContainerStarted","Data":"f345af6f855f852394582ac5d014eef822d06a46d6266824f8640b7e30983832"}
Mar 17 00:36:30 crc kubenswrapper[4755]: I0317 00:36:30.335689 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"93847e12-81c9-4ae4-8090-e7df4bd5f9a7","Type":"ContainerStarted","Data":"23f4160a04e57688552bcb988d4f484e9d9f6357da3775e2e2f11b1053d2aa2c"}
Mar 17 00:36:30 crc kubenswrapper[4755]: I0317 00:36:30.337178 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"c516523b-4c3b-4083-a8f5-18c9061c7032","Type":"ContainerStarted","Data":"814a20f907f9c4c9c8eebcb62527918b7f9a9881883cad3e5b721b5f9b651215"}
Mar 17 00:36:30 crc kubenswrapper[4755]: I0317 00:36:30.657931 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsw84"]
Mar 17 00:36:30 crc kubenswrapper[4755]: I0317 00:36:30.658233 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gsw84" podUID="8eb2c1ca-d726-45ce-bae8-97198fb3e43a" containerName="registry-server" containerID="cri-o://9f8f30e636876c6f33f1c8f9ebaccbee890a1744ada39a69765371f2a965d45e" gracePeriod=2
Mar 17 00:36:31 crc kubenswrapper[4755]: I0317 00:36:31.344363 4755 generic.go:334] "Generic (PLEG): container finished" podID="8eb2c1ca-d726-45ce-bae8-97198fb3e43a" containerID="9f8f30e636876c6f33f1c8f9ebaccbee890a1744ada39a69765371f2a965d45e" exitCode=0
Mar 17 00:36:31 crc kubenswrapper[4755]: I0317 00:36:31.344403 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsw84" event={"ID":"8eb2c1ca-d726-45ce-bae8-97198fb3e43a","Type":"ContainerDied","Data":"9f8f30e636876c6f33f1c8f9ebaccbee890a1744ada39a69765371f2a965d45e"}
Mar 17 00:36:32 crc kubenswrapper[4755]: I0317 00:36:32.587513 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsw84"
Mar 17 00:36:32 crc kubenswrapper[4755]: I0317 00:36:32.653663 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dnvb\" (UniqueName: \"kubernetes.io/projected/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-kube-api-access-2dnvb\") pod \"8eb2c1ca-d726-45ce-bae8-97198fb3e43a\" (UID: \"8eb2c1ca-d726-45ce-bae8-97198fb3e43a\") "
Mar 17 00:36:32 crc kubenswrapper[4755]: I0317 00:36:32.653835 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-utilities\") pod \"8eb2c1ca-d726-45ce-bae8-97198fb3e43a\" (UID: \"8eb2c1ca-d726-45ce-bae8-97198fb3e43a\") "
Mar 17 00:36:32 crc kubenswrapper[4755]: I0317 00:36:32.653855 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-catalog-content\") pod \"8eb2c1ca-d726-45ce-bae8-97198fb3e43a\" (UID: \"8eb2c1ca-d726-45ce-bae8-97198fb3e43a\") "
Mar 17 00:36:32 crc kubenswrapper[4755]: I0317 00:36:32.655070 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-utilities" (OuterVolumeSpecName: "utilities") pod "8eb2c1ca-d726-45ce-bae8-97198fb3e43a" (UID: "8eb2c1ca-d726-45ce-bae8-97198fb3e43a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 00:36:32 crc kubenswrapper[4755]: I0317 00:36:32.659305 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-kube-api-access-2dnvb" (OuterVolumeSpecName: "kube-api-access-2dnvb") pod "8eb2c1ca-d726-45ce-bae8-97198fb3e43a" (UID: "8eb2c1ca-d726-45ce-bae8-97198fb3e43a"). InnerVolumeSpecName "kube-api-access-2dnvb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:36:32 crc kubenswrapper[4755]: I0317 00:36:32.677727 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8eb2c1ca-d726-45ce-bae8-97198fb3e43a" (UID: "8eb2c1ca-d726-45ce-bae8-97198fb3e43a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 00:36:32 crc kubenswrapper[4755]: I0317 00:36:32.755361 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dnvb\" (UniqueName: \"kubernetes.io/projected/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-kube-api-access-2dnvb\") on node \"crc\" DevicePath \"\""
Mar 17 00:36:32 crc kubenswrapper[4755]: I0317 00:36:32.755399 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-utilities\") on node \"crc\" DevicePath \"\""
Mar 17 00:36:32 crc kubenswrapper[4755]: I0317 00:36:32.755414 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c1ca-d726-45ce-bae8-97198fb3e43a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 17 00:36:33 crc kubenswrapper[4755]: I0317 00:36:33.363712 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsw84" event={"ID":"8eb2c1ca-d726-45ce-bae8-97198fb3e43a","Type":"ContainerDied","Data":"1c52355a1c71c8988d422c6d588390cd9a470a2945622e56524ef7eb9863b1b0"}
Mar 17 00:36:33 crc kubenswrapper[4755]: I0317 00:36:33.363834 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsw84"
Mar 17 00:36:33 crc kubenswrapper[4755]: I0317 00:36:33.363995 4755 scope.go:117] "RemoveContainer" containerID="9f8f30e636876c6f33f1c8f9ebaccbee890a1744ada39a69765371f2a965d45e"
Mar 17 00:36:33 crc kubenswrapper[4755]: I0317 00:36:33.399306 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsw84"]
Mar 17 00:36:33 crc kubenswrapper[4755]: I0317 00:36:33.406334 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsw84"]
Mar 17 00:36:33 crc kubenswrapper[4755]: I0317 00:36:33.420567 4755 scope.go:117] "RemoveContainer" containerID="b1dc81105fd2dbf4c8aaf771a633bc78769793ad6c35f8b0d3aa8411b7d1bd24"
Mar 17 00:36:33 crc kubenswrapper[4755]: I0317 00:36:33.445524 4755 scope.go:117] "RemoveContainer" containerID="09ebe436776009c68205d0a8835e463d3d16480f970a492e836fdac0dd8a7425"
Mar 17 00:36:34 crc kubenswrapper[4755]: I0317 00:36:34.254354 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eb2c1ca-d726-45ce-bae8-97198fb3e43a" path="/var/lib/kubelet/pods/8eb2c1ca-d726-45ce-bae8-97198fb3e43a/volumes"
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.388360 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" event={"ID":"c413d841-c2b9-4757-bbe4-ebd965553d29","Type":"ContainerStarted","Data":"522bf0aea5f2ab6a2ed561c4d5e310f15a397f4de91ae5531b898febaf5aa20b"}
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.390781 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4"
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.391842 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" event={"ID":"296e81ca-7bf1-44f2-b1a8-bfb13a563134","Type":"ContainerStarted","Data":"aa967c703ca2cd443d1d91629091eff38001efe9c2af9e5cb92b70e2ede0e2c3"}
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.392501 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86"
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.397070 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" event={"ID":"c1fe9206-5f28-4707-b175-12ba0fadb400","Type":"ContainerStarted","Data":"2e2bd5aa5765ccb9264b57a8de08beb87a69cc1c682e81d52a79c28e3453e8d0"}
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.401777 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" event={"ID":"ad7de1cc-717a-4e3e-81f7-43c677c2db13","Type":"ContainerStarted","Data":"48fe7cb9631cf848dfc8645457cad874dbd6b651b91dfdcae6f1647ed5807f7d"}
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.403112 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"66514db5-2205-445e-b424-b55fb9910be3","Type":"ContainerStarted","Data":"cad1f684a6a67251d2c7a9ade0af0f6dc6c4f77d2f4eb6fb0cd65e819286a705"}
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.403365 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0"
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.405054 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" event={"ID":"85ac7711-fd0b-4598-93dd-6c591a532bac","Type":"ContainerStarted","Data":"6547b38e4bee62a9cb8438046dfb40bba9d90737c757cc38e9702e5cde26ae33"}
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.405170 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s"
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.406645 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"93847e12-81c9-4ae4-8090-e7df4bd5f9a7","Type":"ContainerStarted","Data":"61f3100cee4bc9df8f206f95d930a8c476b658bc273b94939eaa9e45e5862aa6"}
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.406781 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0"
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.408360 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"c516523b-4c3b-4083-a8f5-18c9061c7032","Type":"ContainerStarted","Data":"6667549121d9c26adbd95d51b3a4f05a1a7960e44fe309f7ed50d2e5ab43d731"}
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.408488 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0"
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.439583 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.429925046 podStartE2EDuration="9.439565884s" podCreationTimestamp="2026-03-17 00:36:27 +0000 UTC" firstStartedPulling="2026-03-17 00:36:29.543543843 +0000 UTC m=+864.302996116" lastFinishedPulling="2026-03-17 00:36:35.553184671 +0000 UTC m=+870.312636954" observedRunningTime="2026-03-17 00:36:36.431938362 +0000 UTC m=+871.191390675" watchObservedRunningTime="2026-03-17 00:36:36.439565884 +0000 UTC m=+871.199018167"
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.447532 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s" podStartSLOduration=2.594253137 podStartE2EDuration="9.447515766s" podCreationTimestamp="2026-03-17 00:36:27 +0000 UTC" firstStartedPulling="2026-03-17 00:36:28.773894256 +0000 UTC m=+863.533346559" lastFinishedPulling="2026-03-17 00:36:35.627156905 +0000 UTC m=+870.386609188" observedRunningTime="2026-03-17 00:36:36.445988193 +0000 UTC m=+871.205440476" watchObservedRunningTime="2026-03-17 00:36:36.447515766 +0000 UTC m=+871.206968059"
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.469742 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.284439236 podStartE2EDuration="9.469720994s" podCreationTimestamp="2026-03-17 00:36:27 +0000 UTC" firstStartedPulling="2026-03-17 00:36:29.423921166 +0000 UTC m=+864.183373449" lastFinishedPulling="2026-03-17 00:36:35.609202924 +0000 UTC m=+870.368655207" observedRunningTime="2026-03-17 00:36:36.464317965 +0000 UTC m=+871.223770388" watchObservedRunningTime="2026-03-17 00:36:36.469720994 +0000 UTC m=+871.229173277"
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.487029 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4" podStartSLOduration=2.297335166 podStartE2EDuration="9.487005547s" podCreationTimestamp="2026-03-17 00:36:27 +0000 UTC" firstStartedPulling="2026-03-17 00:36:28.422867385 +0000 UTC m=+863.182319668" lastFinishedPulling="2026-03-17 00:36:35.612537766 +0000 UTC m=+870.371990049" observedRunningTime="2026-03-17 00:36:36.478622173 +0000 UTC m=+871.238074476" watchObservedRunningTime="2026-03-17 00:36:36.487005547 +0000 UTC m=+871.246457850"
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.499928 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86" podStartSLOduration=2.406608363 podStartE2EDuration="9.499905047s" podCreationTimestamp="2026-03-17 00:36:27 +0000 UTC" firstStartedPulling="2026-03-17 00:36:28.512592798 +0000 UTC m=+863.272045081" lastFinishedPulling="2026-03-17 00:36:35.605889472 +0000 UTC m=+870.365341765" observedRunningTime="2026-03-17 00:36:36.496230644 +0000 UTC m=+871.255682957" watchObservedRunningTime="2026-03-17 00:36:36.499905047 +0000 UTC m=+871.259357330"
Mar 17 00:36:36 crc kubenswrapper[4755]: I0317 00:36:36.526716 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.464702766 podStartE2EDuration="9.526695624s" podCreationTimestamp="2026-03-17 00:36:27 +0000 UTC" firstStartedPulling="2026-03-17 00:36:29.547825363 +0000 UTC m=+864.307277646" lastFinishedPulling="2026-03-17 00:36:35.609818221 +0000 UTC m=+870.369270504" observedRunningTime="2026-03-17 00:36:36.522097166 +0000 UTC m=+871.281549479" watchObservedRunningTime="2026-03-17 00:36:36.526695624 +0000 UTC m=+871.286147907"
Mar 17 00:36:38 crc kubenswrapper[4755]: I0317 00:36:38.421326 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" event={"ID":"ad7de1cc-717a-4e3e-81f7-43c677c2db13","Type":"ContainerStarted","Data":"7bfe740aebf616d366da3448db698dc0989dbfd3ef13620a3cbbaa600cdaa906"}
Mar 17 00:36:38 crc kubenswrapper[4755]: I0317 00:36:38.421689 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b"
Mar 17 00:36:38 crc kubenswrapper[4755]: I0317 00:36:38.424288 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" event={"ID":"c1fe9206-5f28-4707-b175-12ba0fadb400","Type":"ContainerStarted","Data":"3c7c88614d3e89220c7f518619c40757032d9ab4ca60a4d3f9bc97ecd4ce179b"}
Mar 17 00:36:38 crc kubenswrapper[4755]: I0317 00:36:38.424906 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48"
Mar 17 00:36:38 crc kubenswrapper[4755]: I0317 00:36:38.425030 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48"
Mar 17 00:36:38 crc kubenswrapper[4755]: I0317 00:36:38.430096 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b"
Mar 17 00:36:38 crc kubenswrapper[4755]: I0317 00:36:38.433329 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48"
Mar 17 00:36:38 crc kubenswrapper[4755]: I0317 00:36:38.436321 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48"
Mar 17 00:36:38 crc kubenswrapper[4755]: I0317 00:36:38.447654 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b" podStartSLOduration=2.682539459 podStartE2EDuration="11.447637852s" podCreationTimestamp="2026-03-17 00:36:27 +0000 UTC" firstStartedPulling="2026-03-17 00:36:29.231710955 +0000 UTC m=+863.991163238" lastFinishedPulling="2026-03-17 00:36:37.996809348 +0000 UTC m=+872.756261631" observedRunningTime="2026-03-17 00:36:38.443609159 +0000 UTC m=+873.203061502" watchObservedRunningTime="2026-03-17 00:36:38.447637852 +0000 UTC m=+873.207090135"
Mar 17 00:36:38 crc kubenswrapper[4755]: I0317 00:36:38.478132 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-jfm48" podStartSLOduration=2.775202504 podStartE2EDuration="11.478107072s" podCreationTimestamp="2026-03-17 00:36:27 +0000 UTC" firstStartedPulling="2026-03-17 00:36:29.290462274 +0000 UTC m=+864.049914557" lastFinishedPulling="2026-03-17 00:36:37.993366822 +0000 UTC m=+872.752819125" observedRunningTime="2026-03-17 00:36:38.467747312 +0000 UTC m=+873.227199615" watchObservedRunningTime="2026-03-17 00:36:38.478107072 +0000 UTC m=+873.237559345"
Mar 17 00:36:38 crc kubenswrapper[4755]: I0317 00:36:38.898632 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b"
Mar 17 00:36:38 crc kubenswrapper[4755]: I0317 00:36:38.914644 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-cdf4b6b4d-f2g7b"
Mar 17 00:36:47 crc kubenswrapper[4755]: I0317 00:36:47.892592 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xg2pg"]
Mar 17 00:36:47 crc kubenswrapper[4755]: E0317 00:36:47.895375 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb2c1ca-d726-45ce-bae8-97198fb3e43a" containerName="registry-server"
Mar 17 00:36:47 crc kubenswrapper[4755]: I0317 00:36:47.895513 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb2c1ca-d726-45ce-bae8-97198fb3e43a" containerName="registry-server"
Mar 17 00:36:47 crc kubenswrapper[4755]: E0317 00:36:47.895637 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb2c1ca-d726-45ce-bae8-97198fb3e43a" containerName="extract-content"
Mar 17 00:36:47 crc kubenswrapper[4755]: I0317 00:36:47.895655 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb2c1ca-d726-45ce-bae8-97198fb3e43a" containerName="extract-content"
Mar 17 00:36:47 crc kubenswrapper[4755]: E0317 00:36:47.895678 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb2c1ca-d726-45ce-bae8-97198fb3e43a" containerName="extract-utilities"
Mar 17 00:36:47 crc kubenswrapper[4755]: I0317 00:36:47.895690 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb2c1ca-d726-45ce-bae8-97198fb3e43a" containerName="extract-utilities"
Mar 17 00:36:47 crc kubenswrapper[4755]: I0317 00:36:47.895947 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eb2c1ca-d726-45ce-bae8-97198fb3e43a" containerName="registry-server"
Mar 17 00:36:47 crc kubenswrapper[4755]: I0317 00:36:47.897803 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xg2pg"
Mar 17 00:36:47 crc kubenswrapper[4755]: I0317 00:36:47.915046 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xg2pg"]
Mar 17 00:36:48 crc kubenswrapper[4755]: I0317 00:36:48.082025 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37f8378b-28ae-4421-8927-d1715aad050c-utilities\") pod \"community-operators-xg2pg\" (UID: \"37f8378b-28ae-4421-8927-d1715aad050c\") " pod="openshift-marketplace/community-operators-xg2pg"
Mar 17 00:36:48 crc kubenswrapper[4755]: I0317 00:36:48.082081 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f8378b-28ae-4421-8927-d1715aad050c-catalog-content\") pod \"community-operators-xg2pg\" (UID: \"37f8378b-28ae-4421-8927-d1715aad050c\") " pod="openshift-marketplace/community-operators-xg2pg"
Mar 17 00:36:48 crc kubenswrapper[4755]: I0317 00:36:48.082125 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hgqj\" (UniqueName: \"kubernetes.io/projected/37f8378b-28ae-4421-8927-d1715aad050c-kube-api-access-5hgqj\") pod \"community-operators-xg2pg\" (UID: \"37f8378b-28ae-4421-8927-d1715aad050c\") " pod="openshift-marketplace/community-operators-xg2pg"
Mar 17 00:36:48 crc kubenswrapper[4755]: I0317 00:36:48.183805 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f8378b-28ae-4421-8927-d1715aad050c-catalog-content\") pod \"community-operators-xg2pg\" (UID: \"37f8378b-28ae-4421-8927-d1715aad050c\") " pod="openshift-marketplace/community-operators-xg2pg"
Mar 17 00:36:48 crc kubenswrapper[4755]: I0317 00:36:48.183866 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hgqj\" (UniqueName: \"kubernetes.io/projected/37f8378b-28ae-4421-8927-d1715aad050c-kube-api-access-5hgqj\") pod \"community-operators-xg2pg\" (UID: \"37f8378b-28ae-4421-8927-d1715aad050c\") " pod="openshift-marketplace/community-operators-xg2pg"
Mar 17 00:36:48 crc kubenswrapper[4755]: I0317 00:36:48.183967 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37f8378b-28ae-4421-8927-d1715aad050c-utilities\") pod \"community-operators-xg2pg\" (UID: \"37f8378b-28ae-4421-8927-d1715aad050c\") " pod="openshift-marketplace/community-operators-xg2pg"
Mar 17 00:36:48 crc kubenswrapper[4755]: I0317 00:36:48.184452 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f8378b-28ae-4421-8927-d1715aad050c-catalog-content\") pod \"community-operators-xg2pg\" (UID: \"37f8378b-28ae-4421-8927-d1715aad050c\") " pod="openshift-marketplace/community-operators-xg2pg"
Mar 17 00:36:48 crc kubenswrapper[4755]: I0317 00:36:48.184489 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37f8378b-28ae-4421-8927-d1715aad050c-utilities\") pod \"community-operators-xg2pg\" (UID: \"37f8378b-28ae-4421-8927-d1715aad050c\") " pod="openshift-marketplace/community-operators-xg2pg"
Mar 17 00:36:48 crc kubenswrapper[4755]: I0317 00:36:48.211160 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hgqj\" (UniqueName: \"kubernetes.io/projected/37f8378b-28ae-4421-8927-d1715aad050c-kube-api-access-5hgqj\") pod \"community-operators-xg2pg\" (UID: \"37f8378b-28ae-4421-8927-d1715aad050c\") " pod="openshift-marketplace/community-operators-xg2pg"
Mar 17 00:36:48 crc kubenswrapper[4755]: I0317 00:36:48.221989 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xg2pg"
Mar 17 00:36:48 crc kubenswrapper[4755]: I0317 00:36:48.724547 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xg2pg"]
Mar 17 00:36:48 crc kubenswrapper[4755]: W0317 00:36:48.730194 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37f8378b_28ae_4421_8927_d1715aad050c.slice/crio-f0713254414951ed0817e321d5ff75769120bcf5afd32de7f246a44961854aa5 WatchSource:0}: Error finding container f0713254414951ed0817e321d5ff75769120bcf5afd32de7f246a44961854aa5: Status 404 returned error can't find the container with id f0713254414951ed0817e321d5ff75769120bcf5afd32de7f246a44961854aa5
Mar 17 00:36:49 crc kubenswrapper[4755]: I0317 00:36:49.555529 4755 generic.go:334] "Generic (PLEG): container finished" podID="37f8378b-28ae-4421-8927-d1715aad050c" containerID="2a6e78851f0d7176b34a8424a7256f706cd7d208f51e7882f156cc7516b18e4b" exitCode=0
Mar 17 00:36:49 crc kubenswrapper[4755]: I0317 00:36:49.555609 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg2pg" event={"ID":"37f8378b-28ae-4421-8927-d1715aad050c","Type":"ContainerDied","Data":"2a6e78851f0d7176b34a8424a7256f706cd7d208f51e7882f156cc7516b18e4b"}
Mar 17 00:36:49 crc kubenswrapper[4755]: I0317 00:36:49.557343 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg2pg" event={"ID":"37f8378b-28ae-4421-8927-d1715aad050c","Type":"ContainerStarted","Data":"f0713254414951ed0817e321d5ff75769120bcf5afd32de7f246a44961854aa5"}
Mar 17 00:36:50 crc kubenswrapper[4755]: I0317 00:36:50.565077 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg2pg" event={"ID":"37f8378b-28ae-4421-8927-d1715aad050c","Type":"ContainerStarted","Data":"6d42f7aca2f8cf6349866b82ea28f319260a2da0140331433dbe51694554bb72"}
Mar 17 00:36:50 crc kubenswrapper[4755]: I0317 00:36:50.667470 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-st2hj"]
Mar 17 00:36:50 crc kubenswrapper[4755]: I0317 00:36:50.668908 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-st2hj"
Mar 17 00:36:50 crc kubenswrapper[4755]: I0317 00:36:50.715870 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-st2hj"]
Mar 17 00:36:50 crc kubenswrapper[4755]: I0317 00:36:50.721395 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ee313bc-66c1-45b8-aac7-5d7269a90abf-catalog-content\") pod \"certified-operators-st2hj\" (UID: \"2ee313bc-66c1-45b8-aac7-5d7269a90abf\") " pod="openshift-marketplace/certified-operators-st2hj"
Mar 17 00:36:50 crc kubenswrapper[4755]: I0317 00:36:50.721499 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ee313bc-66c1-45b8-aac7-5d7269a90abf-utilities\") pod \"certified-operators-st2hj\" (UID: \"2ee313bc-66c1-45b8-aac7-5d7269a90abf\") " pod="openshift-marketplace/certified-operators-st2hj"
Mar 17 00:36:50 crc kubenswrapper[4755]: I0317 00:36:50.721518 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks9cb\" (UniqueName: \"kubernetes.io/projected/2ee313bc-66c1-45b8-aac7-5d7269a90abf-kube-api-access-ks9cb\") pod \"certified-operators-st2hj\" (UID: \"2ee313bc-66c1-45b8-aac7-5d7269a90abf\") " pod="openshift-marketplace/certified-operators-st2hj"
Mar 17 00:36:50 crc kubenswrapper[4755]: I0317 00:36:50.823184 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ee313bc-66c1-45b8-aac7-5d7269a90abf-catalog-content\") pod \"certified-operators-st2hj\" (UID: \"2ee313bc-66c1-45b8-aac7-5d7269a90abf\") " pod="openshift-marketplace/certified-operators-st2hj"
Mar 17 00:36:50 crc kubenswrapper[4755]: I0317 00:36:50.823553 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ee313bc-66c1-45b8-aac7-5d7269a90abf-utilities\") pod \"certified-operators-st2hj\" (UID: \"2ee313bc-66c1-45b8-aac7-5d7269a90abf\") " pod="openshift-marketplace/certified-operators-st2hj"
Mar 17 00:36:50 crc kubenswrapper[4755]: I0317 00:36:50.823637 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks9cb\" (UniqueName: \"kubernetes.io/projected/2ee313bc-66c1-45b8-aac7-5d7269a90abf-kube-api-access-ks9cb\") pod \"certified-operators-st2hj\" (UID: \"2ee313bc-66c1-45b8-aac7-5d7269a90abf\") " pod="openshift-marketplace/certified-operators-st2hj"
Mar 17 00:36:50 crc kubenswrapper[4755]: I0317 00:36:50.823777 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ee313bc-66c1-45b8-aac7-5d7269a90abf-catalog-content\") pod \"certified-operators-st2hj\" (UID: \"2ee313bc-66c1-45b8-aac7-5d7269a90abf\") " pod="openshift-marketplace/certified-operators-st2hj"
Mar 17 00:36:50 crc kubenswrapper[4755]: I0317 00:36:50.823944 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ee313bc-66c1-45b8-aac7-5d7269a90abf-utilities\") pod \"certified-operators-st2hj\" (UID: \"2ee313bc-66c1-45b8-aac7-5d7269a90abf\") " pod="openshift-marketplace/certified-operators-st2hj"
Mar 17 00:36:50 crc kubenswrapper[4755]: I0317 00:36:50.844010 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks9cb\" (UniqueName: \"kubernetes.io/projected/2ee313bc-66c1-45b8-aac7-5d7269a90abf-kube-api-access-ks9cb\") pod \"certified-operators-st2hj\" (UID: \"2ee313bc-66c1-45b8-aac7-5d7269a90abf\") " pod="openshift-marketplace/certified-operators-st2hj"
Mar 17 00:36:51 crc kubenswrapper[4755]: I0317 00:36:51.097088 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-st2hj"
Mar 17 00:36:51 crc kubenswrapper[4755]: I0317 00:36:51.541475 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-st2hj"]
Mar 17 00:36:51 crc kubenswrapper[4755]: W0317 00:36:51.544636 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ee313bc_66c1_45b8_aac7_5d7269a90abf.slice/crio-5357a7364b3891535be25fcff71b80cd982d41cda72546c2e87e36617c3e8dec WatchSource:0}: Error finding container 5357a7364b3891535be25fcff71b80cd982d41cda72546c2e87e36617c3e8dec: Status 404 returned error can't find the container with id 5357a7364b3891535be25fcff71b80cd982d41cda72546c2e87e36617c3e8dec
Mar 17 00:36:51 crc kubenswrapper[4755]: I0317 00:36:51.573519 4755 generic.go:334] "Generic (PLEG): container finished" podID="37f8378b-28ae-4421-8927-d1715aad050c" containerID="6d42f7aca2f8cf6349866b82ea28f319260a2da0140331433dbe51694554bb72" exitCode=0
Mar 17 00:36:51 crc kubenswrapper[4755]: I0317 00:36:51.573575 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg2pg" event={"ID":"37f8378b-28ae-4421-8927-d1715aad050c","Type":"ContainerDied","Data":"6d42f7aca2f8cf6349866b82ea28f319260a2da0140331433dbe51694554bb72"}
Mar 17 00:36:51 crc kubenswrapper[4755]: I0317 00:36:51.574987 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-st2hj" event={"ID":"2ee313bc-66c1-45b8-aac7-5d7269a90abf","Type":"ContainerStarted","Data":"5357a7364b3891535be25fcff71b80cd982d41cda72546c2e87e36617c3e8dec"}
Mar 17 00:36:52 crc kubenswrapper[4755]: I0317 00:36:52.583314 4755 generic.go:334] "Generic (PLEG): container finished" podID="2ee313bc-66c1-45b8-aac7-5d7269a90abf" containerID="334e53599adc26c291658f1eae42fcac5f02d5328575e78e906e187bceb6e029" exitCode=0
Mar 17 00:36:52 crc kubenswrapper[4755]: I0317 00:36:52.583379 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-st2hj" event={"ID":"2ee313bc-66c1-45b8-aac7-5d7269a90abf","Type":"ContainerDied","Data":"334e53599adc26c291658f1eae42fcac5f02d5328575e78e906e187bceb6e029"}
Mar 17 00:36:52 crc kubenswrapper[4755]: I0317 00:36:52.586280 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg2pg" event={"ID":"37f8378b-28ae-4421-8927-d1715aad050c","Type":"ContainerStarted","Data":"2132b40c6906dc2ea6463018a43c026526169fa087724f09d67d139a177f21a2"}
Mar 17 00:36:52 crc kubenswrapper[4755]: I0317 00:36:52.624674 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xg2pg" podStartSLOduration=3.156848981 podStartE2EDuration="5.624660282s" podCreationTimestamp="2026-03-17 00:36:47 +0000 UTC" firstStartedPulling="2026-03-17 00:36:49.558783929 +0000 UTC m=+884.318236212" lastFinishedPulling="2026-03-17 00:36:52.02659523 +0000 UTC m=+886.786047513" observedRunningTime="2026-03-17 00:36:52.622764318 +0000 UTC m=+887.382216621" watchObservedRunningTime="2026-03-17 00:36:52.624660282 +0000 UTC m=+887.384112565"
Mar 17 00:36:53 crc kubenswrapper[4755]: I0317 00:36:53.595101 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-st2hj" event={"ID":"2ee313bc-66c1-45b8-aac7-5d7269a90abf","Type":"ContainerStarted","Data":"7fb58fef1569c42cd1c44fcf8b0b85feea82c3bdaa29a6a64d4d722bdb7a0277"}
Mar 17 00:36:54 crc kubenswrapper[4755]: I0317 00:36:54.603930 4755 generic.go:334] "Generic (PLEG): container finished" podID="2ee313bc-66c1-45b8-aac7-5d7269a90abf" containerID="7fb58fef1569c42cd1c44fcf8b0b85feea82c3bdaa29a6a64d4d722bdb7a0277" exitCode=0
Mar 17 00:36:54 crc kubenswrapper[4755]: I0317 00:36:54.603971 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-st2hj" event={"ID":"2ee313bc-66c1-45b8-aac7-5d7269a90abf","Type":"ContainerDied","Data":"7fb58fef1569c42cd1c44fcf8b0b85feea82c3bdaa29a6a64d4d722bdb7a0277"}
Mar 17 00:36:56 crc kubenswrapper[4755]: I0317 00:36:56.621224 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-st2hj" event={"ID":"2ee313bc-66c1-45b8-aac7-5d7269a90abf","Type":"ContainerStarted","Data":"8d31a6e1bf3bad99bbee4deda0a447a94197c2a5aa66c4792e37398ea5b1a46b"}
Mar 17 00:36:56 crc kubenswrapper[4755]: I0317 00:36:56.641084 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-st2hj" podStartSLOduration=3.318701179 podStartE2EDuration="6.641068645s" podCreationTimestamp="2026-03-17 00:36:50 +0000 UTC" firstStartedPulling="2026-03-17 00:36:52.585153549 +0000 UTC m=+887.344605872" lastFinishedPulling="2026-03-17 00:36:55.907521035 +0000 UTC m=+890.666973338" observedRunningTime="2026-03-17 00:36:56.63587649 +0000 UTC m=+891.395328773" watchObservedRunningTime="2026-03-17 00:36:56.641068645 +0000 UTC m=+891.400520918"
Mar 17 00:36:58 crc kubenswrapper[4755]: I0317 00:36:58.049790 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-95f86"
Mar 17 00:36:58 crc kubenswrapper[4755]: I0317 00:36:58.117295 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-9c6b6d984-crzs4"
Mar 17 00:36:58 crc kubenswrapper[4755]: I0317 00:36:58.222471 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xg2pg"
Mar 17 00:36:58 crc kubenswrapper[4755]: I0317 00:36:58.222523 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xg2pg"
Mar 17 00:36:58 crc kubenswrapper[4755]: I0317 00:36:58.274394 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xg2pg"
Mar 17 00:36:58 crc kubenswrapper[4755]: I0317 00:36:58.285821 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-bh66s"
Mar 17 00:36:58 crc kubenswrapper[4755]: I0317 00:36:58.665644 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 00:36:58 crc kubenswrapper[4755]: I0317 00:36:58.666020 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 00:36:58 crc kubenswrapper[4755]: I0317 00:36:58.666079 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness"
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:36:58 crc kubenswrapper[4755]: I0317 00:36:58.666972 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0cab6d0e05377e82717b632b632ef0344a29e598068da9d56d70ae0349c0c4d1"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 00:36:58 crc kubenswrapper[4755]: I0317 00:36:58.667071 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://0cab6d0e05377e82717b632b632ef0344a29e598068da9d56d70ae0349c0c4d1" gracePeriod=600 Mar 17 00:36:58 crc kubenswrapper[4755]: I0317 00:36:58.702368 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xg2pg" Mar 17 00:36:59 crc kubenswrapper[4755]: I0317 00:36:59.091855 4755 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Mar 17 00:36:59 crc kubenswrapper[4755]: I0317 00:36:59.091943 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="c516523b-4c3b-4083-a8f5-18c9061c7032" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 17 00:36:59 crc kubenswrapper[4755]: I0317 00:36:59.141195 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Mar 17 00:36:59 crc kubenswrapper[4755]: I0317 00:36:59.333692 4755 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Mar 17 00:36:59 crc kubenswrapper[4755]: I0317 00:36:59.447942 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xg2pg"] Mar 17 00:36:59 crc kubenswrapper[4755]: I0317 00:36:59.649296 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="0cab6d0e05377e82717b632b632ef0344a29e598068da9d56d70ae0349c0c4d1" exitCode=0 Mar 17 00:36:59 crc kubenswrapper[4755]: I0317 00:36:59.650354 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"0cab6d0e05377e82717b632b632ef0344a29e598068da9d56d70ae0349c0c4d1"} Mar 17 00:36:59 crc kubenswrapper[4755]: I0317 00:36:59.650481 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"68018dade804aadf96db21752f85fdf1b74e75774cca2b6cfb117db003750ae0"} Mar 17 00:36:59 crc kubenswrapper[4755]: I0317 00:36:59.650512 4755 scope.go:117] "RemoveContainer" containerID="df05c9b2eac57c85d2adcda116412d2685fa9b7be3d9227fbab0e788267e2675" Mar 17 00:37:00 crc kubenswrapper[4755]: I0317 00:37:00.662180 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xg2pg" podUID="37f8378b-28ae-4421-8927-d1715aad050c" containerName="registry-server" containerID="cri-o://2132b40c6906dc2ea6463018a43c026526169fa087724f09d67d139a177f21a2" gracePeriod=2 Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.097831 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-st2hj" Mar 17 00:37:01 crc 
kubenswrapper[4755]: I0317 00:37:01.098036 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-st2hj" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.145817 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-st2hj" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.175156 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xg2pg" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.264413 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f8378b-28ae-4421-8927-d1715aad050c-catalog-content\") pod \"37f8378b-28ae-4421-8927-d1715aad050c\" (UID: \"37f8378b-28ae-4421-8927-d1715aad050c\") " Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.264568 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37f8378b-28ae-4421-8927-d1715aad050c-utilities\") pod \"37f8378b-28ae-4421-8927-d1715aad050c\" (UID: \"37f8378b-28ae-4421-8927-d1715aad050c\") " Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.264750 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hgqj\" (UniqueName: \"kubernetes.io/projected/37f8378b-28ae-4421-8927-d1715aad050c-kube-api-access-5hgqj\") pod \"37f8378b-28ae-4421-8927-d1715aad050c\" (UID: \"37f8378b-28ae-4421-8927-d1715aad050c\") " Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.266516 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37f8378b-28ae-4421-8927-d1715aad050c-utilities" (OuterVolumeSpecName: "utilities") pod "37f8378b-28ae-4421-8927-d1715aad050c" (UID: "37f8378b-28ae-4421-8927-d1715aad050c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.273766 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37f8378b-28ae-4421-8927-d1715aad050c-kube-api-access-5hgqj" (OuterVolumeSpecName: "kube-api-access-5hgqj") pod "37f8378b-28ae-4421-8927-d1715aad050c" (UID: "37f8378b-28ae-4421-8927-d1715aad050c"). InnerVolumeSpecName "kube-api-access-5hgqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.366329 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37f8378b-28ae-4421-8927-d1715aad050c-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.366374 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hgqj\" (UniqueName: \"kubernetes.io/projected/37f8378b-28ae-4421-8927-d1715aad050c-kube-api-access-5hgqj\") on node \"crc\" DevicePath \"\"" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.674627 4755 generic.go:334] "Generic (PLEG): container finished" podID="37f8378b-28ae-4421-8927-d1715aad050c" containerID="2132b40c6906dc2ea6463018a43c026526169fa087724f09d67d139a177f21a2" exitCode=0 Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.674678 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg2pg" event={"ID":"37f8378b-28ae-4421-8927-d1715aad050c","Type":"ContainerDied","Data":"2132b40c6906dc2ea6463018a43c026526169fa087724f09d67d139a177f21a2"} Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.674724 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xg2pg" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.674760 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg2pg" event={"ID":"37f8378b-28ae-4421-8927-d1715aad050c","Type":"ContainerDied","Data":"f0713254414951ed0817e321d5ff75769120bcf5afd32de7f246a44961854aa5"} Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.674785 4755 scope.go:117] "RemoveContainer" containerID="2132b40c6906dc2ea6463018a43c026526169fa087724f09d67d139a177f21a2" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.681253 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37f8378b-28ae-4421-8927-d1715aad050c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37f8378b-28ae-4421-8927-d1715aad050c" (UID: "37f8378b-28ae-4421-8927-d1715aad050c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.701317 4755 scope.go:117] "RemoveContainer" containerID="6d42f7aca2f8cf6349866b82ea28f319260a2da0140331433dbe51694554bb72" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.722589 4755 scope.go:117] "RemoveContainer" containerID="2a6e78851f0d7176b34a8424a7256f706cd7d208f51e7882f156cc7516b18e4b" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.744136 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-st2hj" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.761008 4755 scope.go:117] "RemoveContainer" containerID="2132b40c6906dc2ea6463018a43c026526169fa087724f09d67d139a177f21a2" Mar 17 00:37:01 crc kubenswrapper[4755]: E0317 00:37:01.762829 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2132b40c6906dc2ea6463018a43c026526169fa087724f09d67d139a177f21a2\": container with ID starting with 2132b40c6906dc2ea6463018a43c026526169fa087724f09d67d139a177f21a2 not found: ID does not exist" containerID="2132b40c6906dc2ea6463018a43c026526169fa087724f09d67d139a177f21a2" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.762866 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2132b40c6906dc2ea6463018a43c026526169fa087724f09d67d139a177f21a2"} err="failed to get container status \"2132b40c6906dc2ea6463018a43c026526169fa087724f09d67d139a177f21a2\": rpc error: code = NotFound desc = could not find container \"2132b40c6906dc2ea6463018a43c026526169fa087724f09d67d139a177f21a2\": container with ID starting with 2132b40c6906dc2ea6463018a43c026526169fa087724f09d67d139a177f21a2 not found: ID does not exist" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.762890 4755 scope.go:117] "RemoveContainer" containerID="6d42f7aca2f8cf6349866b82ea28f319260a2da0140331433dbe51694554bb72" Mar 17 00:37:01 crc kubenswrapper[4755]: E0317 00:37:01.763278 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d42f7aca2f8cf6349866b82ea28f319260a2da0140331433dbe51694554bb72\": container with ID starting with 6d42f7aca2f8cf6349866b82ea28f319260a2da0140331433dbe51694554bb72 not found: ID does not exist" containerID="6d42f7aca2f8cf6349866b82ea28f319260a2da0140331433dbe51694554bb72" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.763325 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d42f7aca2f8cf6349866b82ea28f319260a2da0140331433dbe51694554bb72"} err="failed to get container status \"6d42f7aca2f8cf6349866b82ea28f319260a2da0140331433dbe51694554bb72\": rpc error: code = NotFound desc = could not find container \"6d42f7aca2f8cf6349866b82ea28f319260a2da0140331433dbe51694554bb72\": container with ID 
starting with 6d42f7aca2f8cf6349866b82ea28f319260a2da0140331433dbe51694554bb72 not found: ID does not exist" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.763359 4755 scope.go:117] "RemoveContainer" containerID="2a6e78851f0d7176b34a8424a7256f706cd7d208f51e7882f156cc7516b18e4b" Mar 17 00:37:01 crc kubenswrapper[4755]: E0317 00:37:01.763661 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a6e78851f0d7176b34a8424a7256f706cd7d208f51e7882f156cc7516b18e4b\": container with ID starting with 2a6e78851f0d7176b34a8424a7256f706cd7d208f51e7882f156cc7516b18e4b not found: ID does not exist" containerID="2a6e78851f0d7176b34a8424a7256f706cd7d208f51e7882f156cc7516b18e4b" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.763694 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a6e78851f0d7176b34a8424a7256f706cd7d208f51e7882f156cc7516b18e4b"} err="failed to get container status \"2a6e78851f0d7176b34a8424a7256f706cd7d208f51e7882f156cc7516b18e4b\": rpc error: code = NotFound desc = could not find container \"2a6e78851f0d7176b34a8424a7256f706cd7d208f51e7882f156cc7516b18e4b\": container with ID starting with 2a6e78851f0d7176b34a8424a7256f706cd7d208f51e7882f156cc7516b18e4b not found: ID does not exist" Mar 17 00:37:01 crc kubenswrapper[4755]: I0317 00:37:01.773679 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f8378b-28ae-4421-8927-d1715aad050c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:37:02 crc kubenswrapper[4755]: I0317 00:37:02.003201 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xg2pg"] Mar 17 00:37:02 crc kubenswrapper[4755]: I0317 00:37:02.009153 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xg2pg"] Mar 17 00:37:02 crc 
kubenswrapper[4755]: I0317 00:37:02.262036 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37f8378b-28ae-4421-8927-d1715aad050c" path="/var/lib/kubelet/pods/37f8378b-28ae-4421-8927-d1715aad050c/volumes" Mar 17 00:37:02 crc kubenswrapper[4755]: I0317 00:37:02.452410 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-st2hj"] Mar 17 00:37:03 crc kubenswrapper[4755]: I0317 00:37:03.692090 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-st2hj" podUID="2ee313bc-66c1-45b8-aac7-5d7269a90abf" containerName="registry-server" containerID="cri-o://8d31a6e1bf3bad99bbee4deda0a447a94197c2a5aa66c4792e37398ea5b1a46b" gracePeriod=2 Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.126255 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-st2hj" Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.309587 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks9cb\" (UniqueName: \"kubernetes.io/projected/2ee313bc-66c1-45b8-aac7-5d7269a90abf-kube-api-access-ks9cb\") pod \"2ee313bc-66c1-45b8-aac7-5d7269a90abf\" (UID: \"2ee313bc-66c1-45b8-aac7-5d7269a90abf\") " Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.309652 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ee313bc-66c1-45b8-aac7-5d7269a90abf-utilities\") pod \"2ee313bc-66c1-45b8-aac7-5d7269a90abf\" (UID: \"2ee313bc-66c1-45b8-aac7-5d7269a90abf\") " Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.309711 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ee313bc-66c1-45b8-aac7-5d7269a90abf-catalog-content\") pod \"2ee313bc-66c1-45b8-aac7-5d7269a90abf\" (UID: 
\"2ee313bc-66c1-45b8-aac7-5d7269a90abf\") " Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.310860 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ee313bc-66c1-45b8-aac7-5d7269a90abf-utilities" (OuterVolumeSpecName: "utilities") pod "2ee313bc-66c1-45b8-aac7-5d7269a90abf" (UID: "2ee313bc-66c1-45b8-aac7-5d7269a90abf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.316750 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee313bc-66c1-45b8-aac7-5d7269a90abf-kube-api-access-ks9cb" (OuterVolumeSpecName: "kube-api-access-ks9cb") pod "2ee313bc-66c1-45b8-aac7-5d7269a90abf" (UID: "2ee313bc-66c1-45b8-aac7-5d7269a90abf"). InnerVolumeSpecName "kube-api-access-ks9cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.411322 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks9cb\" (UniqueName: \"kubernetes.io/projected/2ee313bc-66c1-45b8-aac7-5d7269a90abf-kube-api-access-ks9cb\") on node \"crc\" DevicePath \"\"" Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.411365 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ee313bc-66c1-45b8-aac7-5d7269a90abf-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.703900 4755 generic.go:334] "Generic (PLEG): container finished" podID="2ee313bc-66c1-45b8-aac7-5d7269a90abf" containerID="8d31a6e1bf3bad99bbee4deda0a447a94197c2a5aa66c4792e37398ea5b1a46b" exitCode=0 Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.703959 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-st2hj" 
event={"ID":"2ee313bc-66c1-45b8-aac7-5d7269a90abf","Type":"ContainerDied","Data":"8d31a6e1bf3bad99bbee4deda0a447a94197c2a5aa66c4792e37398ea5b1a46b"} Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.703995 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-st2hj" event={"ID":"2ee313bc-66c1-45b8-aac7-5d7269a90abf","Type":"ContainerDied","Data":"5357a7364b3891535be25fcff71b80cd982d41cda72546c2e87e36617c3e8dec"} Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.703997 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-st2hj" Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.704050 4755 scope.go:117] "RemoveContainer" containerID="8d31a6e1bf3bad99bbee4deda0a447a94197c2a5aa66c4792e37398ea5b1a46b" Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.730887 4755 scope.go:117] "RemoveContainer" containerID="7fb58fef1569c42cd1c44fcf8b0b85feea82c3bdaa29a6a64d4d722bdb7a0277" Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.756087 4755 scope.go:117] "RemoveContainer" containerID="334e53599adc26c291658f1eae42fcac5f02d5328575e78e906e187bceb6e029" Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.785228 4755 scope.go:117] "RemoveContainer" containerID="8d31a6e1bf3bad99bbee4deda0a447a94197c2a5aa66c4792e37398ea5b1a46b" Mar 17 00:37:04 crc kubenswrapper[4755]: E0317 00:37:04.798724 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d31a6e1bf3bad99bbee4deda0a447a94197c2a5aa66c4792e37398ea5b1a46b\": container with ID starting with 8d31a6e1bf3bad99bbee4deda0a447a94197c2a5aa66c4792e37398ea5b1a46b not found: ID does not exist" containerID="8d31a6e1bf3bad99bbee4deda0a447a94197c2a5aa66c4792e37398ea5b1a46b" Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.798768 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8d31a6e1bf3bad99bbee4deda0a447a94197c2a5aa66c4792e37398ea5b1a46b"} err="failed to get container status \"8d31a6e1bf3bad99bbee4deda0a447a94197c2a5aa66c4792e37398ea5b1a46b\": rpc error: code = NotFound desc = could not find container \"8d31a6e1bf3bad99bbee4deda0a447a94197c2a5aa66c4792e37398ea5b1a46b\": container with ID starting with 8d31a6e1bf3bad99bbee4deda0a447a94197c2a5aa66c4792e37398ea5b1a46b not found: ID does not exist" Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.798796 4755 scope.go:117] "RemoveContainer" containerID="7fb58fef1569c42cd1c44fcf8b0b85feea82c3bdaa29a6a64d4d722bdb7a0277" Mar 17 00:37:04 crc kubenswrapper[4755]: E0317 00:37:04.799362 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fb58fef1569c42cd1c44fcf8b0b85feea82c3bdaa29a6a64d4d722bdb7a0277\": container with ID starting with 7fb58fef1569c42cd1c44fcf8b0b85feea82c3bdaa29a6a64d4d722bdb7a0277 not found: ID does not exist" containerID="7fb58fef1569c42cd1c44fcf8b0b85feea82c3bdaa29a6a64d4d722bdb7a0277" Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.799393 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fb58fef1569c42cd1c44fcf8b0b85feea82c3bdaa29a6a64d4d722bdb7a0277"} err="failed to get container status \"7fb58fef1569c42cd1c44fcf8b0b85feea82c3bdaa29a6a64d4d722bdb7a0277\": rpc error: code = NotFound desc = could not find container \"7fb58fef1569c42cd1c44fcf8b0b85feea82c3bdaa29a6a64d4d722bdb7a0277\": container with ID starting with 7fb58fef1569c42cd1c44fcf8b0b85feea82c3bdaa29a6a64d4d722bdb7a0277 not found: ID does not exist" Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.799408 4755 scope.go:117] "RemoveContainer" containerID="334e53599adc26c291658f1eae42fcac5f02d5328575e78e906e187bceb6e029" Mar 17 00:37:04 crc kubenswrapper[4755]: E0317 00:37:04.799823 4755 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"334e53599adc26c291658f1eae42fcac5f02d5328575e78e906e187bceb6e029\": container with ID starting with 334e53599adc26c291658f1eae42fcac5f02d5328575e78e906e187bceb6e029 not found: ID does not exist" containerID="334e53599adc26c291658f1eae42fcac5f02d5328575e78e906e187bceb6e029" Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.799889 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"334e53599adc26c291658f1eae42fcac5f02d5328575e78e906e187bceb6e029"} err="failed to get container status \"334e53599adc26c291658f1eae42fcac5f02d5328575e78e906e187bceb6e029\": rpc error: code = NotFound desc = could not find container \"334e53599adc26c291658f1eae42fcac5f02d5328575e78e906e187bceb6e029\": container with ID starting with 334e53599adc26c291658f1eae42fcac5f02d5328575e78e906e187bceb6e029 not found: ID does not exist" Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.891609 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ee313bc-66c1-45b8-aac7-5d7269a90abf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ee313bc-66c1-45b8-aac7-5d7269a90abf" (UID: "2ee313bc-66c1-45b8-aac7-5d7269a90abf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:37:04 crc kubenswrapper[4755]: I0317 00:37:04.920810 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ee313bc-66c1-45b8-aac7-5d7269a90abf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:37:05 crc kubenswrapper[4755]: I0317 00:37:05.063766 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-st2hj"] Mar 17 00:37:05 crc kubenswrapper[4755]: I0317 00:37:05.069880 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-st2hj"] Mar 17 00:37:06 crc kubenswrapper[4755]: I0317 00:37:06.259418 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ee313bc-66c1-45b8-aac7-5d7269a90abf" path="/var/lib/kubelet/pods/2ee313bc-66c1-45b8-aac7-5d7269a90abf/volumes" Mar 17 00:37:09 crc kubenswrapper[4755]: I0317 00:37:09.084993 4755 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Mar 17 00:37:09 crc kubenswrapper[4755]: I0317 00:37:09.085254 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="c516523b-4c3b-4083-a8f5-18c9061c7032" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 17 00:37:19 crc kubenswrapper[4755]: I0317 00:37:19.086590 4755 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Mar 17 00:37:19 crc kubenswrapper[4755]: I0317 00:37:19.088698 4755 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="c516523b-4c3b-4083-a8f5-18c9061c7032" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 17 00:37:29 crc kubenswrapper[4755]: I0317 00:37:29.086002 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.910751 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-mqmwg"] Mar 17 00:37:47 crc kubenswrapper[4755]: E0317 00:37:47.912911 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee313bc-66c1-45b8-aac7-5d7269a90abf" containerName="extract-content" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.912951 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee313bc-66c1-45b8-aac7-5d7269a90abf" containerName="extract-content" Mar 17 00:37:47 crc kubenswrapper[4755]: E0317 00:37:47.912975 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f8378b-28ae-4421-8927-d1715aad050c" containerName="extract-content" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.912988 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f8378b-28ae-4421-8927-d1715aad050c" containerName="extract-content" Mar 17 00:37:47 crc kubenswrapper[4755]: E0317 00:37:47.913006 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f8378b-28ae-4421-8927-d1715aad050c" containerName="registry-server" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.913018 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f8378b-28ae-4421-8927-d1715aad050c" containerName="registry-server" Mar 17 00:37:47 crc kubenswrapper[4755]: E0317 00:37:47.913037 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee313bc-66c1-45b8-aac7-5d7269a90abf" containerName="registry-server" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 
00:37:47.913049 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee313bc-66c1-45b8-aac7-5d7269a90abf" containerName="registry-server" Mar 17 00:37:47 crc kubenswrapper[4755]: E0317 00:37:47.913069 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f8378b-28ae-4421-8927-d1715aad050c" containerName="extract-utilities" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.913083 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f8378b-28ae-4421-8927-d1715aad050c" containerName="extract-utilities" Mar 17 00:37:47 crc kubenswrapper[4755]: E0317 00:37:47.913107 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee313bc-66c1-45b8-aac7-5d7269a90abf" containerName="extract-utilities" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.913119 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee313bc-66c1-45b8-aac7-5d7269a90abf" containerName="extract-utilities" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.913365 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee313bc-66c1-45b8-aac7-5d7269a90abf" containerName="registry-server" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.913386 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="37f8378b-28ae-4421-8927-d1715aad050c" containerName="registry-server" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.914211 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-mqmwg" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.917111 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-94bjc" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.917914 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.918290 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.918846 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.922029 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.929496 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/0f3b1d86-aba4-468d-b375-53fdc638325c-sa-token\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.929664 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-trusted-ca\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.929772 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-collector-token\") pod \"collector-mqmwg\" (UID: 
\"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.929887 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-config-openshift-service-cacrt\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.930028 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f3b1d86-aba4-468d-b375-53fdc638325c-tmp\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.930155 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/0f3b1d86-aba4-468d-b375-53fdc638325c-datadir\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.930606 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-metrics\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.930789 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-config\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" 
Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.930846 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-entrypoint\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.931004 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j742b\" (UniqueName: \"kubernetes.io/projected/0f3b1d86-aba4-468d-b375-53fdc638325c-kube-api-access-j742b\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.931081 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-collector-syslog-receiver\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.931167 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 17 00:37:47 crc kubenswrapper[4755]: I0317 00:37:47.947896 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-mqmwg"] Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.032341 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j742b\" (UniqueName: \"kubernetes.io/projected/0f3b1d86-aba4-468d-b375-53fdc638325c-kube-api-access-j742b\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.032392 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-collector-syslog-receiver\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.032427 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/0f3b1d86-aba4-468d-b375-53fdc638325c-sa-token\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.032474 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-trusted-ca\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.032501 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-collector-token\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.032528 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-config-openshift-service-cacrt\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.032572 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/0f3b1d86-aba4-468d-b375-53fdc638325c-tmp\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: E0317 00:37:48.032585 4755 secret.go:188] Couldn't get secret openshift-logging/collector-syslog-receiver: secret "collector-syslog-receiver" not found Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.032609 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/0f3b1d86-aba4-468d-b375-53fdc638325c-datadir\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.032640 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-metrics\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: E0317 00:37:48.032664 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-collector-syslog-receiver podName:0f3b1d86-aba4-468d-b375-53fdc638325c nodeName:}" failed. No retries permitted until 2026-03-17 00:37:48.532640527 +0000 UTC m=+943.292092820 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "collector-syslog-receiver" (UniqueName: "kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-collector-syslog-receiver") pod "collector-mqmwg" (UID: "0f3b1d86-aba4-468d-b375-53fdc638325c") : secret "collector-syslog-receiver" not found Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.032682 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-config\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.032706 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-entrypoint\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.032930 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/0f3b1d86-aba4-468d-b375-53fdc638325c-datadir\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: E0317 00:37:48.033053 4755 secret.go:188] Couldn't get secret openshift-logging/collector-metrics: secret "collector-metrics" not found Mar 17 00:37:48 crc kubenswrapper[4755]: E0317 00:37:48.033176 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-metrics podName:0f3b1d86-aba4-468d-b375-53fdc638325c nodeName:}" failed. No retries permitted until 2026-03-17 00:37:48.533148751 +0000 UTC m=+943.292601074 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-metrics") pod "collector-mqmwg" (UID: "0f3b1d86-aba4-468d-b375-53fdc638325c") : secret "collector-metrics" not found Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.033822 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-entrypoint\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.033895 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-trusted-ca\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.034275 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-config-openshift-service-cacrt\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.034828 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-config\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.041307 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-collector-token\") pod \"collector-mqmwg\" (UID: 
\"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.043632 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f3b1d86-aba4-468d-b375-53fdc638325c-tmp\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.054886 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/0f3b1d86-aba4-468d-b375-53fdc638325c-sa-token\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.059766 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j742b\" (UniqueName: \"kubernetes.io/projected/0f3b1d86-aba4-468d-b375-53fdc638325c-kube-api-access-j742b\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.091357 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-mqmwg"] Mar 17 00:37:48 crc kubenswrapper[4755]: E0317 00:37:48.092258 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver metrics], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-mqmwg" podUID="0f3b1d86-aba4-468d-b375-53fdc638325c" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.352419 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.363802 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.437600 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-collector-token\") pod \"0f3b1d86-aba4-468d-b375-53fdc638325c\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.437680 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-config-openshift-service-cacrt\") pod \"0f3b1d86-aba4-468d-b375-53fdc638325c\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.437741 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-config\") pod \"0f3b1d86-aba4-468d-b375-53fdc638325c\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.437801 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-entrypoint\") pod \"0f3b1d86-aba4-468d-b375-53fdc638325c\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.438626 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "0f3b1d86-aba4-468d-b375-53fdc638325c" (UID: "0f3b1d86-aba4-468d-b375-53fdc638325c"). InnerVolumeSpecName "entrypoint". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.438782 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-config" (OuterVolumeSpecName: "config") pod "0f3b1d86-aba4-468d-b375-53fdc638325c" (UID: "0f3b1d86-aba4-468d-b375-53fdc638325c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.438989 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "0f3b1d86-aba4-468d-b375-53fdc638325c" (UID: "0f3b1d86-aba4-468d-b375-53fdc638325c"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.442973 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-collector-token" (OuterVolumeSpecName: "collector-token") pod "0f3b1d86-aba4-468d-b375-53fdc638325c" (UID: "0f3b1d86-aba4-468d-b375-53fdc638325c"). InnerVolumeSpecName "collector-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.539677 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/0f3b1d86-aba4-468d-b375-53fdc638325c-datadir\") pod \"0f3b1d86-aba4-468d-b375-53fdc638325c\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.539767 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/0f3b1d86-aba4-468d-b375-53fdc638325c-sa-token\") pod \"0f3b1d86-aba4-468d-b375-53fdc638325c\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.539814 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f3b1d86-aba4-468d-b375-53fdc638325c-tmp\") pod \"0f3b1d86-aba4-468d-b375-53fdc638325c\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.539814 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f3b1d86-aba4-468d-b375-53fdc638325c-datadir" (OuterVolumeSpecName: "datadir") pod "0f3b1d86-aba4-468d-b375-53fdc638325c" (UID: "0f3b1d86-aba4-468d-b375-53fdc638325c"). InnerVolumeSpecName "datadir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.539872 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j742b\" (UniqueName: \"kubernetes.io/projected/0f3b1d86-aba4-468d-b375-53fdc638325c-kube-api-access-j742b\") pod \"0f3b1d86-aba4-468d-b375-53fdc638325c\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.539918 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-trusted-ca\") pod \"0f3b1d86-aba4-468d-b375-53fdc638325c\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.540216 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-metrics\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.540369 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-collector-syslog-receiver\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.540553 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0f3b1d86-aba4-468d-b375-53fdc638325c" (UID: "0f3b1d86-aba4-468d-b375-53fdc638325c"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.540659 4755 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/0f3b1d86-aba4-468d-b375-53fdc638325c-datadir\") on node \"crc\" DevicePath \"\"" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.540694 4755 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-collector-token\") on node \"crc\" DevicePath \"\"" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.540725 4755 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.540748 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.540804 4755 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-entrypoint\") on node \"crc\" DevicePath \"\"" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.544111 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-collector-syslog-receiver\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.544758 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3b1d86-aba4-468d-b375-53fdc638325c-kube-api-access-j742b" 
(OuterVolumeSpecName: "kube-api-access-j742b") pod "0f3b1d86-aba4-468d-b375-53fdc638325c" (UID: "0f3b1d86-aba4-468d-b375-53fdc638325c"). InnerVolumeSpecName "kube-api-access-j742b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.545969 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3b1d86-aba4-468d-b375-53fdc638325c-sa-token" (OuterVolumeSpecName: "sa-token") pod "0f3b1d86-aba4-468d-b375-53fdc638325c" (UID: "0f3b1d86-aba4-468d-b375-53fdc638325c"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.546206 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-metrics\") pod \"collector-mqmwg\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " pod="openshift-logging/collector-mqmwg" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.551391 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f3b1d86-aba4-468d-b375-53fdc638325c-tmp" (OuterVolumeSpecName: "tmp") pod "0f3b1d86-aba4-468d-b375-53fdc638325c" (UID: "0f3b1d86-aba4-468d-b375-53fdc638325c"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.642154 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-collector-syslog-receiver\") pod \"0f3b1d86-aba4-468d-b375-53fdc638325c\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.642298 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-metrics\") pod \"0f3b1d86-aba4-468d-b375-53fdc638325c\" (UID: \"0f3b1d86-aba4-468d-b375-53fdc638325c\") " Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.642739 4755 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/0f3b1d86-aba4-468d-b375-53fdc638325c-sa-token\") on node \"crc\" DevicePath \"\"" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.642774 4755 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f3b1d86-aba4-468d-b375-53fdc638325c-tmp\") on node \"crc\" DevicePath \"\"" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.642794 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j742b\" (UniqueName: \"kubernetes.io/projected/0f3b1d86-aba4-468d-b375-53fdc638325c-kube-api-access-j742b\") on node \"crc\" DevicePath \"\"" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.642814 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f3b1d86-aba4-468d-b375-53fdc638325c-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.646890 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "0f3b1d86-aba4-468d-b375-53fdc638325c" (UID: "0f3b1d86-aba4-468d-b375-53fdc638325c"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.647044 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-metrics" (OuterVolumeSpecName: "metrics") pod "0f3b1d86-aba4-468d-b375-53fdc638325c" (UID: "0f3b1d86-aba4-468d-b375-53fdc638325c"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.744122 4755 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-metrics\") on node \"crc\" DevicePath \"\"" Mar 17 00:37:48 crc kubenswrapper[4755]: I0317 00:37:48.744176 4755 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/0f3b1d86-aba4-468d-b375-53fdc638325c-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.361218 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-mqmwg" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.442922 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-mqmwg"] Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.449075 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-mqmwg"] Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.455128 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-tqf58"] Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.456290 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.465724 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-94bjc" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.466783 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.467853 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.468130 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.468755 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.472931 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.476656 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-tqf58"] Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.556805 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/50d6f059-2e1c-4ac4-9952-dcbab62b23db-sa-token\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.556868 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d7mc\" (UniqueName: \"kubernetes.io/projected/50d6f059-2e1c-4ac4-9952-dcbab62b23db-kube-api-access-9d7mc\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.556910 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50d6f059-2e1c-4ac4-9952-dcbab62b23db-config\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.556934 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/50d6f059-2e1c-4ac4-9952-dcbab62b23db-collector-syslog-receiver\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.557030 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/50d6f059-2e1c-4ac4-9952-dcbab62b23db-datadir\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.557055 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/50d6f059-2e1c-4ac4-9952-dcbab62b23db-collector-token\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.557072 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50d6f059-2e1c-4ac4-9952-dcbab62b23db-trusted-ca\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.557094 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/50d6f059-2e1c-4ac4-9952-dcbab62b23db-entrypoint\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.557116 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/50d6f059-2e1c-4ac4-9952-dcbab62b23db-metrics\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.557137 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50d6f059-2e1c-4ac4-9952-dcbab62b23db-tmp\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.557193 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" 
(UniqueName: \"kubernetes.io/configmap/50d6f059-2e1c-4ac4-9952-dcbab62b23db-config-openshift-service-cacrt\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.658201 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/50d6f059-2e1c-4ac4-9952-dcbab62b23db-config-openshift-service-cacrt\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.658247 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/50d6f059-2e1c-4ac4-9952-dcbab62b23db-sa-token\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.658263 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d7mc\" (UniqueName: \"kubernetes.io/projected/50d6f059-2e1c-4ac4-9952-dcbab62b23db-kube-api-access-9d7mc\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.658287 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50d6f059-2e1c-4ac4-9952-dcbab62b23db-config\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.658304 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: 
\"kubernetes.io/secret/50d6f059-2e1c-4ac4-9952-dcbab62b23db-collector-syslog-receiver\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.658346 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/50d6f059-2e1c-4ac4-9952-dcbab62b23db-datadir\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.658362 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/50d6f059-2e1c-4ac4-9952-dcbab62b23db-collector-token\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.658375 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50d6f059-2e1c-4ac4-9952-dcbab62b23db-trusted-ca\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.658394 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/50d6f059-2e1c-4ac4-9952-dcbab62b23db-entrypoint\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.658410 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/50d6f059-2e1c-4ac4-9952-dcbab62b23db-metrics\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " 
pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.658425 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50d6f059-2e1c-4ac4-9952-dcbab62b23db-tmp\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.658904 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/50d6f059-2e1c-4ac4-9952-dcbab62b23db-datadir\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.661360 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/50d6f059-2e1c-4ac4-9952-dcbab62b23db-config-openshift-service-cacrt\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.662596 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/50d6f059-2e1c-4ac4-9952-dcbab62b23db-entrypoint\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.662721 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50d6f059-2e1c-4ac4-9952-dcbab62b23db-trusted-ca\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.662793 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/50d6f059-2e1c-4ac4-9952-dcbab62b23db-config\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.665562 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50d6f059-2e1c-4ac4-9952-dcbab62b23db-tmp\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.667004 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/50d6f059-2e1c-4ac4-9952-dcbab62b23db-metrics\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.667173 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/50d6f059-2e1c-4ac4-9952-dcbab62b23db-collector-token\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.676256 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/50d6f059-2e1c-4ac4-9952-dcbab62b23db-collector-syslog-receiver\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.680854 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/50d6f059-2e1c-4ac4-9952-dcbab62b23db-sa-token\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 
crc kubenswrapper[4755]: I0317 00:37:49.687531 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d7mc\" (UniqueName: \"kubernetes.io/projected/50d6f059-2e1c-4ac4-9952-dcbab62b23db-kube-api-access-9d7mc\") pod \"collector-tqf58\" (UID: \"50d6f059-2e1c-4ac4-9952-dcbab62b23db\") " pod="openshift-logging/collector-tqf58" Mar 17 00:37:49 crc kubenswrapper[4755]: I0317 00:37:49.803007 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-tqf58" Mar 17 00:37:50 crc kubenswrapper[4755]: I0317 00:37:50.267225 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f3b1d86-aba4-468d-b375-53fdc638325c" path="/var/lib/kubelet/pods/0f3b1d86-aba4-468d-b375-53fdc638325c/volumes" Mar 17 00:37:50 crc kubenswrapper[4755]: I0317 00:37:50.313947 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-tqf58"] Mar 17 00:37:50 crc kubenswrapper[4755]: I0317 00:37:50.370789 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-tqf58" event={"ID":"50d6f059-2e1c-4ac4-9952-dcbab62b23db","Type":"ContainerStarted","Data":"d22d2b724de0eb5cc94620c4182ec47e2ec9cc2f4214d04669a1667405a92743"} Mar 17 00:37:54 crc kubenswrapper[4755]: I0317 00:37:54.404971 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-tqf58" event={"ID":"50d6f059-2e1c-4ac4-9952-dcbab62b23db","Type":"ContainerStarted","Data":"bdcc686b453030cca447c25a73856b9a4bc995da745351dfdb02171bd60086dc"} Mar 17 00:37:54 crc kubenswrapper[4755]: I0317 00:37:54.440480 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-tqf58" podStartSLOduration=2.022289492 podStartE2EDuration="5.440411509s" podCreationTimestamp="2026-03-17 00:37:49 +0000 UTC" firstStartedPulling="2026-03-17 00:37:50.326148276 +0000 UTC m=+945.085600599" lastFinishedPulling="2026-03-17 00:37:53.744270313 
+0000 UTC m=+948.503722616" observedRunningTime="2026-03-17 00:37:54.435622756 +0000 UTC m=+949.195075049" watchObservedRunningTime="2026-03-17 00:37:54.440411509 +0000 UTC m=+949.199863802" Mar 17 00:38:00 crc kubenswrapper[4755]: I0317 00:38:00.155947 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561798-htkqk"] Mar 17 00:38:00 crc kubenswrapper[4755]: I0317 00:38:00.158059 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561798-htkqk" Mar 17 00:38:00 crc kubenswrapper[4755]: I0317 00:38:00.162164 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 00:38:00 crc kubenswrapper[4755]: I0317 00:38:00.162500 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 00:38:00 crc kubenswrapper[4755]: I0317 00:38:00.162541 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 00:38:00 crc kubenswrapper[4755]: I0317 00:38:00.169944 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561798-htkqk"] Mar 17 00:38:00 crc kubenswrapper[4755]: I0317 00:38:00.242864 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whf6z\" (UniqueName: \"kubernetes.io/projected/5aaedc1e-542e-4f33-acf6-a5de25bdedef-kube-api-access-whf6z\") pod \"auto-csr-approver-29561798-htkqk\" (UID: \"5aaedc1e-542e-4f33-acf6-a5de25bdedef\") " pod="openshift-infra/auto-csr-approver-29561798-htkqk" Mar 17 00:38:00 crc kubenswrapper[4755]: I0317 00:38:00.344222 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whf6z\" (UniqueName: \"kubernetes.io/projected/5aaedc1e-542e-4f33-acf6-a5de25bdedef-kube-api-access-whf6z\") pod 
\"auto-csr-approver-29561798-htkqk\" (UID: \"5aaedc1e-542e-4f33-acf6-a5de25bdedef\") " pod="openshift-infra/auto-csr-approver-29561798-htkqk" Mar 17 00:38:00 crc kubenswrapper[4755]: I0317 00:38:00.377733 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whf6z\" (UniqueName: \"kubernetes.io/projected/5aaedc1e-542e-4f33-acf6-a5de25bdedef-kube-api-access-whf6z\") pod \"auto-csr-approver-29561798-htkqk\" (UID: \"5aaedc1e-542e-4f33-acf6-a5de25bdedef\") " pod="openshift-infra/auto-csr-approver-29561798-htkqk" Mar 17 00:38:00 crc kubenswrapper[4755]: I0317 00:38:00.498997 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561798-htkqk" Mar 17 00:38:00 crc kubenswrapper[4755]: I0317 00:38:00.773490 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561798-htkqk"] Mar 17 00:38:01 crc kubenswrapper[4755]: I0317 00:38:01.461481 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561798-htkqk" event={"ID":"5aaedc1e-542e-4f33-acf6-a5de25bdedef","Type":"ContainerStarted","Data":"dc3ea57b0375ea90c9aa25b688d5ef26402c425200082778f3771c2dcb570b42"} Mar 17 00:38:02 crc kubenswrapper[4755]: I0317 00:38:02.473764 4755 generic.go:334] "Generic (PLEG): container finished" podID="5aaedc1e-542e-4f33-acf6-a5de25bdedef" containerID="c026cc0fcb16c816bcd27db3d03977063a39b40bdead9e159d5e0f9d731a67cb" exitCode=0 Mar 17 00:38:02 crc kubenswrapper[4755]: I0317 00:38:02.473915 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561798-htkqk" event={"ID":"5aaedc1e-542e-4f33-acf6-a5de25bdedef","Type":"ContainerDied","Data":"c026cc0fcb16c816bcd27db3d03977063a39b40bdead9e159d5e0f9d731a67cb"} Mar 17 00:38:03 crc kubenswrapper[4755]: I0317 00:38:03.828172 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561798-htkqk" Mar 17 00:38:03 crc kubenswrapper[4755]: I0317 00:38:03.918934 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whf6z\" (UniqueName: \"kubernetes.io/projected/5aaedc1e-542e-4f33-acf6-a5de25bdedef-kube-api-access-whf6z\") pod \"5aaedc1e-542e-4f33-acf6-a5de25bdedef\" (UID: \"5aaedc1e-542e-4f33-acf6-a5de25bdedef\") " Mar 17 00:38:03 crc kubenswrapper[4755]: I0317 00:38:03.931635 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aaedc1e-542e-4f33-acf6-a5de25bdedef-kube-api-access-whf6z" (OuterVolumeSpecName: "kube-api-access-whf6z") pod "5aaedc1e-542e-4f33-acf6-a5de25bdedef" (UID: "5aaedc1e-542e-4f33-acf6-a5de25bdedef"). InnerVolumeSpecName "kube-api-access-whf6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:38:04 crc kubenswrapper[4755]: I0317 00:38:04.020733 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whf6z\" (UniqueName: \"kubernetes.io/projected/5aaedc1e-542e-4f33-acf6-a5de25bdedef-kube-api-access-whf6z\") on node \"crc\" DevicePath \"\"" Mar 17 00:38:04 crc kubenswrapper[4755]: I0317 00:38:04.492174 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561798-htkqk" event={"ID":"5aaedc1e-542e-4f33-acf6-a5de25bdedef","Type":"ContainerDied","Data":"dc3ea57b0375ea90c9aa25b688d5ef26402c425200082778f3771c2dcb570b42"} Mar 17 00:38:04 crc kubenswrapper[4755]: I0317 00:38:04.492241 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc3ea57b0375ea90c9aa25b688d5ef26402c425200082778f3771c2dcb570b42" Mar 17 00:38:04 crc kubenswrapper[4755]: I0317 00:38:04.492246 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561798-htkqk" Mar 17 00:38:04 crc kubenswrapper[4755]: I0317 00:38:04.931408 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561792-fc6gf"] Mar 17 00:38:04 crc kubenswrapper[4755]: I0317 00:38:04.935360 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561792-fc6gf"] Mar 17 00:38:06 crc kubenswrapper[4755]: I0317 00:38:06.258504 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="503c259e-cfa2-4b5d-a866-83a78794df20" path="/var/lib/kubelet/pods/503c259e-cfa2-4b5d-a866-83a78794df20/volumes" Mar 17 00:38:20 crc kubenswrapper[4755]: I0317 00:38:20.504997 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22"] Mar 17 00:38:20 crc kubenswrapper[4755]: E0317 00:38:20.505859 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aaedc1e-542e-4f33-acf6-a5de25bdedef" containerName="oc" Mar 17 00:38:20 crc kubenswrapper[4755]: I0317 00:38:20.505878 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aaedc1e-542e-4f33-acf6-a5de25bdedef" containerName="oc" Mar 17 00:38:20 crc kubenswrapper[4755]: I0317 00:38:20.506020 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aaedc1e-542e-4f33-acf6-a5de25bdedef" containerName="oc" Mar 17 00:38:20 crc kubenswrapper[4755]: I0317 00:38:20.506985 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" Mar 17 00:38:20 crc kubenswrapper[4755]: I0317 00:38:20.509124 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 17 00:38:20 crc kubenswrapper[4755]: I0317 00:38:20.519944 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22"] Mar 17 00:38:20 crc kubenswrapper[4755]: I0317 00:38:20.613353 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53a160ea-9625-4d20-82ab-cae78c0c4911-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22\" (UID: \"53a160ea-9625-4d20-82ab-cae78c0c4911\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" Mar 17 00:38:20 crc kubenswrapper[4755]: I0317 00:38:20.613478 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpck5\" (UniqueName: \"kubernetes.io/projected/53a160ea-9625-4d20-82ab-cae78c0c4911-kube-api-access-jpck5\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22\" (UID: \"53a160ea-9625-4d20-82ab-cae78c0c4911\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" Mar 17 00:38:20 crc kubenswrapper[4755]: I0317 00:38:20.613619 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53a160ea-9625-4d20-82ab-cae78c0c4911-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22\" (UID: \"53a160ea-9625-4d20-82ab-cae78c0c4911\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" Mar 17 00:38:20 crc kubenswrapper[4755]: 
I0317 00:38:20.715182 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpck5\" (UniqueName: \"kubernetes.io/projected/53a160ea-9625-4d20-82ab-cae78c0c4911-kube-api-access-jpck5\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22\" (UID: \"53a160ea-9625-4d20-82ab-cae78c0c4911\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" Mar 17 00:38:20 crc kubenswrapper[4755]: I0317 00:38:20.715274 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53a160ea-9625-4d20-82ab-cae78c0c4911-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22\" (UID: \"53a160ea-9625-4d20-82ab-cae78c0c4911\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" Mar 17 00:38:20 crc kubenswrapper[4755]: I0317 00:38:20.715303 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53a160ea-9625-4d20-82ab-cae78c0c4911-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22\" (UID: \"53a160ea-9625-4d20-82ab-cae78c0c4911\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" Mar 17 00:38:20 crc kubenswrapper[4755]: I0317 00:38:20.715767 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53a160ea-9625-4d20-82ab-cae78c0c4911-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22\" (UID: \"53a160ea-9625-4d20-82ab-cae78c0c4911\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" Mar 17 00:38:20 crc kubenswrapper[4755]: I0317 00:38:20.716030 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/53a160ea-9625-4d20-82ab-cae78c0c4911-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22\" (UID: \"53a160ea-9625-4d20-82ab-cae78c0c4911\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" Mar 17 00:38:20 crc kubenswrapper[4755]: I0317 00:38:20.741215 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpck5\" (UniqueName: \"kubernetes.io/projected/53a160ea-9625-4d20-82ab-cae78c0c4911-kube-api-access-jpck5\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22\" (UID: \"53a160ea-9625-4d20-82ab-cae78c0c4911\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" Mar 17 00:38:20 crc kubenswrapper[4755]: I0317 00:38:20.831164 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" Mar 17 00:38:21 crc kubenswrapper[4755]: I0317 00:38:21.311454 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22"] Mar 17 00:38:21 crc kubenswrapper[4755]: I0317 00:38:21.640598 4755 generic.go:334] "Generic (PLEG): container finished" podID="53a160ea-9625-4d20-82ab-cae78c0c4911" containerID="495493f02f65b71947725a8e96afe21313a157fedfe67cf4c4ec6cf51765ec2d" exitCode=0 Mar 17 00:38:21 crc kubenswrapper[4755]: I0317 00:38:21.640640 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" event={"ID":"53a160ea-9625-4d20-82ab-cae78c0c4911","Type":"ContainerDied","Data":"495493f02f65b71947725a8e96afe21313a157fedfe67cf4c4ec6cf51765ec2d"} Mar 17 00:38:21 crc kubenswrapper[4755]: I0317 00:38:21.640663 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" event={"ID":"53a160ea-9625-4d20-82ab-cae78c0c4911","Type":"ContainerStarted","Data":"1365250b4d420ded7b7d5f4e57db3673e503d836d0e07f921b88cf85a59bd199"} Mar 17 00:38:23 crc kubenswrapper[4755]: I0317 00:38:23.074015 4755 scope.go:117] "RemoveContainer" containerID="09331908c036e53ce5955a3f651296527cc0fc2cd6b6aae8569f51800528cb91" Mar 17 00:38:23 crc kubenswrapper[4755]: I0317 00:38:23.656688 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" event={"ID":"53a160ea-9625-4d20-82ab-cae78c0c4911","Type":"ContainerStarted","Data":"eaf2a943c872a01aaa90652eda45dd50b4c60d9a0317c71618d8c17ed6a7ba2e"} Mar 17 00:38:24 crc kubenswrapper[4755]: I0317 00:38:24.666685 4755 generic.go:334] "Generic (PLEG): container finished" podID="53a160ea-9625-4d20-82ab-cae78c0c4911" containerID="eaf2a943c872a01aaa90652eda45dd50b4c60d9a0317c71618d8c17ed6a7ba2e" exitCode=0 Mar 17 00:38:24 crc kubenswrapper[4755]: I0317 00:38:24.666956 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" event={"ID":"53a160ea-9625-4d20-82ab-cae78c0c4911","Type":"ContainerDied","Data":"eaf2a943c872a01aaa90652eda45dd50b4c60d9a0317c71618d8c17ed6a7ba2e"} Mar 17 00:38:25 crc kubenswrapper[4755]: I0317 00:38:25.675915 4755 generic.go:334] "Generic (PLEG): container finished" podID="53a160ea-9625-4d20-82ab-cae78c0c4911" containerID="35bc51905d32a340f18444ba627083abde3f4fbd20cf99b812fbe12c6e1a62b8" exitCode=0 Mar 17 00:38:25 crc kubenswrapper[4755]: I0317 00:38:25.676008 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" 
event={"ID":"53a160ea-9625-4d20-82ab-cae78c0c4911","Type":"ContainerDied","Data":"35bc51905d32a340f18444ba627083abde3f4fbd20cf99b812fbe12c6e1a62b8"} Mar 17 00:38:27 crc kubenswrapper[4755]: I0317 00:38:27.006729 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" Mar 17 00:38:27 crc kubenswrapper[4755]: I0317 00:38:27.115045 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpck5\" (UniqueName: \"kubernetes.io/projected/53a160ea-9625-4d20-82ab-cae78c0c4911-kube-api-access-jpck5\") pod \"53a160ea-9625-4d20-82ab-cae78c0c4911\" (UID: \"53a160ea-9625-4d20-82ab-cae78c0c4911\") " Mar 17 00:38:27 crc kubenswrapper[4755]: I0317 00:38:27.115102 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53a160ea-9625-4d20-82ab-cae78c0c4911-bundle\") pod \"53a160ea-9625-4d20-82ab-cae78c0c4911\" (UID: \"53a160ea-9625-4d20-82ab-cae78c0c4911\") " Mar 17 00:38:27 crc kubenswrapper[4755]: I0317 00:38:27.115156 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53a160ea-9625-4d20-82ab-cae78c0c4911-util\") pod \"53a160ea-9625-4d20-82ab-cae78c0c4911\" (UID: \"53a160ea-9625-4d20-82ab-cae78c0c4911\") " Mar 17 00:38:27 crc kubenswrapper[4755]: I0317 00:38:27.117895 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53a160ea-9625-4d20-82ab-cae78c0c4911-bundle" (OuterVolumeSpecName: "bundle") pod "53a160ea-9625-4d20-82ab-cae78c0c4911" (UID: "53a160ea-9625-4d20-82ab-cae78c0c4911"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:38:27 crc kubenswrapper[4755]: I0317 00:38:27.122356 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53a160ea-9625-4d20-82ab-cae78c0c4911-kube-api-access-jpck5" (OuterVolumeSpecName: "kube-api-access-jpck5") pod "53a160ea-9625-4d20-82ab-cae78c0c4911" (UID: "53a160ea-9625-4d20-82ab-cae78c0c4911"). InnerVolumeSpecName "kube-api-access-jpck5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:38:27 crc kubenswrapper[4755]: I0317 00:38:27.216856 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpck5\" (UniqueName: \"kubernetes.io/projected/53a160ea-9625-4d20-82ab-cae78c0c4911-kube-api-access-jpck5\") on node \"crc\" DevicePath \"\"" Mar 17 00:38:27 crc kubenswrapper[4755]: I0317 00:38:27.216954 4755 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53a160ea-9625-4d20-82ab-cae78c0c4911-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:38:27 crc kubenswrapper[4755]: I0317 00:38:27.221210 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53a160ea-9625-4d20-82ab-cae78c0c4911-util" (OuterVolumeSpecName: "util") pod "53a160ea-9625-4d20-82ab-cae78c0c4911" (UID: "53a160ea-9625-4d20-82ab-cae78c0c4911"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:38:27 crc kubenswrapper[4755]: I0317 00:38:27.318549 4755 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53a160ea-9625-4d20-82ab-cae78c0c4911-util\") on node \"crc\" DevicePath \"\"" Mar 17 00:38:27 crc kubenswrapper[4755]: I0317 00:38:27.694854 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" event={"ID":"53a160ea-9625-4d20-82ab-cae78c0c4911","Type":"ContainerDied","Data":"1365250b4d420ded7b7d5f4e57db3673e503d836d0e07f921b88cf85a59bd199"} Mar 17 00:38:27 crc kubenswrapper[4755]: I0317 00:38:27.694929 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1365250b4d420ded7b7d5f4e57db3673e503d836d0e07f921b88cf85a59bd199" Mar 17 00:38:27 crc kubenswrapper[4755]: I0317 00:38:27.695026 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22" Mar 17 00:38:32 crc kubenswrapper[4755]: I0317 00:38:32.324662 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-qkhgq"] Mar 17 00:38:32 crc kubenswrapper[4755]: E0317 00:38:32.325784 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a160ea-9625-4d20-82ab-cae78c0c4911" containerName="extract" Mar 17 00:38:32 crc kubenswrapper[4755]: I0317 00:38:32.325879 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a160ea-9625-4d20-82ab-cae78c0c4911" containerName="extract" Mar 17 00:38:32 crc kubenswrapper[4755]: E0317 00:38:32.325975 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a160ea-9625-4d20-82ab-cae78c0c4911" containerName="util" Mar 17 00:38:32 crc kubenswrapper[4755]: I0317 00:38:32.326056 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="53a160ea-9625-4d20-82ab-cae78c0c4911" containerName="util" Mar 17 00:38:32 crc kubenswrapper[4755]: E0317 00:38:32.326143 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a160ea-9625-4d20-82ab-cae78c0c4911" containerName="pull" Mar 17 00:38:32 crc kubenswrapper[4755]: I0317 00:38:32.326214 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a160ea-9625-4d20-82ab-cae78c0c4911" containerName="pull" Mar 17 00:38:32 crc kubenswrapper[4755]: I0317 00:38:32.326422 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="53a160ea-9625-4d20-82ab-cae78c0c4911" containerName="extract" Mar 17 00:38:32 crc kubenswrapper[4755]: I0317 00:38:32.327047 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-qkhgq" Mar 17 00:38:32 crc kubenswrapper[4755]: I0317 00:38:32.329181 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 17 00:38:32 crc kubenswrapper[4755]: I0317 00:38:32.331581 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-6sjfg" Mar 17 00:38:32 crc kubenswrapper[4755]: I0317 00:38:32.331893 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 17 00:38:32 crc kubenswrapper[4755]: I0317 00:38:32.351205 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-qkhgq"] Mar 17 00:38:32 crc kubenswrapper[4755]: I0317 00:38:32.495230 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jr9b\" (UniqueName: \"kubernetes.io/projected/8905f140-6bfd-4be5-89dd-3db46bdcc933-kube-api-access-7jr9b\") pod \"nmstate-operator-796d4cfff4-qkhgq\" (UID: \"8905f140-6bfd-4be5-89dd-3db46bdcc933\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-qkhgq" Mar 17 
00:38:32 crc kubenswrapper[4755]: I0317 00:38:32.597008 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jr9b\" (UniqueName: \"kubernetes.io/projected/8905f140-6bfd-4be5-89dd-3db46bdcc933-kube-api-access-7jr9b\") pod \"nmstate-operator-796d4cfff4-qkhgq\" (UID: \"8905f140-6bfd-4be5-89dd-3db46bdcc933\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-qkhgq" Mar 17 00:38:32 crc kubenswrapper[4755]: I0317 00:38:32.621937 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jr9b\" (UniqueName: \"kubernetes.io/projected/8905f140-6bfd-4be5-89dd-3db46bdcc933-kube-api-access-7jr9b\") pod \"nmstate-operator-796d4cfff4-qkhgq\" (UID: \"8905f140-6bfd-4be5-89dd-3db46bdcc933\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-qkhgq" Mar 17 00:38:32 crc kubenswrapper[4755]: I0317 00:38:32.663730 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-qkhgq" Mar 17 00:38:32 crc kubenswrapper[4755]: I0317 00:38:32.888937 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-qkhgq"] Mar 17 00:38:33 crc kubenswrapper[4755]: I0317 00:38:33.738158 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-qkhgq" event={"ID":"8905f140-6bfd-4be5-89dd-3db46bdcc933","Type":"ContainerStarted","Data":"6751b66542d83cab07a0823e26f2024dbf43e4085573f5c0cda17d9664522b77"} Mar 17 00:38:35 crc kubenswrapper[4755]: I0317 00:38:35.755589 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-qkhgq" event={"ID":"8905f140-6bfd-4be5-89dd-3db46bdcc933","Type":"ContainerStarted","Data":"681134548d843f5d6dc4a3c0bbf3a5ec51396a0ba545b7b0e8d00dcf99ff35f8"} Mar 17 00:38:35 crc kubenswrapper[4755]: I0317 00:38:35.791817 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-operator-796d4cfff4-qkhgq" podStartSLOduration=1.302939351 podStartE2EDuration="3.791788755s" podCreationTimestamp="2026-03-17 00:38:32 +0000 UTC" firstStartedPulling="2026-03-17 00:38:32.897478924 +0000 UTC m=+987.656931207" lastFinishedPulling="2026-03-17 00:38:35.386328338 +0000 UTC m=+990.145780611" observedRunningTime="2026-03-17 00:38:35.784176311 +0000 UTC m=+990.543628634" watchObservedRunningTime="2026-03-17 00:38:35.791788755 +0000 UTC m=+990.551241078" Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.790881 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-5ml2z"] Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.792683 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-5ml2z" Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.795055 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-b6hsg" Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.805399 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-q892t"] Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.806526 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-q892t" Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.808323 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.809890 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-5ml2z"] Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.827799 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-wvzsg"] Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.828801 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-wvzsg" Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.843702 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-q892t"] Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.933780 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-6f24p"] Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.934680 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6f24p" Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.937668 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.938056 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.938088 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-84mdx" Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.944463 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/74f876f9-73dd-42eb-bc3c-8aa4e6dc854c-ovs-socket\") pod \"nmstate-handler-wvzsg\" (UID: \"74f876f9-73dd-42eb-bc3c-8aa4e6dc854c\") " pod="openshift-nmstate/nmstate-handler-wvzsg" Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.944515 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/74f876f9-73dd-42eb-bc3c-8aa4e6dc854c-nmstate-lock\") pod \"nmstate-handler-wvzsg\" (UID: \"74f876f9-73dd-42eb-bc3c-8aa4e6dc854c\") " pod="openshift-nmstate/nmstate-handler-wvzsg" Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.944537 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4jwn\" (UniqueName: \"kubernetes.io/projected/ae6a47ef-9bd5-4c94-8ae3-966b11c8506b-kube-api-access-d4jwn\") pod \"nmstate-webhook-5f558f5558-q892t\" (UID: \"ae6a47ef-9bd5-4c94-8ae3-966b11c8506b\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-q892t" Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.944577 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/74f876f9-73dd-42eb-bc3c-8aa4e6dc854c-dbus-socket\") pod \"nmstate-handler-wvzsg\" (UID: \"74f876f9-73dd-42eb-bc3c-8aa4e6dc854c\") " pod="openshift-nmstate/nmstate-handler-wvzsg" Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.944609 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbs94\" (UniqueName: \"kubernetes.io/projected/1632175c-4118-4c14-b3ef-59472c846d04-kube-api-access-fbs94\") pod \"nmstate-metrics-9b8c8685d-5ml2z\" (UID: \"1632175c-4118-4c14-b3ef-59472c846d04\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-5ml2z" Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.944649 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4ql2\" (UniqueName: \"kubernetes.io/projected/74f876f9-73dd-42eb-bc3c-8aa4e6dc854c-kube-api-access-j4ql2\") pod \"nmstate-handler-wvzsg\" (UID: \"74f876f9-73dd-42eb-bc3c-8aa4e6dc854c\") " pod="openshift-nmstate/nmstate-handler-wvzsg" Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.944677 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ae6a47ef-9bd5-4c94-8ae3-966b11c8506b-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-q892t\" (UID: \"ae6a47ef-9bd5-4c94-8ae3-966b11c8506b\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-q892t" Mar 17 00:38:41 crc kubenswrapper[4755]: I0317 00:38:41.944551 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-6f24p"] Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.046363 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/33b5469d-e555-44f7-8f84-dcea89debdae-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-6f24p\" 
(UID: \"33b5469d-e555-44f7-8f84-dcea89debdae\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6f24p" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.046418 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/33b5469d-e555-44f7-8f84-dcea89debdae-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-6f24p\" (UID: \"33b5469d-e555-44f7-8f84-dcea89debdae\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6f24p" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.046467 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4ql2\" (UniqueName: \"kubernetes.io/projected/74f876f9-73dd-42eb-bc3c-8aa4e6dc854c-kube-api-access-j4ql2\") pod \"nmstate-handler-wvzsg\" (UID: \"74f876f9-73dd-42eb-bc3c-8aa4e6dc854c\") " pod="openshift-nmstate/nmstate-handler-wvzsg" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.046508 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ae6a47ef-9bd5-4c94-8ae3-966b11c8506b-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-q892t\" (UID: \"ae6a47ef-9bd5-4c94-8ae3-966b11c8506b\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-q892t" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.046539 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/74f876f9-73dd-42eb-bc3c-8aa4e6dc854c-ovs-socket\") pod \"nmstate-handler-wvzsg\" (UID: \"74f876f9-73dd-42eb-bc3c-8aa4e6dc854c\") " pod="openshift-nmstate/nmstate-handler-wvzsg" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.046570 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/74f876f9-73dd-42eb-bc3c-8aa4e6dc854c-nmstate-lock\") pod 
\"nmstate-handler-wvzsg\" (UID: \"74f876f9-73dd-42eb-bc3c-8aa4e6dc854c\") " pod="openshift-nmstate/nmstate-handler-wvzsg" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.046591 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4jwn\" (UniqueName: \"kubernetes.io/projected/ae6a47ef-9bd5-4c94-8ae3-966b11c8506b-kube-api-access-d4jwn\") pod \"nmstate-webhook-5f558f5558-q892t\" (UID: \"ae6a47ef-9bd5-4c94-8ae3-966b11c8506b\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-q892t" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.046640 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/74f876f9-73dd-42eb-bc3c-8aa4e6dc854c-dbus-socket\") pod \"nmstate-handler-wvzsg\" (UID: \"74f876f9-73dd-42eb-bc3c-8aa4e6dc854c\") " pod="openshift-nmstate/nmstate-handler-wvzsg" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.046680 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/74f876f9-73dd-42eb-bc3c-8aa4e6dc854c-ovs-socket\") pod \"nmstate-handler-wvzsg\" (UID: \"74f876f9-73dd-42eb-bc3c-8aa4e6dc854c\") " pod="openshift-nmstate/nmstate-handler-wvzsg" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.046700 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/74f876f9-73dd-42eb-bc3c-8aa4e6dc854c-nmstate-lock\") pod \"nmstate-handler-wvzsg\" (UID: \"74f876f9-73dd-42eb-bc3c-8aa4e6dc854c\") " pod="openshift-nmstate/nmstate-handler-wvzsg" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.046686 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng5fr\" (UniqueName: \"kubernetes.io/projected/33b5469d-e555-44f7-8f84-dcea89debdae-kube-api-access-ng5fr\") pod \"nmstate-console-plugin-86f58fcf4-6f24p\" (UID: 
\"33b5469d-e555-44f7-8f84-dcea89debdae\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6f24p" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.046845 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbs94\" (UniqueName: \"kubernetes.io/projected/1632175c-4118-4c14-b3ef-59472c846d04-kube-api-access-fbs94\") pod \"nmstate-metrics-9b8c8685d-5ml2z\" (UID: \"1632175c-4118-4c14-b3ef-59472c846d04\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-5ml2z" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.046925 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/74f876f9-73dd-42eb-bc3c-8aa4e6dc854c-dbus-socket\") pod \"nmstate-handler-wvzsg\" (UID: \"74f876f9-73dd-42eb-bc3c-8aa4e6dc854c\") " pod="openshift-nmstate/nmstate-handler-wvzsg" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.067822 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ae6a47ef-9bd5-4c94-8ae3-966b11c8506b-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-q892t\" (UID: \"ae6a47ef-9bd5-4c94-8ae3-966b11c8506b\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-q892t" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.074163 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4jwn\" (UniqueName: \"kubernetes.io/projected/ae6a47ef-9bd5-4c94-8ae3-966b11c8506b-kube-api-access-d4jwn\") pod \"nmstate-webhook-5f558f5558-q892t\" (UID: \"ae6a47ef-9bd5-4c94-8ae3-966b11c8506b\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-q892t" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.081747 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4ql2\" (UniqueName: \"kubernetes.io/projected/74f876f9-73dd-42eb-bc3c-8aa4e6dc854c-kube-api-access-j4ql2\") pod \"nmstate-handler-wvzsg\" (UID: 
\"74f876f9-73dd-42eb-bc3c-8aa4e6dc854c\") " pod="openshift-nmstate/nmstate-handler-wvzsg" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.088709 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbs94\" (UniqueName: \"kubernetes.io/projected/1632175c-4118-4c14-b3ef-59472c846d04-kube-api-access-fbs94\") pod \"nmstate-metrics-9b8c8685d-5ml2z\" (UID: \"1632175c-4118-4c14-b3ef-59472c846d04\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-5ml2z" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.122078 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-5ml2z" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.128316 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-cf589d4bf-9httx"] Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.129115 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.134847 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-q892t" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.150692 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng5fr\" (UniqueName: \"kubernetes.io/projected/33b5469d-e555-44f7-8f84-dcea89debdae-kube-api-access-ng5fr\") pod \"nmstate-console-plugin-86f58fcf4-6f24p\" (UID: \"33b5469d-e555-44f7-8f84-dcea89debdae\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6f24p" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.150752 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/33b5469d-e555-44f7-8f84-dcea89debdae-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-6f24p\" (UID: \"33b5469d-e555-44f7-8f84-dcea89debdae\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6f24p" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.150770 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/33b5469d-e555-44f7-8f84-dcea89debdae-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-6f24p\" (UID: \"33b5469d-e555-44f7-8f84-dcea89debdae\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6f24p" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.154125 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-wvzsg" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.155478 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/33b5469d-e555-44f7-8f84-dcea89debdae-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-6f24p\" (UID: \"33b5469d-e555-44f7-8f84-dcea89debdae\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6f24p" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.155477 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/33b5469d-e555-44f7-8f84-dcea89debdae-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-6f24p\" (UID: \"33b5469d-e555-44f7-8f84-dcea89debdae\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6f24p" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.156140 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cf589d4bf-9httx"] Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.180253 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng5fr\" (UniqueName: \"kubernetes.io/projected/33b5469d-e555-44f7-8f84-dcea89debdae-kube-api-access-ng5fr\") pod \"nmstate-console-plugin-86f58fcf4-6f24p\" (UID: \"33b5469d-e555-44f7-8f84-dcea89debdae\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6f24p" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.251859 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6f24p" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.252204 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-service-ca\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.252274 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-oauth-serving-cert\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.252309 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-trusted-ca-bundle\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.252346 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22b37f74-59b5-4148-9e19-92e3bab357c7-console-oauth-config\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.252425 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-console-config\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.252504 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22b37f74-59b5-4148-9e19-92e3bab357c7-console-serving-cert\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.252530 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l5h6\" (UniqueName: \"kubernetes.io/projected/22b37f74-59b5-4148-9e19-92e3bab357c7-kube-api-access-7l5h6\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.353712 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-oauth-serving-cert\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.354003 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-trusted-ca-bundle\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.354042 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22b37f74-59b5-4148-9e19-92e3bab357c7-console-oauth-config\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.354062 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-console-config\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.354086 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22b37f74-59b5-4148-9e19-92e3bab357c7-console-serving-cert\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.354104 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l5h6\" (UniqueName: \"kubernetes.io/projected/22b37f74-59b5-4148-9e19-92e3bab357c7-kube-api-access-7l5h6\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.354127 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-service-ca\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.354631 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-oauth-serving-cert\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.354925 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-service-ca\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.355172 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-console-config\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.359180 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-trusted-ca-bundle\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.361074 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22b37f74-59b5-4148-9e19-92e3bab357c7-console-serving-cert\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.364108 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22b37f74-59b5-4148-9e19-92e3bab357c7-console-oauth-config\") pod 
\"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.371710 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l5h6\" (UniqueName: \"kubernetes.io/projected/22b37f74-59b5-4148-9e19-92e3bab357c7-kube-api-access-7l5h6\") pod \"console-cf589d4bf-9httx\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.527091 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.589961 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-5ml2z"] Mar 17 00:38:42 crc kubenswrapper[4755]: W0317 00:38:42.596209 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1632175c_4118_4c14_b3ef_59472c846d04.slice/crio-d9ee8dc055b05283a82a20c87e49ca0879e1eaaf069884296661334213f4bef5 WatchSource:0}: Error finding container d9ee8dc055b05283a82a20c87e49ca0879e1eaaf069884296661334213f4bef5: Status 404 returned error can't find the container with id d9ee8dc055b05283a82a20c87e49ca0879e1eaaf069884296661334213f4bef5 Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.647220 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-q892t"] Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.699065 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-6f24p"] Mar 17 00:38:42 crc kubenswrapper[4755]: W0317 00:38:42.713404 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33b5469d_e555_44f7_8f84_dcea89debdae.slice/crio-34f6e03fb7b91ab9d032d0d6e42b70fbb1144db3c289c27e8b4fce9ad24c684d WatchSource:0}: Error finding container 34f6e03fb7b91ab9d032d0d6e42b70fbb1144db3c289c27e8b4fce9ad24c684d: Status 404 returned error can't find the container with id 34f6e03fb7b91ab9d032d0d6e42b70fbb1144db3c289c27e8b4fce9ad24c684d Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.757590 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cf589d4bf-9httx"] Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.807967 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-q892t" event={"ID":"ae6a47ef-9bd5-4c94-8ae3-966b11c8506b","Type":"ContainerStarted","Data":"adbbcc88c69eea85733c7e914ed287256aa4b5e2542f3c110b32aea6fb0ecf9d"} Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.809010 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-5ml2z" event={"ID":"1632175c-4118-4c14-b3ef-59472c846d04","Type":"ContainerStarted","Data":"d9ee8dc055b05283a82a20c87e49ca0879e1eaaf069884296661334213f4bef5"} Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.810247 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cf589d4bf-9httx" event={"ID":"22b37f74-59b5-4148-9e19-92e3bab357c7","Type":"ContainerStarted","Data":"2fea5be37b5dd1486d8e087f016f1216a57207b9ce45840071ef9cd653ee17c0"} Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.811358 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6f24p" event={"ID":"33b5469d-e555-44f7-8f84-dcea89debdae","Type":"ContainerStarted","Data":"34f6e03fb7b91ab9d032d0d6e42b70fbb1144db3c289c27e8b4fce9ad24c684d"} Mar 17 00:38:42 crc kubenswrapper[4755]: I0317 00:38:42.812593 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-handler-wvzsg" event={"ID":"74f876f9-73dd-42eb-bc3c-8aa4e6dc854c","Type":"ContainerStarted","Data":"5024647fe133b79b36150324bbb9282fb0ffdb95e463748f2e6a44ef34099120"} Mar 17 00:38:43 crc kubenswrapper[4755]: I0317 00:38:43.821883 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cf589d4bf-9httx" event={"ID":"22b37f74-59b5-4148-9e19-92e3bab357c7","Type":"ContainerStarted","Data":"0992d224f7ba4a8fa7acc79822891f61afebd3f1cd4b73f8742c69a10570fc8f"} Mar 17 00:38:43 crc kubenswrapper[4755]: I0317 00:38:43.851207 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cf589d4bf-9httx" podStartSLOduration=1.851184379 podStartE2EDuration="1.851184379s" podCreationTimestamp="2026-03-17 00:38:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:38:43.848696942 +0000 UTC m=+998.608149225" watchObservedRunningTime="2026-03-17 00:38:43.851184379 +0000 UTC m=+998.610636662" Mar 17 00:38:45 crc kubenswrapper[4755]: I0317 00:38:45.846386 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-5ml2z" event={"ID":"1632175c-4118-4c14-b3ef-59472c846d04","Type":"ContainerStarted","Data":"0120f8b718e76dc7c0da9d08dc7a87807d299708882c2646cb153c7a262d51d2"} Mar 17 00:38:45 crc kubenswrapper[4755]: I0317 00:38:45.848236 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-q892t" event={"ID":"ae6a47ef-9bd5-4c94-8ae3-966b11c8506b","Type":"ContainerStarted","Data":"24ec840217fe46cc0a66e67cdf16647ff90ff149d663ce67d8b6ab037e8c4773"} Mar 17 00:38:45 crc kubenswrapper[4755]: I0317 00:38:45.848378 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-q892t" Mar 17 00:38:45 crc kubenswrapper[4755]: I0317 00:38:45.849422 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6f24p" event={"ID":"33b5469d-e555-44f7-8f84-dcea89debdae","Type":"ContainerStarted","Data":"979fe4ba89ad5c29899197413e2806c2586239846d8812e728f27a1ffb74400d"} Mar 17 00:38:45 crc kubenswrapper[4755]: I0317 00:38:45.873475 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-q892t" podStartSLOduration=1.9543830660000001 podStartE2EDuration="4.873457177s" podCreationTimestamp="2026-03-17 00:38:41 +0000 UTC" firstStartedPulling="2026-03-17 00:38:42.651156795 +0000 UTC m=+997.410609098" lastFinishedPulling="2026-03-17 00:38:45.570230926 +0000 UTC m=+1000.329683209" observedRunningTime="2026-03-17 00:38:45.871093274 +0000 UTC m=+1000.630545567" watchObservedRunningTime="2026-03-17 00:38:45.873457177 +0000 UTC m=+1000.632909460" Mar 17 00:38:45 crc kubenswrapper[4755]: I0317 00:38:45.894002 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-6f24p" podStartSLOduration=2.043821492 podStartE2EDuration="4.893981645s" podCreationTimestamp="2026-03-17 00:38:41 +0000 UTC" firstStartedPulling="2026-03-17 00:38:42.715731046 +0000 UTC m=+997.475183329" lastFinishedPulling="2026-03-17 00:38:45.565891199 +0000 UTC m=+1000.325343482" observedRunningTime="2026-03-17 00:38:45.889614528 +0000 UTC m=+1000.649066821" watchObservedRunningTime="2026-03-17 00:38:45.893981645 +0000 UTC m=+1000.653433938" Mar 17 00:38:46 crc kubenswrapper[4755]: I0317 00:38:46.863838 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-wvzsg" event={"ID":"74f876f9-73dd-42eb-bc3c-8aa4e6dc854c","Type":"ContainerStarted","Data":"a31df4ac002d694b75812cd3950149f0e2bed83ca849b7a8861c32bf779b1a2f"} Mar 17 00:38:46 crc kubenswrapper[4755]: I0317 00:38:46.886171 4755 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-nmstate/nmstate-handler-wvzsg" podStartSLOduration=2.52066914 podStartE2EDuration="5.886154318s" podCreationTimestamp="2026-03-17 00:38:41 +0000 UTC" firstStartedPulling="2026-03-17 00:38:42.204553492 +0000 UTC m=+996.964005775" lastFinishedPulling="2026-03-17 00:38:45.57003867 +0000 UTC m=+1000.329490953" observedRunningTime="2026-03-17 00:38:46.882139642 +0000 UTC m=+1001.641591925" watchObservedRunningTime="2026-03-17 00:38:46.886154318 +0000 UTC m=+1001.645606601" Mar 17 00:38:47 crc kubenswrapper[4755]: I0317 00:38:47.156179 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-wvzsg" Mar 17 00:38:48 crc kubenswrapper[4755]: I0317 00:38:48.881067 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-5ml2z" event={"ID":"1632175c-4118-4c14-b3ef-59472c846d04","Type":"ContainerStarted","Data":"ab34d32be4269d8e9383d1e725e8d91484c7d819a3baee22b7dfbe78d0e3525c"} Mar 17 00:38:48 crc kubenswrapper[4755]: I0317 00:38:48.901620 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-5ml2z" podStartSLOduration=2.080856398 podStartE2EDuration="7.901604016s" podCreationTimestamp="2026-03-17 00:38:41 +0000 UTC" firstStartedPulling="2026-03-17 00:38:42.598629534 +0000 UTC m=+997.358081817" lastFinishedPulling="2026-03-17 00:38:48.419377152 +0000 UTC m=+1003.178829435" observedRunningTime="2026-03-17 00:38:48.898566905 +0000 UTC m=+1003.658019188" watchObservedRunningTime="2026-03-17 00:38:48.901604016 +0000 UTC m=+1003.661056299" Mar 17 00:38:52 crc kubenswrapper[4755]: I0317 00:38:52.198949 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-wvzsg" Mar 17 00:38:52 crc kubenswrapper[4755]: I0317 00:38:52.528675 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:52 crc kubenswrapper[4755]: I0317 00:38:52.528758 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:52 crc kubenswrapper[4755]: I0317 00:38:52.535640 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:52 crc kubenswrapper[4755]: I0317 00:38:52.924581 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:38:53 crc kubenswrapper[4755]: I0317 00:38:53.000190 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-2fcmx"] Mar 17 00:38:58 crc kubenswrapper[4755]: I0317 00:38:58.664850 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:38:58 crc kubenswrapper[4755]: I0317 00:38:58.665605 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:39:02 crc kubenswrapper[4755]: I0317 00:39:02.143708 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-q892t" Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.046361 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-2fcmx" podUID="ddaf8b56-a560-4f69-8aa2-250b12ac4d4e" containerName="console" 
containerID="cri-o://9d48904bfcf964ae626188153365eabb6fd193f0bbbf6c8c9edfcdc19263ec6c" gracePeriod=15 Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.468831 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-2fcmx_ddaf8b56-a560-4f69-8aa2-250b12ac4d4e/console/0.log" Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.469197 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.517651 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-service-ca\") pod \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.517703 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-serving-cert\") pod \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.517730 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87c9n\" (UniqueName: \"kubernetes.io/projected/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-kube-api-access-87c9n\") pod \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.517893 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-oauth-config\") pod \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " Mar 17 00:39:18 crc kubenswrapper[4755]: 
I0317 00:39:18.517919 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-oauth-serving-cert\") pod \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.518009 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-config\") pod \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.518043 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-trusted-ca-bundle\") pod \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\" (UID: \"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e\") " Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.518587 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-service-ca" (OuterVolumeSpecName: "service-ca") pod "ddaf8b56-a560-4f69-8aa2-250b12ac4d4e" (UID: "ddaf8b56-a560-4f69-8aa2-250b12ac4d4e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.518848 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ddaf8b56-a560-4f69-8aa2-250b12ac4d4e" (UID: "ddaf8b56-a560-4f69-8aa2-250b12ac4d4e"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.518904 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-config" (OuterVolumeSpecName: "console-config") pod "ddaf8b56-a560-4f69-8aa2-250b12ac4d4e" (UID: "ddaf8b56-a560-4f69-8aa2-250b12ac4d4e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.518931 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ddaf8b56-a560-4f69-8aa2-250b12ac4d4e" (UID: "ddaf8b56-a560-4f69-8aa2-250b12ac4d4e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.523581 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ddaf8b56-a560-4f69-8aa2-250b12ac4d4e" (UID: "ddaf8b56-a560-4f69-8aa2-250b12ac4d4e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.523576 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-kube-api-access-87c9n" (OuterVolumeSpecName: "kube-api-access-87c9n") pod "ddaf8b56-a560-4f69-8aa2-250b12ac4d4e" (UID: "ddaf8b56-a560-4f69-8aa2-250b12ac4d4e"). InnerVolumeSpecName "kube-api-access-87c9n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.524092 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ddaf8b56-a560-4f69-8aa2-250b12ac4d4e" (UID: "ddaf8b56-a560-4f69-8aa2-250b12ac4d4e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.620337 4755 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.620414 4755 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.620475 4755 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.620493 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.620512 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.620561 4755 reconciler_common.go:293] "Volume detached for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:39:18 crc kubenswrapper[4755]: I0317 00:39:18.620579 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87c9n\" (UniqueName: \"kubernetes.io/projected/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e-kube-api-access-87c9n\") on node \"crc\" DevicePath \"\"" Mar 17 00:39:19 crc kubenswrapper[4755]: I0317 00:39:19.139415 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-2fcmx_ddaf8b56-a560-4f69-8aa2-250b12ac4d4e/console/0.log" Mar 17 00:39:19 crc kubenswrapper[4755]: I0317 00:39:19.139483 4755 generic.go:334] "Generic (PLEG): container finished" podID="ddaf8b56-a560-4f69-8aa2-250b12ac4d4e" containerID="9d48904bfcf964ae626188153365eabb6fd193f0bbbf6c8c9edfcdc19263ec6c" exitCode=2 Mar 17 00:39:19 crc kubenswrapper[4755]: I0317 00:39:19.139514 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2fcmx" event={"ID":"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e","Type":"ContainerDied","Data":"9d48904bfcf964ae626188153365eabb6fd193f0bbbf6c8c9edfcdc19263ec6c"} Mar 17 00:39:19 crc kubenswrapper[4755]: I0317 00:39:19.139550 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2fcmx" event={"ID":"ddaf8b56-a560-4f69-8aa2-250b12ac4d4e","Type":"ContainerDied","Data":"72bb8e3c66943a62162884bee5ed444e1ec71deaa3b89e937765d426ed257e4f"} Mar 17 00:39:19 crc kubenswrapper[4755]: I0317 00:39:19.139572 4755 scope.go:117] "RemoveContainer" containerID="9d48904bfcf964ae626188153365eabb6fd193f0bbbf6c8c9edfcdc19263ec6c" Mar 17 00:39:19 crc kubenswrapper[4755]: I0317 00:39:19.139601 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-2fcmx" Mar 17 00:39:19 crc kubenswrapper[4755]: I0317 00:39:19.162166 4755 scope.go:117] "RemoveContainer" containerID="9d48904bfcf964ae626188153365eabb6fd193f0bbbf6c8c9edfcdc19263ec6c" Mar 17 00:39:19 crc kubenswrapper[4755]: E0317 00:39:19.162613 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d48904bfcf964ae626188153365eabb6fd193f0bbbf6c8c9edfcdc19263ec6c\": container with ID starting with 9d48904bfcf964ae626188153365eabb6fd193f0bbbf6c8c9edfcdc19263ec6c not found: ID does not exist" containerID="9d48904bfcf964ae626188153365eabb6fd193f0bbbf6c8c9edfcdc19263ec6c" Mar 17 00:39:19 crc kubenswrapper[4755]: I0317 00:39:19.162658 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d48904bfcf964ae626188153365eabb6fd193f0bbbf6c8c9edfcdc19263ec6c"} err="failed to get container status \"9d48904bfcf964ae626188153365eabb6fd193f0bbbf6c8c9edfcdc19263ec6c\": rpc error: code = NotFound desc = could not find container \"9d48904bfcf964ae626188153365eabb6fd193f0bbbf6c8c9edfcdc19263ec6c\": container with ID starting with 9d48904bfcf964ae626188153365eabb6fd193f0bbbf6c8c9edfcdc19263ec6c not found: ID does not exist" Mar 17 00:39:19 crc kubenswrapper[4755]: I0317 00:39:19.185295 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-2fcmx"] Mar 17 00:39:19 crc kubenswrapper[4755]: I0317 00:39:19.191549 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-2fcmx"] Mar 17 00:39:19 crc kubenswrapper[4755]: I0317 00:39:19.291552 4755 patch_prober.go:28] interesting pod/console-f9d7485db-2fcmx container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers)" start-of-body= Mar 17 00:39:19 crc kubenswrapper[4755]: I0317 00:39:19.291621 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-2fcmx" podUID="ddaf8b56-a560-4f69-8aa2-250b12ac4d4e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 17 00:39:20 crc kubenswrapper[4755]: I0317 00:39:20.255853 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddaf8b56-a560-4f69-8aa2-250b12ac4d4e" path="/var/lib/kubelet/pods/ddaf8b56-a560-4f69-8aa2-250b12ac4d4e/volumes" Mar 17 00:39:22 crc kubenswrapper[4755]: I0317 00:39:22.739610 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk"] Mar 17 00:39:22 crc kubenswrapper[4755]: E0317 00:39:22.740091 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddaf8b56-a560-4f69-8aa2-250b12ac4d4e" containerName="console" Mar 17 00:39:22 crc kubenswrapper[4755]: I0317 00:39:22.740102 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddaf8b56-a560-4f69-8aa2-250b12ac4d4e" containerName="console" Mar 17 00:39:22 crc kubenswrapper[4755]: I0317 00:39:22.740216 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddaf8b56-a560-4f69-8aa2-250b12ac4d4e" containerName="console" Mar 17 00:39:22 crc kubenswrapper[4755]: I0317 00:39:22.741154 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" Mar 17 00:39:22 crc kubenswrapper[4755]: I0317 00:39:22.743841 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 17 00:39:22 crc kubenswrapper[4755]: I0317 00:39:22.758095 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk"] Mar 17 00:39:22 crc kubenswrapper[4755]: I0317 00:39:22.857988 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/92740d40-a460-41c9-9f94-eaac2999c3f7-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk\" (UID: \"92740d40-a460-41c9-9f94-eaac2999c3f7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" Mar 17 00:39:22 crc kubenswrapper[4755]: I0317 00:39:22.858089 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmzlt\" (UniqueName: \"kubernetes.io/projected/92740d40-a460-41c9-9f94-eaac2999c3f7-kube-api-access-vmzlt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk\" (UID: \"92740d40-a460-41c9-9f94-eaac2999c3f7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" Mar 17 00:39:22 crc kubenswrapper[4755]: I0317 00:39:22.858220 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/92740d40-a460-41c9-9f94-eaac2999c3f7-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk\" (UID: \"92740d40-a460-41c9-9f94-eaac2999c3f7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" Mar 17 00:39:22 crc kubenswrapper[4755]: 
I0317 00:39:22.959565 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/92740d40-a460-41c9-9f94-eaac2999c3f7-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk\" (UID: \"92740d40-a460-41c9-9f94-eaac2999c3f7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" Mar 17 00:39:22 crc kubenswrapper[4755]: I0317 00:39:22.959689 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/92740d40-a460-41c9-9f94-eaac2999c3f7-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk\" (UID: \"92740d40-a460-41c9-9f94-eaac2999c3f7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" Mar 17 00:39:22 crc kubenswrapper[4755]: I0317 00:39:22.959741 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmzlt\" (UniqueName: \"kubernetes.io/projected/92740d40-a460-41c9-9f94-eaac2999c3f7-kube-api-access-vmzlt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk\" (UID: \"92740d40-a460-41c9-9f94-eaac2999c3f7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" Mar 17 00:39:22 crc kubenswrapper[4755]: I0317 00:39:22.960422 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/92740d40-a460-41c9-9f94-eaac2999c3f7-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk\" (UID: \"92740d40-a460-41c9-9f94-eaac2999c3f7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" Mar 17 00:39:22 crc kubenswrapper[4755]: I0317 00:39:22.960513 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/92740d40-a460-41c9-9f94-eaac2999c3f7-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk\" (UID: \"92740d40-a460-41c9-9f94-eaac2999c3f7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" Mar 17 00:39:22 crc kubenswrapper[4755]: I0317 00:39:22.988751 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmzlt\" (UniqueName: \"kubernetes.io/projected/92740d40-a460-41c9-9f94-eaac2999c3f7-kube-api-access-vmzlt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk\" (UID: \"92740d40-a460-41c9-9f94-eaac2999c3f7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" Mar 17 00:39:23 crc kubenswrapper[4755]: I0317 00:39:23.056499 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" Mar 17 00:39:23 crc kubenswrapper[4755]: I0317 00:39:23.534722 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk"] Mar 17 00:39:23 crc kubenswrapper[4755]: I0317 00:39:23.703156 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" event={"ID":"92740d40-a460-41c9-9f94-eaac2999c3f7","Type":"ContainerStarted","Data":"96189661df6ce2cfad222f14067359bffabb5d31e10f2955c17168a0fa765587"} Mar 17 00:39:24 crc kubenswrapper[4755]: I0317 00:39:24.713392 4755 generic.go:334] "Generic (PLEG): container finished" podID="92740d40-a460-41c9-9f94-eaac2999c3f7" containerID="3c22b8ebfe1020dc23a38eaf00b9e19de722ff07e301a9c51a81df6528b87620" exitCode=0 Mar 17 00:39:24 crc kubenswrapper[4755]: I0317 00:39:24.713473 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" event={"ID":"92740d40-a460-41c9-9f94-eaac2999c3f7","Type":"ContainerDied","Data":"3c22b8ebfe1020dc23a38eaf00b9e19de722ff07e301a9c51a81df6528b87620"} Mar 17 00:39:26 crc kubenswrapper[4755]: I0317 00:39:26.732285 4755 generic.go:334] "Generic (PLEG): container finished" podID="92740d40-a460-41c9-9f94-eaac2999c3f7" containerID="3b442b2cf1df7b3ef9d9e1d5ce67cd581b3915909331b9a982cb7428953ed2b9" exitCode=0 Mar 17 00:39:26 crc kubenswrapper[4755]: I0317 00:39:26.732357 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" event={"ID":"92740d40-a460-41c9-9f94-eaac2999c3f7","Type":"ContainerDied","Data":"3b442b2cf1df7b3ef9d9e1d5ce67cd581b3915909331b9a982cb7428953ed2b9"} Mar 17 00:39:27 crc kubenswrapper[4755]: I0317 00:39:27.742007 4755 generic.go:334] "Generic (PLEG): container finished" podID="92740d40-a460-41c9-9f94-eaac2999c3f7" containerID="e3ef81906597f5c24c60ecbd536e9c14e2956dca7606c38f3289671c010a5b9c" exitCode=0 Mar 17 00:39:27 crc kubenswrapper[4755]: I0317 00:39:27.742073 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" event={"ID":"92740d40-a460-41c9-9f94-eaac2999c3f7","Type":"ContainerDied","Data":"e3ef81906597f5c24c60ecbd536e9c14e2956dca7606c38f3289671c010a5b9c"} Mar 17 00:39:28 crc kubenswrapper[4755]: I0317 00:39:28.665924 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:39:28 crc kubenswrapper[4755]: I0317 00:39:28.666022 4755 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:39:29 crc kubenswrapper[4755]: I0317 00:39:29.166807 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" Mar 17 00:39:29 crc kubenswrapper[4755]: I0317 00:39:29.364103 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/92740d40-a460-41c9-9f94-eaac2999c3f7-util\") pod \"92740d40-a460-41c9-9f94-eaac2999c3f7\" (UID: \"92740d40-a460-41c9-9f94-eaac2999c3f7\") " Mar 17 00:39:29 crc kubenswrapper[4755]: I0317 00:39:29.364324 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/92740d40-a460-41c9-9f94-eaac2999c3f7-bundle\") pod \"92740d40-a460-41c9-9f94-eaac2999c3f7\" (UID: \"92740d40-a460-41c9-9f94-eaac2999c3f7\") " Mar 17 00:39:29 crc kubenswrapper[4755]: I0317 00:39:29.364375 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmzlt\" (UniqueName: \"kubernetes.io/projected/92740d40-a460-41c9-9f94-eaac2999c3f7-kube-api-access-vmzlt\") pod \"92740d40-a460-41c9-9f94-eaac2999c3f7\" (UID: \"92740d40-a460-41c9-9f94-eaac2999c3f7\") " Mar 17 00:39:29 crc kubenswrapper[4755]: I0317 00:39:29.367090 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92740d40-a460-41c9-9f94-eaac2999c3f7-bundle" (OuterVolumeSpecName: "bundle") pod "92740d40-a460-41c9-9f94-eaac2999c3f7" (UID: "92740d40-a460-41c9-9f94-eaac2999c3f7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:39:29 crc kubenswrapper[4755]: I0317 00:39:29.374620 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92740d40-a460-41c9-9f94-eaac2999c3f7-kube-api-access-vmzlt" (OuterVolumeSpecName: "kube-api-access-vmzlt") pod "92740d40-a460-41c9-9f94-eaac2999c3f7" (UID: "92740d40-a460-41c9-9f94-eaac2999c3f7"). InnerVolumeSpecName "kube-api-access-vmzlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:39:29 crc kubenswrapper[4755]: I0317 00:39:29.395614 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92740d40-a460-41c9-9f94-eaac2999c3f7-util" (OuterVolumeSpecName: "util") pod "92740d40-a460-41c9-9f94-eaac2999c3f7" (UID: "92740d40-a460-41c9-9f94-eaac2999c3f7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:39:29 crc kubenswrapper[4755]: I0317 00:39:29.467289 4755 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/92740d40-a460-41c9-9f94-eaac2999c3f7-util\") on node \"crc\" DevicePath \"\"" Mar 17 00:39:29 crc kubenswrapper[4755]: I0317 00:39:29.467356 4755 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/92740d40-a460-41c9-9f94-eaac2999c3f7-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:39:29 crc kubenswrapper[4755]: I0317 00:39:29.467380 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmzlt\" (UniqueName: \"kubernetes.io/projected/92740d40-a460-41c9-9f94-eaac2999c3f7-kube-api-access-vmzlt\") on node \"crc\" DevicePath \"\"" Mar 17 00:39:29 crc kubenswrapper[4755]: I0317 00:39:29.763410 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" 
event={"ID":"92740d40-a460-41c9-9f94-eaac2999c3f7","Type":"ContainerDied","Data":"96189661df6ce2cfad222f14067359bffabb5d31e10f2955c17168a0fa765587"} Mar 17 00:39:29 crc kubenswrapper[4755]: I0317 00:39:29.763510 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96189661df6ce2cfad222f14067359bffabb5d31e10f2955c17168a0fa765587" Mar 17 00:39:29 crc kubenswrapper[4755]: I0317 00:39:29.763547 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.415078 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx"] Mar 17 00:39:41 crc kubenswrapper[4755]: E0317 00:39:41.416339 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92740d40-a460-41c9-9f94-eaac2999c3f7" containerName="util" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.416357 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="92740d40-a460-41c9-9f94-eaac2999c3f7" containerName="util" Mar 17 00:39:41 crc kubenswrapper[4755]: E0317 00:39:41.416375 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92740d40-a460-41c9-9f94-eaac2999c3f7" containerName="pull" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.416388 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="92740d40-a460-41c9-9f94-eaac2999c3f7" containerName="pull" Mar 17 00:39:41 crc kubenswrapper[4755]: E0317 00:39:41.416421 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92740d40-a460-41c9-9f94-eaac2999c3f7" containerName="extract" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.416460 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="92740d40-a460-41c9-9f94-eaac2999c3f7" containerName="extract" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.416714 4755 
memory_manager.go:354] "RemoveStaleState removing state" podUID="92740d40-a460-41c9-9f94-eaac2999c3f7" containerName="extract" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.417821 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.424418 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-4xtd6" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.424640 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.424757 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.424948 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.425009 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.444940 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx"] Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.471624 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9-webhook-cert\") pod \"metallb-operator-controller-manager-7fc56c47d5-cmtzx\" (UID: \"4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9\") " pod="metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.471690 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhb87\" (UniqueName: \"kubernetes.io/projected/4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9-kube-api-access-dhb87\") pod \"metallb-operator-controller-manager-7fc56c47d5-cmtzx\" (UID: \"4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9\") " pod="metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.471715 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9-apiservice-cert\") pod \"metallb-operator-controller-manager-7fc56c47d5-cmtzx\" (UID: \"4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9\") " pod="metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.572610 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9-webhook-cert\") pod \"metallb-operator-controller-manager-7fc56c47d5-cmtzx\" (UID: \"4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9\") " pod="metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.572681 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhb87\" (UniqueName: \"kubernetes.io/projected/4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9-kube-api-access-dhb87\") pod \"metallb-operator-controller-manager-7fc56c47d5-cmtzx\" (UID: \"4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9\") " pod="metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.572706 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9-apiservice-cert\") pod 
\"metallb-operator-controller-manager-7fc56c47d5-cmtzx\" (UID: \"4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9\") " pod="metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.585099 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9-apiservice-cert\") pod \"metallb-operator-controller-manager-7fc56c47d5-cmtzx\" (UID: \"4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9\") " pod="metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.585129 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9-webhook-cert\") pod \"metallb-operator-controller-manager-7fc56c47d5-cmtzx\" (UID: \"4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9\") " pod="metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.587773 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhb87\" (UniqueName: \"kubernetes.io/projected/4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9-kube-api-access-dhb87\") pod \"metallb-operator-controller-manager-7fc56c47d5-cmtzx\" (UID: \"4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9\") " pod="metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.762384 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-578dbc6b-286wk"] Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.763414 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-578dbc6b-286wk" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.765600 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.765877 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.766143 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-hjr7x" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.771654 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.815452 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-578dbc6b-286wk"] Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.882666 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/356ac706-4ec4-49b8-b270-6c8fa35b7d72-webhook-cert\") pod \"metallb-operator-webhook-server-578dbc6b-286wk\" (UID: \"356ac706-4ec4-49b8-b270-6c8fa35b7d72\") " pod="metallb-system/metallb-operator-webhook-server-578dbc6b-286wk" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.882740 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdm8g\" (UniqueName: \"kubernetes.io/projected/356ac706-4ec4-49b8-b270-6c8fa35b7d72-kube-api-access-gdm8g\") pod \"metallb-operator-webhook-server-578dbc6b-286wk\" (UID: \"356ac706-4ec4-49b8-b270-6c8fa35b7d72\") " pod="metallb-system/metallb-operator-webhook-server-578dbc6b-286wk" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.882805 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/356ac706-4ec4-49b8-b270-6c8fa35b7d72-apiservice-cert\") pod \"metallb-operator-webhook-server-578dbc6b-286wk\" (UID: \"356ac706-4ec4-49b8-b270-6c8fa35b7d72\") " pod="metallb-system/metallb-operator-webhook-server-578dbc6b-286wk" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.988595 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/356ac706-4ec4-49b8-b270-6c8fa35b7d72-apiservice-cert\") pod \"metallb-operator-webhook-server-578dbc6b-286wk\" (UID: \"356ac706-4ec4-49b8-b270-6c8fa35b7d72\") " pod="metallb-system/metallb-operator-webhook-server-578dbc6b-286wk" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.988662 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/356ac706-4ec4-49b8-b270-6c8fa35b7d72-webhook-cert\") pod \"metallb-operator-webhook-server-578dbc6b-286wk\" (UID: \"356ac706-4ec4-49b8-b270-6c8fa35b7d72\") " pod="metallb-system/metallb-operator-webhook-server-578dbc6b-286wk" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.988711 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdm8g\" (UniqueName: \"kubernetes.io/projected/356ac706-4ec4-49b8-b270-6c8fa35b7d72-kube-api-access-gdm8g\") pod \"metallb-operator-webhook-server-578dbc6b-286wk\" (UID: \"356ac706-4ec4-49b8-b270-6c8fa35b7d72\") " pod="metallb-system/metallb-operator-webhook-server-578dbc6b-286wk" Mar 17 00:39:41 crc kubenswrapper[4755]: I0317 00:39:41.994111 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/356ac706-4ec4-49b8-b270-6c8fa35b7d72-webhook-cert\") pod \"metallb-operator-webhook-server-578dbc6b-286wk\" (UID: \"356ac706-4ec4-49b8-b270-6c8fa35b7d72\") " pod="metallb-system/metallb-operator-webhook-server-578dbc6b-286wk" Mar 17 00:39:42 crc 
kubenswrapper[4755]: I0317 00:39:42.011161 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/356ac706-4ec4-49b8-b270-6c8fa35b7d72-apiservice-cert\") pod \"metallb-operator-webhook-server-578dbc6b-286wk\" (UID: \"356ac706-4ec4-49b8-b270-6c8fa35b7d72\") " pod="metallb-system/metallb-operator-webhook-server-578dbc6b-286wk" Mar 17 00:39:42 crc kubenswrapper[4755]: I0317 00:39:42.017065 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdm8g\" (UniqueName: \"kubernetes.io/projected/356ac706-4ec4-49b8-b270-6c8fa35b7d72-kube-api-access-gdm8g\") pod \"metallb-operator-webhook-server-578dbc6b-286wk\" (UID: \"356ac706-4ec4-49b8-b270-6c8fa35b7d72\") " pod="metallb-system/metallb-operator-webhook-server-578dbc6b-286wk" Mar 17 00:39:42 crc kubenswrapper[4755]: I0317 00:39:42.140360 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-578dbc6b-286wk" Mar 17 00:39:42 crc kubenswrapper[4755]: I0317 00:39:42.282907 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx"] Mar 17 00:39:42 crc kubenswrapper[4755]: I0317 00:39:42.621811 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-578dbc6b-286wk"] Mar 17 00:39:42 crc kubenswrapper[4755]: I0317 00:39:42.848896 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx" event={"ID":"4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9","Type":"ContainerStarted","Data":"59537095a174c3c1f397ba266b884bf3b80e034245d6dff3db2c09494b1ed550"} Mar 17 00:39:42 crc kubenswrapper[4755]: I0317 00:39:42.850227 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-578dbc6b-286wk" 
event={"ID":"356ac706-4ec4-49b8-b270-6c8fa35b7d72","Type":"ContainerStarted","Data":"3bad9171ba5afa94d56cac30ebf9a1536f21be1e81fc2f4ee8eab277864274aa"} Mar 17 00:39:48 crc kubenswrapper[4755]: I0317 00:39:48.897243 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx" event={"ID":"4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9","Type":"ContainerStarted","Data":"4c2f586d69e730fd32353b2da032a75398041ec8a13eea535acafcb09af015bc"} Mar 17 00:39:48 crc kubenswrapper[4755]: I0317 00:39:48.897951 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx" Mar 17 00:39:48 crc kubenswrapper[4755]: I0317 00:39:48.899146 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-578dbc6b-286wk" event={"ID":"356ac706-4ec4-49b8-b270-6c8fa35b7d72","Type":"ContainerStarted","Data":"1c62fcae90c155914275bc131334d8bc61e8950276bea12a4d4d96db5670e59a"} Mar 17 00:39:48 crc kubenswrapper[4755]: I0317 00:39:48.899528 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-578dbc6b-286wk" Mar 17 00:39:48 crc kubenswrapper[4755]: I0317 00:39:48.932971 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx" podStartSLOduration=2.434222293 podStartE2EDuration="7.932945398s" podCreationTimestamp="2026-03-17 00:39:41 +0000 UTC" firstStartedPulling="2026-03-17 00:39:42.291550338 +0000 UTC m=+1057.051002621" lastFinishedPulling="2026-03-17 00:39:47.790273443 +0000 UTC m=+1062.549725726" observedRunningTime="2026-03-17 00:39:48.93188784 +0000 UTC m=+1063.691340163" watchObservedRunningTime="2026-03-17 00:39:48.932945398 +0000 UTC m=+1063.692397691" Mar 17 00:39:58 crc kubenswrapper[4755]: I0317 00:39:58.664666 4755 patch_prober.go:28] interesting 
pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:39:58 crc kubenswrapper[4755]: I0317 00:39:58.665220 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:39:58 crc kubenswrapper[4755]: I0317 00:39:58.665267 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:39:58 crc kubenswrapper[4755]: I0317 00:39:58.665933 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"68018dade804aadf96db21752f85fdf1b74e75774cca2b6cfb117db003750ae0"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 00:39:58 crc kubenswrapper[4755]: I0317 00:39:58.665995 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://68018dade804aadf96db21752f85fdf1b74e75774cca2b6cfb117db003750ae0" gracePeriod=600 Mar 17 00:39:58 crc kubenswrapper[4755]: I0317 00:39:58.965141 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="68018dade804aadf96db21752f85fdf1b74e75774cca2b6cfb117db003750ae0" exitCode=0 Mar 17 00:39:58 crc kubenswrapper[4755]: I0317 00:39:58.965197 
4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"68018dade804aadf96db21752f85fdf1b74e75774cca2b6cfb117db003750ae0"} Mar 17 00:39:58 crc kubenswrapper[4755]: I0317 00:39:58.965483 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"4a1229170ff8c8c816ffd37af35946e3078d6ce31d139ae04a790479f60fedd5"} Mar 17 00:39:58 crc kubenswrapper[4755]: I0317 00:39:58.965510 4755 scope.go:117] "RemoveContainer" containerID="0cab6d0e05377e82717b632b632ef0344a29e598068da9d56d70ae0349c0c4d1" Mar 17 00:39:58 crc kubenswrapper[4755]: I0317 00:39:58.993264 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-578dbc6b-286wk" podStartSLOduration=12.827414799 podStartE2EDuration="17.993237191s" podCreationTimestamp="2026-03-17 00:39:41 +0000 UTC" firstStartedPulling="2026-03-17 00:39:42.627534533 +0000 UTC m=+1057.386986826" lastFinishedPulling="2026-03-17 00:39:47.793356935 +0000 UTC m=+1062.552809218" observedRunningTime="2026-03-17 00:39:48.962255519 +0000 UTC m=+1063.721707822" watchObservedRunningTime="2026-03-17 00:39:58.993237191 +0000 UTC m=+1073.752689514" Mar 17 00:40:00 crc kubenswrapper[4755]: I0317 00:40:00.136995 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561800-mktcc"] Mar 17 00:40:00 crc kubenswrapper[4755]: I0317 00:40:00.138715 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561800-mktcc" Mar 17 00:40:00 crc kubenswrapper[4755]: I0317 00:40:00.140684 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 00:40:00 crc kubenswrapper[4755]: I0317 00:40:00.140748 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 00:40:00 crc kubenswrapper[4755]: I0317 00:40:00.142246 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 00:40:00 crc kubenswrapper[4755]: I0317 00:40:00.147409 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561800-mktcc"] Mar 17 00:40:00 crc kubenswrapper[4755]: I0317 00:40:00.251967 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w56pp\" (UniqueName: \"kubernetes.io/projected/24671c08-dfdb-4659-836d-83cec2bbbbb8-kube-api-access-w56pp\") pod \"auto-csr-approver-29561800-mktcc\" (UID: \"24671c08-dfdb-4659-836d-83cec2bbbbb8\") " pod="openshift-infra/auto-csr-approver-29561800-mktcc" Mar 17 00:40:00 crc kubenswrapper[4755]: I0317 00:40:00.354233 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w56pp\" (UniqueName: \"kubernetes.io/projected/24671c08-dfdb-4659-836d-83cec2bbbbb8-kube-api-access-w56pp\") pod \"auto-csr-approver-29561800-mktcc\" (UID: \"24671c08-dfdb-4659-836d-83cec2bbbbb8\") " pod="openshift-infra/auto-csr-approver-29561800-mktcc" Mar 17 00:40:00 crc kubenswrapper[4755]: I0317 00:40:00.398275 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w56pp\" (UniqueName: \"kubernetes.io/projected/24671c08-dfdb-4659-836d-83cec2bbbbb8-kube-api-access-w56pp\") pod \"auto-csr-approver-29561800-mktcc\" (UID: \"24671c08-dfdb-4659-836d-83cec2bbbbb8\") " 
pod="openshift-infra/auto-csr-approver-29561800-mktcc" Mar 17 00:40:00 crc kubenswrapper[4755]: I0317 00:40:00.459195 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561800-mktcc" Mar 17 00:40:00 crc kubenswrapper[4755]: I0317 00:40:00.882125 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561800-mktcc"] Mar 17 00:40:00 crc kubenswrapper[4755]: W0317 00:40:00.887483 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24671c08_dfdb_4659_836d_83cec2bbbbb8.slice/crio-1181ef0a22d36affcb3774d341c2aa2c4e246141a724903b2271630f1d1facdb WatchSource:0}: Error finding container 1181ef0a22d36affcb3774d341c2aa2c4e246141a724903b2271630f1d1facdb: Status 404 returned error can't find the container with id 1181ef0a22d36affcb3774d341c2aa2c4e246141a724903b2271630f1d1facdb Mar 17 00:40:00 crc kubenswrapper[4755]: I0317 00:40:00.987458 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561800-mktcc" event={"ID":"24671c08-dfdb-4659-836d-83cec2bbbbb8","Type":"ContainerStarted","Data":"1181ef0a22d36affcb3774d341c2aa2c4e246141a724903b2271630f1d1facdb"} Mar 17 00:40:02 crc kubenswrapper[4755]: I0317 00:40:02.167739 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-578dbc6b-286wk" Mar 17 00:40:03 crc kubenswrapper[4755]: I0317 00:40:03.003749 4755 generic.go:334] "Generic (PLEG): container finished" podID="24671c08-dfdb-4659-836d-83cec2bbbbb8" containerID="4fd19af9daad9393a95e7201d9b24a025c08fa86d11bf77a7e9605b0cb2566a7" exitCode=0 Mar 17 00:40:03 crc kubenswrapper[4755]: I0317 00:40:03.003819 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561800-mktcc" 
event={"ID":"24671c08-dfdb-4659-836d-83cec2bbbbb8","Type":"ContainerDied","Data":"4fd19af9daad9393a95e7201d9b24a025c08fa86d11bf77a7e9605b0cb2566a7"} Mar 17 00:40:04 crc kubenswrapper[4755]: I0317 00:40:04.314590 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561800-mktcc" Mar 17 00:40:04 crc kubenswrapper[4755]: I0317 00:40:04.413896 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w56pp\" (UniqueName: \"kubernetes.io/projected/24671c08-dfdb-4659-836d-83cec2bbbbb8-kube-api-access-w56pp\") pod \"24671c08-dfdb-4659-836d-83cec2bbbbb8\" (UID: \"24671c08-dfdb-4659-836d-83cec2bbbbb8\") " Mar 17 00:40:04 crc kubenswrapper[4755]: I0317 00:40:04.418976 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24671c08-dfdb-4659-836d-83cec2bbbbb8-kube-api-access-w56pp" (OuterVolumeSpecName: "kube-api-access-w56pp") pod "24671c08-dfdb-4659-836d-83cec2bbbbb8" (UID: "24671c08-dfdb-4659-836d-83cec2bbbbb8"). InnerVolumeSpecName "kube-api-access-w56pp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:40:04 crc kubenswrapper[4755]: I0317 00:40:04.516390 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w56pp\" (UniqueName: \"kubernetes.io/projected/24671c08-dfdb-4659-836d-83cec2bbbbb8-kube-api-access-w56pp\") on node \"crc\" DevicePath \"\"" Mar 17 00:40:05 crc kubenswrapper[4755]: I0317 00:40:05.021116 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561800-mktcc" event={"ID":"24671c08-dfdb-4659-836d-83cec2bbbbb8","Type":"ContainerDied","Data":"1181ef0a22d36affcb3774d341c2aa2c4e246141a724903b2271630f1d1facdb"} Mar 17 00:40:05 crc kubenswrapper[4755]: I0317 00:40:05.021638 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1181ef0a22d36affcb3774d341c2aa2c4e246141a724903b2271630f1d1facdb" Mar 17 00:40:05 crc kubenswrapper[4755]: I0317 00:40:05.021220 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561800-mktcc" Mar 17 00:40:05 crc kubenswrapper[4755]: I0317 00:40:05.370773 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561794-8fnz7"] Mar 17 00:40:05 crc kubenswrapper[4755]: I0317 00:40:05.379473 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561794-8fnz7"] Mar 17 00:40:06 crc kubenswrapper[4755]: I0317 00:40:06.271506 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48a3a05f-68ae-4201-b5aa-051d129d70fd" path="/var/lib/kubelet/pods/48a3a05f-68ae-4201-b5aa-051d129d70fd/volumes" Mar 17 00:40:21 crc kubenswrapper[4755]: I0317 00:40:21.769914 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7fc56c47d5-cmtzx" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.662904 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/frr-k8s-58c5n"] Mar 17 00:40:22 crc kubenswrapper[4755]: E0317 00:40:22.663396 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24671c08-dfdb-4659-836d-83cec2bbbbb8" containerName="oc" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.663416 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="24671c08-dfdb-4659-836d-83cec2bbbbb8" containerName="oc" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.663604 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="24671c08-dfdb-4659-836d-83cec2bbbbb8" containerName="oc" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.666503 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.668332 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-ggjkk" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.669044 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.669125 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.690366 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-gv4fl"] Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.691110 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gv4fl" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.691419 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5b9e844c-82cd-4654-9fc4-cf5eb901df30-frr-conf\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.691507 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5b9e844c-82cd-4654-9fc4-cf5eb901df30-frr-sockets\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.691599 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5b9e844c-82cd-4654-9fc4-cf5eb901df30-reloader\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.691659 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b9e844c-82cd-4654-9fc4-cf5eb901df30-metrics-certs\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.691682 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhb58\" (UniqueName: \"kubernetes.io/projected/5b9e844c-82cd-4654-9fc4-cf5eb901df30-kube-api-access-rhb58\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc 
kubenswrapper[4755]: I0317 00:40:22.691782 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5b9e844c-82cd-4654-9fc4-cf5eb901df30-frr-startup\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.691841 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5b9e844c-82cd-4654-9fc4-cf5eb901df30-metrics\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.692326 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.711001 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-gv4fl"] Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.757105 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-ct56p"] Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.758099 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-ct56p" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.759625 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.759655 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.759828 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-vlmvz" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.760789 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.765136 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-hfls9"] Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.766212 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-hfls9" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.767423 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.783371 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-hfls9"] Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.796716 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zzxb\" (UniqueName: \"kubernetes.io/projected/7a47b75a-e6b5-493f-9ec6-8843b2724a32-kube-api-access-8zzxb\") pod \"frr-k8s-webhook-server-bcc4b6f68-gv4fl\" (UID: \"7a47b75a-e6b5-493f-9ec6-8843b2724a32\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gv4fl" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.796770 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b9e844c-82cd-4654-9fc4-cf5eb901df30-metrics-certs\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.796803 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhb58\" (UniqueName: \"kubernetes.io/projected/5b9e844c-82cd-4654-9fc4-cf5eb901df30-kube-api-access-rhb58\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.796843 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/07bf4fdd-648b-425f-8b00-7ad303c2b77f-metallb-excludel2\") pod \"speaker-ct56p\" (UID: \"07bf4fdd-648b-425f-8b00-7ad303c2b77f\") " pod="metallb-system/speaker-ct56p" Mar 17 
00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.796873 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a47b75a-e6b5-493f-9ec6-8843b2724a32-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gv4fl\" (UID: \"7a47b75a-e6b5-493f-9ec6-8843b2724a32\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gv4fl" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.796891 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhpxj\" (UniqueName: \"kubernetes.io/projected/07bf4fdd-648b-425f-8b00-7ad303c2b77f-kube-api-access-zhpxj\") pod \"speaker-ct56p\" (UID: \"07bf4fdd-648b-425f-8b00-7ad303c2b77f\") " pod="metallb-system/speaker-ct56p" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.796923 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5b9e844c-82cd-4654-9fc4-cf5eb901df30-frr-startup\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: E0317 00:40:22.796995 4755 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 17 00:40:22 crc kubenswrapper[4755]: E0317 00:40:22.797082 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9e844c-82cd-4654-9fc4-cf5eb901df30-metrics-certs podName:5b9e844c-82cd-4654-9fc4-cf5eb901df30 nodeName:}" failed. No retries permitted until 2026-03-17 00:40:23.297063086 +0000 UTC m=+1098.056515369 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b9e844c-82cd-4654-9fc4-cf5eb901df30-metrics-certs") pod "frr-k8s-58c5n" (UID: "5b9e844c-82cd-4654-9fc4-cf5eb901df30") : secret "frr-k8s-certs-secret" not found Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.797341 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/07bf4fdd-648b-425f-8b00-7ad303c2b77f-memberlist\") pod \"speaker-ct56p\" (UID: \"07bf4fdd-648b-425f-8b00-7ad303c2b77f\") " pod="metallb-system/speaker-ct56p" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.797433 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-457x8\" (UniqueName: \"kubernetes.io/projected/c2bae47d-2436-490a-8998-6d1f1c59ff6d-kube-api-access-457x8\") pod \"controller-7bb4cc7c98-hfls9\" (UID: \"c2bae47d-2436-490a-8998-6d1f1c59ff6d\") " pod="metallb-system/controller-7bb4cc7c98-hfls9" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.797520 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07bf4fdd-648b-425f-8b00-7ad303c2b77f-metrics-certs\") pod \"speaker-ct56p\" (UID: \"07bf4fdd-648b-425f-8b00-7ad303c2b77f\") " pod="metallb-system/speaker-ct56p" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.797546 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5b9e844c-82cd-4654-9fc4-cf5eb901df30-metrics\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.797568 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c2bae47d-2436-490a-8998-6d1f1c59ff6d-metrics-certs\") pod \"controller-7bb4cc7c98-hfls9\" (UID: \"c2bae47d-2436-490a-8998-6d1f1c59ff6d\") " pod="metallb-system/controller-7bb4cc7c98-hfls9" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.797596 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5b9e844c-82cd-4654-9fc4-cf5eb901df30-frr-conf\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.797638 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5b9e844c-82cd-4654-9fc4-cf5eb901df30-frr-sockets\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.798209 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5b9e844c-82cd-4654-9fc4-cf5eb901df30-metrics\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.798231 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5b9e844c-82cd-4654-9fc4-cf5eb901df30-reloader\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.798314 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2bae47d-2436-490a-8998-6d1f1c59ff6d-cert\") pod \"controller-7bb4cc7c98-hfls9\" (UID: \"c2bae47d-2436-490a-8998-6d1f1c59ff6d\") " 
pod="metallb-system/controller-7bb4cc7c98-hfls9" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.798538 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5b9e844c-82cd-4654-9fc4-cf5eb901df30-frr-startup\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.800140 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5b9e844c-82cd-4654-9fc4-cf5eb901df30-frr-conf\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.800321 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5b9e844c-82cd-4654-9fc4-cf5eb901df30-reloader\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.801985 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5b9e844c-82cd-4654-9fc4-cf5eb901df30-frr-sockets\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.822581 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhb58\" (UniqueName: \"kubernetes.io/projected/5b9e844c-82cd-4654-9fc4-cf5eb901df30-kube-api-access-rhb58\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.899601 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/c2bae47d-2436-490a-8998-6d1f1c59ff6d-cert\") pod \"controller-7bb4cc7c98-hfls9\" (UID: \"c2bae47d-2436-490a-8998-6d1f1c59ff6d\") " pod="metallb-system/controller-7bb4cc7c98-hfls9" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.899661 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zzxb\" (UniqueName: \"kubernetes.io/projected/7a47b75a-e6b5-493f-9ec6-8843b2724a32-kube-api-access-8zzxb\") pod \"frr-k8s-webhook-server-bcc4b6f68-gv4fl\" (UID: \"7a47b75a-e6b5-493f-9ec6-8843b2724a32\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gv4fl" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.899707 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/07bf4fdd-648b-425f-8b00-7ad303c2b77f-metallb-excludel2\") pod \"speaker-ct56p\" (UID: \"07bf4fdd-648b-425f-8b00-7ad303c2b77f\") " pod="metallb-system/speaker-ct56p" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.899729 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhpxj\" (UniqueName: \"kubernetes.io/projected/07bf4fdd-648b-425f-8b00-7ad303c2b77f-kube-api-access-zhpxj\") pod \"speaker-ct56p\" (UID: \"07bf4fdd-648b-425f-8b00-7ad303c2b77f\") " pod="metallb-system/speaker-ct56p" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.899744 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a47b75a-e6b5-493f-9ec6-8843b2724a32-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gv4fl\" (UID: \"7a47b75a-e6b5-493f-9ec6-8843b2724a32\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gv4fl" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.899780 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/07bf4fdd-648b-425f-8b00-7ad303c2b77f-memberlist\") pod \"speaker-ct56p\" (UID: \"07bf4fdd-648b-425f-8b00-7ad303c2b77f\") " pod="metallb-system/speaker-ct56p" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.899801 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-457x8\" (UniqueName: \"kubernetes.io/projected/c2bae47d-2436-490a-8998-6d1f1c59ff6d-kube-api-access-457x8\") pod \"controller-7bb4cc7c98-hfls9\" (UID: \"c2bae47d-2436-490a-8998-6d1f1c59ff6d\") " pod="metallb-system/controller-7bb4cc7c98-hfls9" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.899823 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07bf4fdd-648b-425f-8b00-7ad303c2b77f-metrics-certs\") pod \"speaker-ct56p\" (UID: \"07bf4fdd-648b-425f-8b00-7ad303c2b77f\") " pod="metallb-system/speaker-ct56p" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.899839 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2bae47d-2436-490a-8998-6d1f1c59ff6d-metrics-certs\") pod \"controller-7bb4cc7c98-hfls9\" (UID: \"c2bae47d-2436-490a-8998-6d1f1c59ff6d\") " pod="metallb-system/controller-7bb4cc7c98-hfls9" Mar 17 00:40:22 crc kubenswrapper[4755]: E0317 00:40:22.899867 4755 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 17 00:40:22 crc kubenswrapper[4755]: E0317 00:40:22.899885 4755 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 17 00:40:22 crc kubenswrapper[4755]: E0317 00:40:22.899919 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07bf4fdd-648b-425f-8b00-7ad303c2b77f-memberlist podName:07bf4fdd-648b-425f-8b00-7ad303c2b77f nodeName:}" failed. 
No retries permitted until 2026-03-17 00:40:23.399904737 +0000 UTC m=+1098.159357020 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/07bf4fdd-648b-425f-8b00-7ad303c2b77f-memberlist") pod "speaker-ct56p" (UID: "07bf4fdd-648b-425f-8b00-7ad303c2b77f") : secret "metallb-memberlist" not found Mar 17 00:40:22 crc kubenswrapper[4755]: E0317 00:40:22.899940 4755 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 17 00:40:22 crc kubenswrapper[4755]: E0317 00:40:22.899944 4755 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 17 00:40:22 crc kubenswrapper[4755]: E0317 00:40:22.899967 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a47b75a-e6b5-493f-9ec6-8843b2724a32-cert podName:7a47b75a-e6b5-493f-9ec6-8843b2724a32 nodeName:}" failed. No retries permitted until 2026-03-17 00:40:23.399945719 +0000 UTC m=+1098.159398112 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a47b75a-e6b5-493f-9ec6-8843b2724a32-cert") pod "frr-k8s-webhook-server-bcc4b6f68-gv4fl" (UID: "7a47b75a-e6b5-493f-9ec6-8843b2724a32") : secret "frr-k8s-webhook-server-cert" not found Mar 17 00:40:22 crc kubenswrapper[4755]: E0317 00:40:22.899994 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07bf4fdd-648b-425f-8b00-7ad303c2b77f-metrics-certs podName:07bf4fdd-648b-425f-8b00-7ad303c2b77f nodeName:}" failed. No retries permitted until 2026-03-17 00:40:23.399978329 +0000 UTC m=+1098.159430732 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07bf4fdd-648b-425f-8b00-7ad303c2b77f-metrics-certs") pod "speaker-ct56p" (UID: "07bf4fdd-648b-425f-8b00-7ad303c2b77f") : secret "speaker-certs-secret" not found Mar 17 00:40:22 crc kubenswrapper[4755]: E0317 00:40:22.900014 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2bae47d-2436-490a-8998-6d1f1c59ff6d-metrics-certs podName:c2bae47d-2436-490a-8998-6d1f1c59ff6d nodeName:}" failed. No retries permitted until 2026-03-17 00:40:23.40000611 +0000 UTC m=+1098.159458543 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2bae47d-2436-490a-8998-6d1f1c59ff6d-metrics-certs") pod "controller-7bb4cc7c98-hfls9" (UID: "c2bae47d-2436-490a-8998-6d1f1c59ff6d") : secret "controller-certs-secret" not found Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.900741 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/07bf4fdd-648b-425f-8b00-7ad303c2b77f-metallb-excludel2\") pod \"speaker-ct56p\" (UID: \"07bf4fdd-648b-425f-8b00-7ad303c2b77f\") " pod="metallb-system/speaker-ct56p" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.901909 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.915368 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2bae47d-2436-490a-8998-6d1f1c59ff6d-cert\") pod \"controller-7bb4cc7c98-hfls9\" (UID: \"c2bae47d-2436-490a-8998-6d1f1c59ff6d\") " pod="metallb-system/controller-7bb4cc7c98-hfls9" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.915866 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhpxj\" (UniqueName: 
\"kubernetes.io/projected/07bf4fdd-648b-425f-8b00-7ad303c2b77f-kube-api-access-zhpxj\") pod \"speaker-ct56p\" (UID: \"07bf4fdd-648b-425f-8b00-7ad303c2b77f\") " pod="metallb-system/speaker-ct56p" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.918921 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zzxb\" (UniqueName: \"kubernetes.io/projected/7a47b75a-e6b5-493f-9ec6-8843b2724a32-kube-api-access-8zzxb\") pod \"frr-k8s-webhook-server-bcc4b6f68-gv4fl\" (UID: \"7a47b75a-e6b5-493f-9ec6-8843b2724a32\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gv4fl" Mar 17 00:40:22 crc kubenswrapper[4755]: I0317 00:40:22.924139 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-457x8\" (UniqueName: \"kubernetes.io/projected/c2bae47d-2436-490a-8998-6d1f1c59ff6d-kube-api-access-457x8\") pod \"controller-7bb4cc7c98-hfls9\" (UID: \"c2bae47d-2436-490a-8998-6d1f1c59ff6d\") " pod="metallb-system/controller-7bb4cc7c98-hfls9" Mar 17 00:40:23 crc kubenswrapper[4755]: I0317 00:40:23.150834 4755 scope.go:117] "RemoveContainer" containerID="db55da9b29e5486aa78520af96ac7dda3f64cf4fd57482e95c39d81714c01ac9" Mar 17 00:40:23 crc kubenswrapper[4755]: I0317 00:40:23.304746 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b9e844c-82cd-4654-9fc4-cf5eb901df30-metrics-certs\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:23 crc kubenswrapper[4755]: I0317 00:40:23.309406 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b9e844c-82cd-4654-9fc4-cf5eb901df30-metrics-certs\") pod \"frr-k8s-58c5n\" (UID: \"5b9e844c-82cd-4654-9fc4-cf5eb901df30\") " pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:23 crc kubenswrapper[4755]: I0317 00:40:23.405678 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a47b75a-e6b5-493f-9ec6-8843b2724a32-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gv4fl\" (UID: \"7a47b75a-e6b5-493f-9ec6-8843b2724a32\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gv4fl" Mar 17 00:40:23 crc kubenswrapper[4755]: I0317 00:40:23.405741 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/07bf4fdd-648b-425f-8b00-7ad303c2b77f-memberlist\") pod \"speaker-ct56p\" (UID: \"07bf4fdd-648b-425f-8b00-7ad303c2b77f\") " pod="metallb-system/speaker-ct56p" Mar 17 00:40:23 crc kubenswrapper[4755]: I0317 00:40:23.405777 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07bf4fdd-648b-425f-8b00-7ad303c2b77f-metrics-certs\") pod \"speaker-ct56p\" (UID: \"07bf4fdd-648b-425f-8b00-7ad303c2b77f\") " pod="metallb-system/speaker-ct56p" Mar 17 00:40:23 crc kubenswrapper[4755]: I0317 00:40:23.405807 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2bae47d-2436-490a-8998-6d1f1c59ff6d-metrics-certs\") pod \"controller-7bb4cc7c98-hfls9\" (UID: \"c2bae47d-2436-490a-8998-6d1f1c59ff6d\") " pod="metallb-system/controller-7bb4cc7c98-hfls9" Mar 17 00:40:23 crc kubenswrapper[4755]: E0317 00:40:23.406593 4755 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 17 00:40:23 crc kubenswrapper[4755]: E0317 00:40:23.406824 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07bf4fdd-648b-425f-8b00-7ad303c2b77f-memberlist podName:07bf4fdd-648b-425f-8b00-7ad303c2b77f nodeName:}" failed. No retries permitted until 2026-03-17 00:40:24.406778497 +0000 UTC m=+1099.166230790 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/07bf4fdd-648b-425f-8b00-7ad303c2b77f-memberlist") pod "speaker-ct56p" (UID: "07bf4fdd-648b-425f-8b00-7ad303c2b77f") : secret "metallb-memberlist" not found Mar 17 00:40:23 crc kubenswrapper[4755]: I0317 00:40:23.409930 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07bf4fdd-648b-425f-8b00-7ad303c2b77f-metrics-certs\") pod \"speaker-ct56p\" (UID: \"07bf4fdd-648b-425f-8b00-7ad303c2b77f\") " pod="metallb-system/speaker-ct56p" Mar 17 00:40:23 crc kubenswrapper[4755]: I0317 00:40:23.410875 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2bae47d-2436-490a-8998-6d1f1c59ff6d-metrics-certs\") pod \"controller-7bb4cc7c98-hfls9\" (UID: \"c2bae47d-2436-490a-8998-6d1f1c59ff6d\") " pod="metallb-system/controller-7bb4cc7c98-hfls9" Mar 17 00:40:23 crc kubenswrapper[4755]: I0317 00:40:23.411111 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a47b75a-e6b5-493f-9ec6-8843b2724a32-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gv4fl\" (UID: \"7a47b75a-e6b5-493f-9ec6-8843b2724a32\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gv4fl" Mar 17 00:40:23 crc kubenswrapper[4755]: I0317 00:40:23.589018 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:23 crc kubenswrapper[4755]: I0317 00:40:23.606429 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gv4fl" Mar 17 00:40:23 crc kubenswrapper[4755]: I0317 00:40:23.682085 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-hfls9" Mar 17 00:40:23 crc kubenswrapper[4755]: I0317 00:40:23.758980 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 00:40:23 crc kubenswrapper[4755]: I0317 00:40:23.993247 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-hfls9"] Mar 17 00:40:23 crc kubenswrapper[4755]: W0317 00:40:23.997149 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2bae47d_2436_490a_8998_6d1f1c59ff6d.slice/crio-bc400c79ee656641b8658a12e9062372284e71c73d938b4f9839ac317bc8b3e7 WatchSource:0}: Error finding container bc400c79ee656641b8658a12e9062372284e71c73d938b4f9839ac317bc8b3e7: Status 404 returned error can't find the container with id bc400c79ee656641b8658a12e9062372284e71c73d938b4f9839ac317bc8b3e7 Mar 17 00:40:24 crc kubenswrapper[4755]: I0317 00:40:24.107841 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-gv4fl"] Mar 17 00:40:24 crc kubenswrapper[4755]: W0317 00:40:24.114767 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a47b75a_e6b5_493f_9ec6_8843b2724a32.slice/crio-1a49ce9b088034bed747c1d573d705637b108e2a19847aff4299d1b47d9c4557 WatchSource:0}: Error finding container 1a49ce9b088034bed747c1d573d705637b108e2a19847aff4299d1b47d9c4557: Status 404 returned error can't find the container with id 1a49ce9b088034bed747c1d573d705637b108e2a19847aff4299d1b47d9c4557 Mar 17 00:40:24 crc kubenswrapper[4755]: I0317 00:40:24.180927 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-hfls9" event={"ID":"c2bae47d-2436-490a-8998-6d1f1c59ff6d","Type":"ContainerStarted","Data":"048cc1a51a69bc49305d0ece65963a47b0386f6320b85a474be1dc044762b9c8"} Mar 17 00:40:24 
crc kubenswrapper[4755]: I0317 00:40:24.180983 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-hfls9" event={"ID":"c2bae47d-2436-490a-8998-6d1f1c59ff6d","Type":"ContainerStarted","Data":"bc400c79ee656641b8658a12e9062372284e71c73d938b4f9839ac317bc8b3e7"} Mar 17 00:40:24 crc kubenswrapper[4755]: I0317 00:40:24.182303 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gv4fl" event={"ID":"7a47b75a-e6b5-493f-9ec6-8843b2724a32","Type":"ContainerStarted","Data":"1a49ce9b088034bed747c1d573d705637b108e2a19847aff4299d1b47d9c4557"} Mar 17 00:40:24 crc kubenswrapper[4755]: I0317 00:40:24.183261 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-58c5n" event={"ID":"5b9e844c-82cd-4654-9fc4-cf5eb901df30","Type":"ContainerStarted","Data":"3d4dcaa6cb853b0786a56aa9fac1618b29ac75db1fd760512cb780ceef4a90f2"} Mar 17 00:40:24 crc kubenswrapper[4755]: I0317 00:40:24.422040 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/07bf4fdd-648b-425f-8b00-7ad303c2b77f-memberlist\") pod \"speaker-ct56p\" (UID: \"07bf4fdd-648b-425f-8b00-7ad303c2b77f\") " pod="metallb-system/speaker-ct56p" Mar 17 00:40:24 crc kubenswrapper[4755]: I0317 00:40:24.431185 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/07bf4fdd-648b-425f-8b00-7ad303c2b77f-memberlist\") pod \"speaker-ct56p\" (UID: \"07bf4fdd-648b-425f-8b00-7ad303c2b77f\") " pod="metallb-system/speaker-ct56p" Mar 17 00:40:24 crc kubenswrapper[4755]: I0317 00:40:24.570653 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-ct56p" Mar 17 00:40:24 crc kubenswrapper[4755]: W0317 00:40:24.597241 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07bf4fdd_648b_425f_8b00_7ad303c2b77f.slice/crio-d9b56ad7f8ec58ae0200d724244084b333397345edbeb6f5001f0fdb7ef19417 WatchSource:0}: Error finding container d9b56ad7f8ec58ae0200d724244084b333397345edbeb6f5001f0fdb7ef19417: Status 404 returned error can't find the container with id d9b56ad7f8ec58ae0200d724244084b333397345edbeb6f5001f0fdb7ef19417 Mar 17 00:40:25 crc kubenswrapper[4755]: I0317 00:40:25.195333 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-hfls9" event={"ID":"c2bae47d-2436-490a-8998-6d1f1c59ff6d","Type":"ContainerStarted","Data":"25a165eb83607382e22be1cc6055963d5435c83fff886cf5d72feb61849af341"} Mar 17 00:40:25 crc kubenswrapper[4755]: I0317 00:40:25.195785 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-hfls9" Mar 17 00:40:25 crc kubenswrapper[4755]: I0317 00:40:25.196998 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ct56p" event={"ID":"07bf4fdd-648b-425f-8b00-7ad303c2b77f","Type":"ContainerStarted","Data":"d021e9814501bc9c371af32c0848fc3aa7440e036254586b3a3a0b7fa2455f5d"} Mar 17 00:40:25 crc kubenswrapper[4755]: I0317 00:40:25.197036 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ct56p" event={"ID":"07bf4fdd-648b-425f-8b00-7ad303c2b77f","Type":"ContainerStarted","Data":"888763b54e5ddf128e2e5f902520d980c5f54f28383935a7bfb8cf35d3363be4"} Mar 17 00:40:25 crc kubenswrapper[4755]: I0317 00:40:25.197045 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ct56p" 
event={"ID":"07bf4fdd-648b-425f-8b00-7ad303c2b77f","Type":"ContainerStarted","Data":"d9b56ad7f8ec58ae0200d724244084b333397345edbeb6f5001f0fdb7ef19417"} Mar 17 00:40:25 crc kubenswrapper[4755]: I0317 00:40:25.197308 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-ct56p" Mar 17 00:40:25 crc kubenswrapper[4755]: I0317 00:40:25.220430 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-hfls9" podStartSLOduration=3.220413645 podStartE2EDuration="3.220413645s" podCreationTimestamp="2026-03-17 00:40:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:40:25.218877974 +0000 UTC m=+1099.978330267" watchObservedRunningTime="2026-03-17 00:40:25.220413645 +0000 UTC m=+1099.979865928" Mar 17 00:40:25 crc kubenswrapper[4755]: I0317 00:40:25.240639 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-ct56p" podStartSLOduration=3.240624523 podStartE2EDuration="3.240624523s" podCreationTimestamp="2026-03-17 00:40:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:40:25.234083659 +0000 UTC m=+1099.993535932" watchObservedRunningTime="2026-03-17 00:40:25.240624523 +0000 UTC m=+1100.000076806" Mar 17 00:40:32 crc kubenswrapper[4755]: I0317 00:40:32.264455 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gv4fl" event={"ID":"7a47b75a-e6b5-493f-9ec6-8843b2724a32","Type":"ContainerStarted","Data":"23a6510ef1ba29e4aa327c481432afd6f079e3cd1c775a5446b898af9e89ae63"} Mar 17 00:40:32 crc kubenswrapper[4755]: I0317 00:40:32.265121 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gv4fl" Mar 17 00:40:32 crc 
kubenswrapper[4755]: I0317 00:40:32.267068 4755 generic.go:334] "Generic (PLEG): container finished" podID="5b9e844c-82cd-4654-9fc4-cf5eb901df30" containerID="f1ee8db1280453a18c31ee7cd22347ecbce13c05002ea1f1f089076fa6b430b0" exitCode=0 Mar 17 00:40:32 crc kubenswrapper[4755]: I0317 00:40:32.267109 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-58c5n" event={"ID":"5b9e844c-82cd-4654-9fc4-cf5eb901df30","Type":"ContainerDied","Data":"f1ee8db1280453a18c31ee7cd22347ecbce13c05002ea1f1f089076fa6b430b0"} Mar 17 00:40:32 crc kubenswrapper[4755]: I0317 00:40:32.317630 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gv4fl" podStartSLOduration=3.00165948 podStartE2EDuration="10.31760746s" podCreationTimestamp="2026-03-17 00:40:22 +0000 UTC" firstStartedPulling="2026-03-17 00:40:24.117323564 +0000 UTC m=+1098.876775847" lastFinishedPulling="2026-03-17 00:40:31.433271544 +0000 UTC m=+1106.192723827" observedRunningTime="2026-03-17 00:40:32.286801452 +0000 UTC m=+1107.046253755" watchObservedRunningTime="2026-03-17 00:40:32.31760746 +0000 UTC m=+1107.077059783" Mar 17 00:40:33 crc kubenswrapper[4755]: I0317 00:40:33.277708 4755 generic.go:334] "Generic (PLEG): container finished" podID="5b9e844c-82cd-4654-9fc4-cf5eb901df30" containerID="804e4d53fae777fc1bfc54e0828d93b693a17b6f7b2f1c829ab61b2d8dc8c773" exitCode=0 Mar 17 00:40:33 crc kubenswrapper[4755]: I0317 00:40:33.277793 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-58c5n" event={"ID":"5b9e844c-82cd-4654-9fc4-cf5eb901df30","Type":"ContainerDied","Data":"804e4d53fae777fc1bfc54e0828d93b693a17b6f7b2f1c829ab61b2d8dc8c773"} Mar 17 00:40:34 crc kubenswrapper[4755]: I0317 00:40:34.290885 4755 generic.go:334] "Generic (PLEG): container finished" podID="5b9e844c-82cd-4654-9fc4-cf5eb901df30" containerID="444b94cefe1a8228efa2306f7755773a9533b63ad0635e281cc01e8d29d2560f" exitCode=0 Mar 17 
00:40:34 crc kubenswrapper[4755]: I0317 00:40:34.291008 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-58c5n" event={"ID":"5b9e844c-82cd-4654-9fc4-cf5eb901df30","Type":"ContainerDied","Data":"444b94cefe1a8228efa2306f7755773a9533b63ad0635e281cc01e8d29d2560f"} Mar 17 00:40:34 crc kubenswrapper[4755]: I0317 00:40:34.579843 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-ct56p" Mar 17 00:40:35 crc kubenswrapper[4755]: I0317 00:40:35.308636 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-58c5n" event={"ID":"5b9e844c-82cd-4654-9fc4-cf5eb901df30","Type":"ContainerStarted","Data":"dfb300020f4df0474bf765de3090f88f53a23f06061a3d884f607ef7085816e0"} Mar 17 00:40:35 crc kubenswrapper[4755]: I0317 00:40:35.308682 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-58c5n" event={"ID":"5b9e844c-82cd-4654-9fc4-cf5eb901df30","Type":"ContainerStarted","Data":"d338f3aa790721eedb92c8af0800108f0e7582f4293515c051e7252d60eda59b"} Mar 17 00:40:35 crc kubenswrapper[4755]: I0317 00:40:35.308696 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-58c5n" event={"ID":"5b9e844c-82cd-4654-9fc4-cf5eb901df30","Type":"ContainerStarted","Data":"a1daadd00b1571732ebb9734b83653429400169b40f83fbd87fc45f5f837256c"} Mar 17 00:40:35 crc kubenswrapper[4755]: I0317 00:40:35.308711 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-58c5n" event={"ID":"5b9e844c-82cd-4654-9fc4-cf5eb901df30","Type":"ContainerStarted","Data":"62537af13c64f4c1495e742e8c57cc5e95c0c6d14e4c7afed14f7de2bb591777"} Mar 17 00:40:35 crc kubenswrapper[4755]: I0317 00:40:35.308724 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-58c5n" event={"ID":"5b9e844c-82cd-4654-9fc4-cf5eb901df30","Type":"ContainerStarted","Data":"e19668cee7446b2282d48129f8fcb1fc2b73ea6bf99b0966b1a9fec60deca2bc"} Mar 17 
00:40:36 crc kubenswrapper[4755]: I0317 00:40:36.322676 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-58c5n" event={"ID":"5b9e844c-82cd-4654-9fc4-cf5eb901df30","Type":"ContainerStarted","Data":"11afc1ef9eee5c7bd37b83c3e06ac042e6e87cad6dec79ac233a5b6ffaf6f19d"} Mar 17 00:40:36 crc kubenswrapper[4755]: I0317 00:40:36.323365 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:36 crc kubenswrapper[4755]: I0317 00:40:36.355376 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-58c5n" podStartSLOduration=6.701086221 podStartE2EDuration="14.355355034s" podCreationTimestamp="2026-03-17 00:40:22 +0000 UTC" firstStartedPulling="2026-03-17 00:40:23.758772058 +0000 UTC m=+1098.518224341" lastFinishedPulling="2026-03-17 00:40:31.413040871 +0000 UTC m=+1106.172493154" observedRunningTime="2026-03-17 00:40:36.347707041 +0000 UTC m=+1111.107159354" watchObservedRunningTime="2026-03-17 00:40:36.355355034 +0000 UTC m=+1111.114807337" Mar 17 00:40:38 crc kubenswrapper[4755]: I0317 00:40:38.589645 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:38 crc kubenswrapper[4755]: I0317 00:40:38.652638 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:40 crc kubenswrapper[4755]: I0317 00:40:40.783142 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-h4b7d"] Mar 17 00:40:40 crc kubenswrapper[4755]: I0317 00:40:40.784275 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-h4b7d" Mar 17 00:40:40 crc kubenswrapper[4755]: I0317 00:40:40.787076 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 17 00:40:40 crc kubenswrapper[4755]: I0317 00:40:40.787207 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-9bzlc" Mar 17 00:40:40 crc kubenswrapper[4755]: I0317 00:40:40.787377 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 17 00:40:40 crc kubenswrapper[4755]: I0317 00:40:40.793169 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-h4b7d"] Mar 17 00:40:40 crc kubenswrapper[4755]: I0317 00:40:40.812771 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdhhm\" (UniqueName: \"kubernetes.io/projected/d22580ef-38c7-4b1a-95a3-c6a7507ba05a-kube-api-access-gdhhm\") pod \"openstack-operator-index-h4b7d\" (UID: \"d22580ef-38c7-4b1a-95a3-c6a7507ba05a\") " pod="openstack-operators/openstack-operator-index-h4b7d" Mar 17 00:40:40 crc kubenswrapper[4755]: I0317 00:40:40.913759 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdhhm\" (UniqueName: \"kubernetes.io/projected/d22580ef-38c7-4b1a-95a3-c6a7507ba05a-kube-api-access-gdhhm\") pod \"openstack-operator-index-h4b7d\" (UID: \"d22580ef-38c7-4b1a-95a3-c6a7507ba05a\") " pod="openstack-operators/openstack-operator-index-h4b7d" Mar 17 00:40:40 crc kubenswrapper[4755]: I0317 00:40:40.936802 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdhhm\" (UniqueName: \"kubernetes.io/projected/d22580ef-38c7-4b1a-95a3-c6a7507ba05a-kube-api-access-gdhhm\") pod \"openstack-operator-index-h4b7d\" (UID: 
\"d22580ef-38c7-4b1a-95a3-c6a7507ba05a\") " pod="openstack-operators/openstack-operator-index-h4b7d" Mar 17 00:40:41 crc kubenswrapper[4755]: I0317 00:40:41.101503 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-h4b7d" Mar 17 00:40:41 crc kubenswrapper[4755]: I0317 00:40:41.582366 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-h4b7d"] Mar 17 00:40:41 crc kubenswrapper[4755]: W0317 00:40:41.584583 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd22580ef_38c7_4b1a_95a3_c6a7507ba05a.slice/crio-754704c11bc6cfba521afd3d4850ea86c53d8bdae3b2fa163aaa469f9f48a10e WatchSource:0}: Error finding container 754704c11bc6cfba521afd3d4850ea86c53d8bdae3b2fa163aaa469f9f48a10e: Status 404 returned error can't find the container with id 754704c11bc6cfba521afd3d4850ea86c53d8bdae3b2fa163aaa469f9f48a10e Mar 17 00:40:42 crc kubenswrapper[4755]: I0317 00:40:42.376565 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h4b7d" event={"ID":"d22580ef-38c7-4b1a-95a3-c6a7507ba05a","Type":"ContainerStarted","Data":"754704c11bc6cfba521afd3d4850ea86c53d8bdae3b2fa163aaa469f9f48a10e"} Mar 17 00:40:43 crc kubenswrapper[4755]: I0317 00:40:43.618403 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gv4fl" Mar 17 00:40:43 crc kubenswrapper[4755]: I0317 00:40:43.693607 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-hfls9" Mar 17 00:40:44 crc kubenswrapper[4755]: I0317 00:40:44.394647 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h4b7d" 
event={"ID":"d22580ef-38c7-4b1a-95a3-c6a7507ba05a","Type":"ContainerStarted","Data":"1867d8172632129d5e074558e039c6cc432fd4bd8149d46444d503d1122450b0"} Mar 17 00:40:44 crc kubenswrapper[4755]: I0317 00:40:44.415238 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-h4b7d" podStartSLOduration=1.994322895 podStartE2EDuration="4.415215676s" podCreationTimestamp="2026-03-17 00:40:40 +0000 UTC" firstStartedPulling="2026-03-17 00:40:41.588286764 +0000 UTC m=+1116.347739047" lastFinishedPulling="2026-03-17 00:40:44.009179545 +0000 UTC m=+1118.768631828" observedRunningTime="2026-03-17 00:40:44.40780126 +0000 UTC m=+1119.167253553" watchObservedRunningTime="2026-03-17 00:40:44.415215676 +0000 UTC m=+1119.174667969" Mar 17 00:40:51 crc kubenswrapper[4755]: I0317 00:40:51.102609 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-h4b7d" Mar 17 00:40:51 crc kubenswrapper[4755]: I0317 00:40:51.103640 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-h4b7d" Mar 17 00:40:51 crc kubenswrapper[4755]: I0317 00:40:51.146683 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-h4b7d" Mar 17 00:40:51 crc kubenswrapper[4755]: I0317 00:40:51.485514 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-h4b7d" Mar 17 00:40:53 crc kubenswrapper[4755]: I0317 00:40:53.223397 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q"] Mar 17 00:40:53 crc kubenswrapper[4755]: I0317 00:40:53.226148 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" Mar 17 00:40:53 crc kubenswrapper[4755]: I0317 00:40:53.228992 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-spw8d" Mar 17 00:40:53 crc kubenswrapper[4755]: I0317 00:40:53.240594 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q"] Mar 17 00:40:53 crc kubenswrapper[4755]: I0317 00:40:53.246412 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27588406-a74c-454c-84be-38da41fe4737-bundle\") pod \"8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q\" (UID: \"27588406-a74c-454c-84be-38da41fe4737\") " pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" Mar 17 00:40:53 crc kubenswrapper[4755]: I0317 00:40:53.246663 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27588406-a74c-454c-84be-38da41fe4737-util\") pod \"8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q\" (UID: \"27588406-a74c-454c-84be-38da41fe4737\") " pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" Mar 17 00:40:53 crc kubenswrapper[4755]: I0317 00:40:53.246822 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwmtr\" (UniqueName: \"kubernetes.io/projected/27588406-a74c-454c-84be-38da41fe4737-kube-api-access-qwmtr\") pod \"8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q\" (UID: \"27588406-a74c-454c-84be-38da41fe4737\") " pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" Mar 17 00:40:53 crc kubenswrapper[4755]: I0317 
00:40:53.348161 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27588406-a74c-454c-84be-38da41fe4737-util\") pod \"8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q\" (UID: \"27588406-a74c-454c-84be-38da41fe4737\") " pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" Mar 17 00:40:53 crc kubenswrapper[4755]: I0317 00:40:53.348256 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwmtr\" (UniqueName: \"kubernetes.io/projected/27588406-a74c-454c-84be-38da41fe4737-kube-api-access-qwmtr\") pod \"8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q\" (UID: \"27588406-a74c-454c-84be-38da41fe4737\") " pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" Mar 17 00:40:53 crc kubenswrapper[4755]: I0317 00:40:53.348288 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27588406-a74c-454c-84be-38da41fe4737-bundle\") pod \"8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q\" (UID: \"27588406-a74c-454c-84be-38da41fe4737\") " pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" Mar 17 00:40:53 crc kubenswrapper[4755]: I0317 00:40:53.348807 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27588406-a74c-454c-84be-38da41fe4737-bundle\") pod \"8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q\" (UID: \"27588406-a74c-454c-84be-38da41fe4737\") " pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" Mar 17 00:40:53 crc kubenswrapper[4755]: I0317 00:40:53.349182 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/27588406-a74c-454c-84be-38da41fe4737-util\") pod \"8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q\" (UID: \"27588406-a74c-454c-84be-38da41fe4737\") " pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" Mar 17 00:40:53 crc kubenswrapper[4755]: I0317 00:40:53.369355 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwmtr\" (UniqueName: \"kubernetes.io/projected/27588406-a74c-454c-84be-38da41fe4737-kube-api-access-qwmtr\") pod \"8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q\" (UID: \"27588406-a74c-454c-84be-38da41fe4737\") " pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" Mar 17 00:40:53 crc kubenswrapper[4755]: I0317 00:40:53.555583 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" Mar 17 00:40:53 crc kubenswrapper[4755]: I0317 00:40:53.595003 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-58c5n" Mar 17 00:40:54 crc kubenswrapper[4755]: I0317 00:40:54.047127 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q"] Mar 17 00:40:54 crc kubenswrapper[4755]: W0317 00:40:54.055598 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27588406_a74c_454c_84be_38da41fe4737.slice/crio-6f01fd326a1882e2f1e53b695e6c1748ca352f9a3ecee28bcf9121839ffa787d WatchSource:0}: Error finding container 6f01fd326a1882e2f1e53b695e6c1748ca352f9a3ecee28bcf9121839ffa787d: Status 404 returned error can't find the container with id 6f01fd326a1882e2f1e53b695e6c1748ca352f9a3ecee28bcf9121839ffa787d Mar 17 00:40:54 crc kubenswrapper[4755]: I0317 00:40:54.479583 4755 generic.go:334] "Generic 
(PLEG): container finished" podID="27588406-a74c-454c-84be-38da41fe4737" containerID="e6ee65f3e6aeae3e6c9ba0ef7823a338b76759541ae14e1b9483c7dd99726299" exitCode=0 Mar 17 00:40:54 crc kubenswrapper[4755]: I0317 00:40:54.479627 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" event={"ID":"27588406-a74c-454c-84be-38da41fe4737","Type":"ContainerDied","Data":"e6ee65f3e6aeae3e6c9ba0ef7823a338b76759541ae14e1b9483c7dd99726299"} Mar 17 00:40:54 crc kubenswrapper[4755]: I0317 00:40:54.479655 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" event={"ID":"27588406-a74c-454c-84be-38da41fe4737","Type":"ContainerStarted","Data":"6f01fd326a1882e2f1e53b695e6c1748ca352f9a3ecee28bcf9121839ffa787d"} Mar 17 00:40:55 crc kubenswrapper[4755]: I0317 00:40:55.489858 4755 generic.go:334] "Generic (PLEG): container finished" podID="27588406-a74c-454c-84be-38da41fe4737" containerID="287aa946103c9604cf068683bc244dcb6a0df24e61c7271c1401591bc1ecec27" exitCode=0 Mar 17 00:40:55 crc kubenswrapper[4755]: I0317 00:40:55.489916 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" event={"ID":"27588406-a74c-454c-84be-38da41fe4737","Type":"ContainerDied","Data":"287aa946103c9604cf068683bc244dcb6a0df24e61c7271c1401591bc1ecec27"} Mar 17 00:40:56 crc kubenswrapper[4755]: I0317 00:40:56.499295 4755 generic.go:334] "Generic (PLEG): container finished" podID="27588406-a74c-454c-84be-38da41fe4737" containerID="6c1aa42a9aa0933376ccce3dcd86780f3d9ced43ddc2c18c4017bd4f3792add2" exitCode=0 Mar 17 00:40:56 crc kubenswrapper[4755]: I0317 00:40:56.499595 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" 
event={"ID":"27588406-a74c-454c-84be-38da41fe4737","Type":"ContainerDied","Data":"6c1aa42a9aa0933376ccce3dcd86780f3d9ced43ddc2c18c4017bd4f3792add2"} Mar 17 00:40:57 crc kubenswrapper[4755]: I0317 00:40:57.776240 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" Mar 17 00:40:57 crc kubenswrapper[4755]: I0317 00:40:57.844382 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27588406-a74c-454c-84be-38da41fe4737-bundle\") pod \"27588406-a74c-454c-84be-38da41fe4737\" (UID: \"27588406-a74c-454c-84be-38da41fe4737\") " Mar 17 00:40:57 crc kubenswrapper[4755]: I0317 00:40:57.844568 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwmtr\" (UniqueName: \"kubernetes.io/projected/27588406-a74c-454c-84be-38da41fe4737-kube-api-access-qwmtr\") pod \"27588406-a74c-454c-84be-38da41fe4737\" (UID: \"27588406-a74c-454c-84be-38da41fe4737\") " Mar 17 00:40:57 crc kubenswrapper[4755]: I0317 00:40:57.844946 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27588406-a74c-454c-84be-38da41fe4737-util\") pod \"27588406-a74c-454c-84be-38da41fe4737\" (UID: \"27588406-a74c-454c-84be-38da41fe4737\") " Mar 17 00:40:57 crc kubenswrapper[4755]: I0317 00:40:57.846042 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27588406-a74c-454c-84be-38da41fe4737-bundle" (OuterVolumeSpecName: "bundle") pod "27588406-a74c-454c-84be-38da41fe4737" (UID: "27588406-a74c-454c-84be-38da41fe4737"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:40:57 crc kubenswrapper[4755]: I0317 00:40:57.849752 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27588406-a74c-454c-84be-38da41fe4737-kube-api-access-qwmtr" (OuterVolumeSpecName: "kube-api-access-qwmtr") pod "27588406-a74c-454c-84be-38da41fe4737" (UID: "27588406-a74c-454c-84be-38da41fe4737"). InnerVolumeSpecName "kube-api-access-qwmtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:40:57 crc kubenswrapper[4755]: I0317 00:40:57.857971 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27588406-a74c-454c-84be-38da41fe4737-util" (OuterVolumeSpecName: "util") pod "27588406-a74c-454c-84be-38da41fe4737" (UID: "27588406-a74c-454c-84be-38da41fe4737"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:40:57 crc kubenswrapper[4755]: I0317 00:40:57.947593 4755 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27588406-a74c-454c-84be-38da41fe4737-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:40:57 crc kubenswrapper[4755]: I0317 00:40:57.947640 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwmtr\" (UniqueName: \"kubernetes.io/projected/27588406-a74c-454c-84be-38da41fe4737-kube-api-access-qwmtr\") on node \"crc\" DevicePath \"\"" Mar 17 00:40:57 crc kubenswrapper[4755]: I0317 00:40:57.947662 4755 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27588406-a74c-454c-84be-38da41fe4737-util\") on node \"crc\" DevicePath \"\"" Mar 17 00:40:58 crc kubenswrapper[4755]: I0317 00:40:58.516876 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" 
event={"ID":"27588406-a74c-454c-84be-38da41fe4737","Type":"ContainerDied","Data":"6f01fd326a1882e2f1e53b695e6c1748ca352f9a3ecee28bcf9121839ffa787d"} Mar 17 00:40:58 crc kubenswrapper[4755]: I0317 00:40:58.516918 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f01fd326a1882e2f1e53b695e6c1748ca352f9a3ecee28bcf9121839ffa787d" Mar 17 00:40:58 crc kubenswrapper[4755]: I0317 00:40:58.516932 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q" Mar 17 00:41:01 crc kubenswrapper[4755]: I0317 00:41:01.578586 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-66cdd7cf4d-snfvp"] Mar 17 00:41:01 crc kubenswrapper[4755]: E0317 00:41:01.579003 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27588406-a74c-454c-84be-38da41fe4737" containerName="extract" Mar 17 00:41:01 crc kubenswrapper[4755]: I0317 00:41:01.579027 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="27588406-a74c-454c-84be-38da41fe4737" containerName="extract" Mar 17 00:41:01 crc kubenswrapper[4755]: E0317 00:41:01.579058 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27588406-a74c-454c-84be-38da41fe4737" containerName="util" Mar 17 00:41:01 crc kubenswrapper[4755]: I0317 00:41:01.579069 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="27588406-a74c-454c-84be-38da41fe4737" containerName="util" Mar 17 00:41:01 crc kubenswrapper[4755]: E0317 00:41:01.579089 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27588406-a74c-454c-84be-38da41fe4737" containerName="pull" Mar 17 00:41:01 crc kubenswrapper[4755]: I0317 00:41:01.579099 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="27588406-a74c-454c-84be-38da41fe4737" containerName="pull" Mar 17 00:41:01 crc kubenswrapper[4755]: I0317 00:41:01.579316 4755 
memory_manager.go:354] "RemoveStaleState removing state" podUID="27588406-a74c-454c-84be-38da41fe4737" containerName="extract" Mar 17 00:41:01 crc kubenswrapper[4755]: I0317 00:41:01.580140 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-66cdd7cf4d-snfvp" Mar 17 00:41:01 crc kubenswrapper[4755]: I0317 00:41:01.582152 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-h58lf" Mar 17 00:41:01 crc kubenswrapper[4755]: I0317 00:41:01.596259 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-66cdd7cf4d-snfvp"] Mar 17 00:41:01 crc kubenswrapper[4755]: I0317 00:41:01.704005 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cvbr\" (UniqueName: \"kubernetes.io/projected/ca5fc922-63bc-4052-844e-96e4a60e7ed4-kube-api-access-4cvbr\") pod \"openstack-operator-controller-init-66cdd7cf4d-snfvp\" (UID: \"ca5fc922-63bc-4052-844e-96e4a60e7ed4\") " pod="openstack-operators/openstack-operator-controller-init-66cdd7cf4d-snfvp" Mar 17 00:41:01 crc kubenswrapper[4755]: I0317 00:41:01.805185 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cvbr\" (UniqueName: \"kubernetes.io/projected/ca5fc922-63bc-4052-844e-96e4a60e7ed4-kube-api-access-4cvbr\") pod \"openstack-operator-controller-init-66cdd7cf4d-snfvp\" (UID: \"ca5fc922-63bc-4052-844e-96e4a60e7ed4\") " pod="openstack-operators/openstack-operator-controller-init-66cdd7cf4d-snfvp" Mar 17 00:41:01 crc kubenswrapper[4755]: I0317 00:41:01.820956 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cvbr\" (UniqueName: \"kubernetes.io/projected/ca5fc922-63bc-4052-844e-96e4a60e7ed4-kube-api-access-4cvbr\") pod 
\"openstack-operator-controller-init-66cdd7cf4d-snfvp\" (UID: \"ca5fc922-63bc-4052-844e-96e4a60e7ed4\") " pod="openstack-operators/openstack-operator-controller-init-66cdd7cf4d-snfvp" Mar 17 00:41:01 crc kubenswrapper[4755]: I0317 00:41:01.900760 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-66cdd7cf4d-snfvp" Mar 17 00:41:02 crc kubenswrapper[4755]: I0317 00:41:02.399795 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-66cdd7cf4d-snfvp"] Mar 17 00:41:02 crc kubenswrapper[4755]: I0317 00:41:02.550235 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-66cdd7cf4d-snfvp" event={"ID":"ca5fc922-63bc-4052-844e-96e4a60e7ed4","Type":"ContainerStarted","Data":"a833444a4fb7e2554c0282089627e3026f326f022921e7bb12e14c1132b1c14b"} Mar 17 00:41:06 crc kubenswrapper[4755]: I0317 00:41:06.589617 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-66cdd7cf4d-snfvp" event={"ID":"ca5fc922-63bc-4052-844e-96e4a60e7ed4","Type":"ContainerStarted","Data":"343707e7c7ca92e1009e24f12a56d3988215fa9916bd7f5bbcb2fc30480244cb"} Mar 17 00:41:06 crc kubenswrapper[4755]: I0317 00:41:06.590149 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-66cdd7cf4d-snfvp" Mar 17 00:41:06 crc kubenswrapper[4755]: I0317 00:41:06.629369 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-66cdd7cf4d-snfvp" podStartSLOduration=2.019921009 podStartE2EDuration="5.629351101s" podCreationTimestamp="2026-03-17 00:41:01 +0000 UTC" firstStartedPulling="2026-03-17 00:41:02.410955844 +0000 UTC m=+1137.170408127" lastFinishedPulling="2026-03-17 00:41:06.020385926 +0000 UTC m=+1140.779838219" 
observedRunningTime="2026-03-17 00:41:06.620681168 +0000 UTC m=+1141.380133461" watchObservedRunningTime="2026-03-17 00:41:06.629351101 +0000 UTC m=+1141.388803384" Mar 17 00:41:11 crc kubenswrapper[4755]: I0317 00:41:11.904593 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-66cdd7cf4d-snfvp" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.760986 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-r84f9"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.762639 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-r84f9" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.765735 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-lmwvd" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.766426 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-c9nzt"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.767364 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c9nzt" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.769077 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-wkdtp" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.781544 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-r84f9"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.787051 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-c9nzt"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.793292 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-fmtx9"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.794227 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fmtx9" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.798345 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-k2rn5" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.806603 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-fmtx9"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.817985 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-x7lgv"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.818911 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-x7lgv" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.822554 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-gvgl4" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.828003 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-ztp6j"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.828957 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ztp6j" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.830961 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-ztbtf" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.856054 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-ztp6j"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.864721 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-8sd9w"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.865872 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-8sd9w" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.872969 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-qx6nf" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.877727 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6rnn\" (UniqueName: \"kubernetes.io/projected/b5e695a0-9a52-46f0-8aae-3a4353bb3345-kube-api-access-f6rnn\") pod \"cinder-operator-controller-manager-8d58dc466-c9nzt\" (UID: \"b5e695a0-9a52-46f0-8aae-3a4353bb3345\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c9nzt" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.877791 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvktk\" (UniqueName: \"kubernetes.io/projected/3adfd998-aade-4343-8952-50b0eba8b510-kube-api-access-cvktk\") pod \"barbican-operator-controller-manager-59bc569d95-r84f9\" (UID: \"3adfd998-aade-4343-8952-50b0eba8b510\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-r84f9" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.880276 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-x7lgv"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.907819 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-8sd9w"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.912303 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.913338 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.915814 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.918360 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-xs8pf" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.920672 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-zg9n4"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.921700 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-zg9n4" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.923362 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-nllvp" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.947777 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.955539 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-zg9n4"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.973454 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-wvrth"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.974527 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wvrth" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.978796 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6rnn\" (UniqueName: \"kubernetes.io/projected/b5e695a0-9a52-46f0-8aae-3a4353bb3345-kube-api-access-f6rnn\") pod \"cinder-operator-controller-manager-8d58dc466-c9nzt\" (UID: \"b5e695a0-9a52-46f0-8aae-3a4353bb3345\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c9nzt" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.978982 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t7vg\" (UniqueName: \"kubernetes.io/projected/00c5d701-8e74-44a0-9880-257001cb0062-kube-api-access-6t7vg\") pod \"designate-operator-controller-manager-588d4d986b-fmtx9\" (UID: \"00c5d701-8e74-44a0-9880-257001cb0062\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fmtx9" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.979069 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6tx6\" (UniqueName: \"kubernetes.io/projected/a67ee100-af6d-492d-9a50-40fa8c59256b-kube-api-access-z6tx6\") pod \"horizon-operator-controller-manager-8464cc45fb-8sd9w\" (UID: \"a67ee100-af6d-492d-9a50-40fa8c59256b\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-8sd9w" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.979143 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgn8j\" (UniqueName: \"kubernetes.io/projected/13a9a76d-7b33-40eb-a7ec-5e5ff3c27705-kube-api-access-zgn8j\") pod \"heat-operator-controller-manager-67dd5f86f5-ztp6j\" (UID: \"13a9a76d-7b33-40eb-a7ec-5e5ff3c27705\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ztp6j" 
Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.979225 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvktk\" (UniqueName: \"kubernetes.io/projected/3adfd998-aade-4343-8952-50b0eba8b510-kube-api-access-cvktk\") pod \"barbican-operator-controller-manager-59bc569d95-r84f9\" (UID: \"3adfd998-aade-4343-8952-50b0eba8b510\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-r84f9" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.979298 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97c4j\" (UniqueName: \"kubernetes.io/projected/e0809421-a91c-42c6-af2f-c8dc2ae7e856-kube-api-access-97c4j\") pod \"glance-operator-controller-manager-79df6bcc97-x7lgv\" (UID: \"e0809421-a91c-42c6-af2f-c8dc2ae7e856\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-x7lgv" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.982674 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-kkr5x"] Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.983744 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-kkr5x" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.985645 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-s67wx" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.988683 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-gc5jj" Mar 17 00:41:34 crc kubenswrapper[4755]: I0317 00:41:34.989040 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-wvrth"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.000502 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-kkr5x"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.013860 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bcm7p"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.015071 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bcm7p" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.017295 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bcm7p"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.021283 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-v6778" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.024765 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvktk\" (UniqueName: \"kubernetes.io/projected/3adfd998-aade-4343-8952-50b0eba8b510-kube-api-access-cvktk\") pod \"barbican-operator-controller-manager-59bc569d95-r84f9\" (UID: \"3adfd998-aade-4343-8952-50b0eba8b510\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-r84f9" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.024828 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-csd45"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.051183 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6rnn\" (UniqueName: \"kubernetes.io/projected/b5e695a0-9a52-46f0-8aae-3a4353bb3345-kube-api-access-f6rnn\") pod \"cinder-operator-controller-manager-8d58dc466-c9nzt\" (UID: \"b5e695a0-9a52-46f0-8aae-3a4353bb3345\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c9nzt" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.052627 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-csd45" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.055861 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-5xm4p" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.083701 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t7vg\" (UniqueName: \"kubernetes.io/projected/00c5d701-8e74-44a0-9880-257001cb0062-kube-api-access-6t7vg\") pod \"designate-operator-controller-manager-588d4d986b-fmtx9\" (UID: \"00c5d701-8e74-44a0-9880-257001cb0062\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fmtx9" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.083759 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6tx6\" (UniqueName: \"kubernetes.io/projected/a67ee100-af6d-492d-9a50-40fa8c59256b-kube-api-access-z6tx6\") pod \"horizon-operator-controller-manager-8464cc45fb-8sd9w\" (UID: \"a67ee100-af6d-492d-9a50-40fa8c59256b\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-8sd9w" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.083787 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgn8j\" (UniqueName: \"kubernetes.io/projected/13a9a76d-7b33-40eb-a7ec-5e5ff3c27705-kube-api-access-zgn8j\") pod \"heat-operator-controller-manager-67dd5f86f5-ztp6j\" (UID: \"13a9a76d-7b33-40eb-a7ec-5e5ff3c27705\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ztp6j" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.083827 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97c4j\" (UniqueName: \"kubernetes.io/projected/e0809421-a91c-42c6-af2f-c8dc2ae7e856-kube-api-access-97c4j\") pod 
\"glance-operator-controller-manager-79df6bcc97-x7lgv\" (UID: \"e0809421-a91c-42c6-af2f-c8dc2ae7e856\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-x7lgv" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.083870 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert\") pod \"infra-operator-controller-manager-7b9c774f96-hx685\" (UID: \"fea3f6b7-c840-4795-8ca2-9dba15a49df1\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.083914 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krlwh\" (UniqueName: \"kubernetes.io/projected/ec169260-a79f-4a21-b78f-41fba2f8956e-kube-api-access-krlwh\") pod \"manila-operator-controller-manager-55f864c847-kkr5x\" (UID: \"ec169260-a79f-4a21-b78f-41fba2f8956e\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-kkr5x" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.083962 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbcb4\" (UniqueName: \"kubernetes.io/projected/9984519e-49f3-4af4-9c3b-d11af473a940-kube-api-access-xbcb4\") pod \"ironic-operator-controller-manager-6f787dddc9-zg9n4\" (UID: \"9984519e-49f3-4af4-9c3b-d11af473a940\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-zg9n4" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.084003 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb8h9\" (UniqueName: \"kubernetes.io/projected/fea3f6b7-c840-4795-8ca2-9dba15a49df1-kube-api-access-rb8h9\") pod \"infra-operator-controller-manager-7b9c774f96-hx685\" (UID: \"fea3f6b7-c840-4795-8ca2-9dba15a49df1\") " 
pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.084074 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bzjl\" (UniqueName: \"kubernetes.io/projected/4fb668bf-a188-428e-b9cc-0f3ff55070fd-kube-api-access-9bzjl\") pod \"keystone-operator-controller-manager-768b96df4c-wvrth\" (UID: \"4fb668bf-a188-428e-b9cc-0f3ff55070fd\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wvrth" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.092415 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-r84f9" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.119577 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-csd45"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.142367 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgn8j\" (UniqueName: \"kubernetes.io/projected/13a9a76d-7b33-40eb-a7ec-5e5ff3c27705-kube-api-access-zgn8j\") pod \"heat-operator-controller-manager-67dd5f86f5-ztp6j\" (UID: \"13a9a76d-7b33-40eb-a7ec-5e5ff3c27705\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ztp6j" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.142668 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6tx6\" (UniqueName: \"kubernetes.io/projected/a67ee100-af6d-492d-9a50-40fa8c59256b-kube-api-access-z6tx6\") pod \"horizon-operator-controller-manager-8464cc45fb-8sd9w\" (UID: \"a67ee100-af6d-492d-9a50-40fa8c59256b\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-8sd9w" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.143163 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c9nzt" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.146477 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-8s5l4"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.147578 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-8s5l4" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.151558 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-f5xtd" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.166487 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-8s5l4"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.167804 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97c4j\" (UniqueName: \"kubernetes.io/projected/e0809421-a91c-42c6-af2f-c8dc2ae7e856-kube-api-access-97c4j\") pod \"glance-operator-controller-manager-79df6bcc97-x7lgv\" (UID: \"e0809421-a91c-42c6-af2f-c8dc2ae7e856\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-x7lgv" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.171014 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t7vg\" (UniqueName: \"kubernetes.io/projected/00c5d701-8e74-44a0-9880-257001cb0062-kube-api-access-6t7vg\") pod \"designate-operator-controller-manager-588d4d986b-fmtx9\" (UID: \"00c5d701-8e74-44a0-9880-257001cb0062\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fmtx9" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.180770 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-x7lgv" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.189935 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert\") pod \"infra-operator-controller-manager-7b9c774f96-hx685\" (UID: \"fea3f6b7-c840-4795-8ca2-9dba15a49df1\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.191670 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsf76\" (UniqueName: \"kubernetes.io/projected/a9352105-fdd9-4cf9-b073-89a6eda036ab-kube-api-access-nsf76\") pod \"mariadb-operator-controller-manager-67ccfc9778-bcm7p\" (UID: \"a9352105-fdd9-4cf9-b073-89a6eda036ab\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bcm7p" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.191774 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krlwh\" (UniqueName: \"kubernetes.io/projected/ec169260-a79f-4a21-b78f-41fba2f8956e-kube-api-access-krlwh\") pod \"manila-operator-controller-manager-55f864c847-kkr5x\" (UID: \"ec169260-a79f-4a21-b78f-41fba2f8956e\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-kkr5x" Mar 17 00:41:35 crc kubenswrapper[4755]: E0317 00:41:35.190635 4755 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.191922 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8qs2\" (UniqueName: \"kubernetes.io/projected/a95d67b8-819c-481e-9e68-87276454b88a-kube-api-access-v8qs2\") pod 
\"neutron-operator-controller-manager-767865f676-csd45\" (UID: \"a95d67b8-819c-481e-9e68-87276454b88a\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-csd45" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.192007 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbcb4\" (UniqueName: \"kubernetes.io/projected/9984519e-49f3-4af4-9c3b-d11af473a940-kube-api-access-xbcb4\") pod \"ironic-operator-controller-manager-6f787dddc9-zg9n4\" (UID: \"9984519e-49f3-4af4-9c3b-d11af473a940\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-zg9n4" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.192095 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb8h9\" (UniqueName: \"kubernetes.io/projected/fea3f6b7-c840-4795-8ca2-9dba15a49df1-kube-api-access-rb8h9\") pod \"infra-operator-controller-manager-7b9c774f96-hx685\" (UID: \"fea3f6b7-c840-4795-8ca2-9dba15a49df1\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" Mar 17 00:41:35 crc kubenswrapper[4755]: E0317 00:41:35.192159 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert podName:fea3f6b7-c840-4795-8ca2-9dba15a49df1 nodeName:}" failed. No retries permitted until 2026-03-17 00:41:35.692143176 +0000 UTC m=+1170.451595459 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert") pod "infra-operator-controller-manager-7b9c774f96-hx685" (UID: "fea3f6b7-c840-4795-8ca2-9dba15a49df1") : secret "infra-operator-webhook-server-cert" not found Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.191815 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ztp6j" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.192303 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bzjl\" (UniqueName: \"kubernetes.io/projected/4fb668bf-a188-428e-b9cc-0f3ff55070fd-kube-api-access-9bzjl\") pod \"keystone-operator-controller-manager-768b96df4c-wvrth\" (UID: \"4fb668bf-a188-428e-b9cc-0f3ff55070fd\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wvrth" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.203034 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-8gwqk"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.204033 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8gwqk" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.205379 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-8sd9w" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.206162 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-hr7z8" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.208364 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.212347 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.213638 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krlwh\" (UniqueName: \"kubernetes.io/projected/ec169260-a79f-4a21-b78f-41fba2f8956e-kube-api-access-krlwh\") pod \"manila-operator-controller-manager-55f864c847-kkr5x\" (UID: \"ec169260-a79f-4a21-b78f-41fba2f8956e\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-kkr5x" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.214527 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-kkr5x" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.214599 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bzjl\" (UniqueName: \"kubernetes.io/projected/4fb668bf-a188-428e-b9cc-0f3ff55070fd-kube-api-access-9bzjl\") pod \"keystone-operator-controller-manager-768b96df4c-wvrth\" (UID: \"4fb668bf-a188-428e-b9cc-0f3ff55070fd\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wvrth" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.214723 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.215375 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-nng82" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.217723 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-bttcs"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.218962 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-bttcs" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.219689 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbcb4\" (UniqueName: \"kubernetes.io/projected/9984519e-49f3-4af4-9c3b-d11af473a940-kube-api-access-xbcb4\") pod \"ironic-operator-controller-manager-6f787dddc9-zg9n4\" (UID: \"9984519e-49f3-4af4-9c3b-d11af473a940\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-zg9n4" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.224544 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-q26kc" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.233489 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.236682 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-8gwqk"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.238907 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb8h9\" (UniqueName: \"kubernetes.io/projected/fea3f6b7-c840-4795-8ca2-9dba15a49df1-kube-api-access-rb8h9\") pod \"infra-operator-controller-manager-7b9c774f96-hx685\" (UID: \"fea3f6b7-c840-4795-8ca2-9dba15a49df1\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.256892 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-zg9n4" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.268301 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-bttcs"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.288145 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-6pqsg"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.292351 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6pqsg" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.302871 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-4jtxc"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.303458 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6ttt\" (UniqueName: \"kubernetes.io/projected/0f81862c-c403-445b-8030-083e914d31a7-kube-api-access-h6ttt\") pod \"octavia-operator-controller-manager-5b9f45d989-8gwqk\" (UID: \"0f81862c-c403-445b-8030-083e914d31a7\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8gwqk" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.303494 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2bts\" (UniqueName: \"kubernetes.io/projected/3e3f09b9-2108-4341-9a51-6efee784ca0e-kube-api-access-j2bts\") pod \"nova-operator-controller-manager-5d488d59fb-8s5l4\" (UID: \"3e3f09b9-2108-4341-9a51-6efee784ca0e\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-8s5l4" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.303557 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-nsf76\" (UniqueName: \"kubernetes.io/projected/a9352105-fdd9-4cf9-b073-89a6eda036ab-kube-api-access-nsf76\") pod \"mariadb-operator-controller-manager-67ccfc9778-bcm7p\" (UID: \"a9352105-fdd9-4cf9-b073-89a6eda036ab\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bcm7p" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.303620 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8qs2\" (UniqueName: \"kubernetes.io/projected/a95d67b8-819c-481e-9e68-87276454b88a-kube-api-access-v8qs2\") pod \"neutron-operator-controller-manager-767865f676-csd45\" (UID: \"a95d67b8-819c-481e-9e68-87276454b88a\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-csd45" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.305089 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4jtxc" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.307206 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-4mnrf" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.307377 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-sscmx" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.309908 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-6pqsg"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.321847 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-4jtxc"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.323588 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wvrth" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.328526 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-549b96fcbd-bklr6"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.330177 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8qs2\" (UniqueName: \"kubernetes.io/projected/a95d67b8-819c-481e-9e68-87276454b88a-kube-api-access-v8qs2\") pod \"neutron-operator-controller-manager-767865f676-csd45\" (UID: \"a95d67b8-819c-481e-9e68-87276454b88a\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-csd45" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.330357 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-549b96fcbd-bklr6" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.332064 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsf76\" (UniqueName: \"kubernetes.io/projected/a9352105-fdd9-4cf9-b073-89a6eda036ab-kube-api-access-nsf76\") pod \"mariadb-operator-controller-manager-67ccfc9778-bcm7p\" (UID: \"a9352105-fdd9-4cf9-b073-89a6eda036ab\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bcm7p" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.332939 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-5t9cl" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.338099 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-549b96fcbd-bklr6"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.380175 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zlwf9"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.381295 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zlwf9" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.382906 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-r6ctk" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.405860 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zlwf9"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.406557 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8m2t\" (UniqueName: \"kubernetes.io/projected/cea62bda-461f-4bb3-870b-51b767dd2585-kube-api-access-w8m2t\") pod \"ovn-operator-controller-manager-884679f54-bttcs\" (UID: \"cea62bda-461f-4bb3-870b-51b767dd2585\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-bttcs" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.406639 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z657f\" (UniqueName: \"kubernetes.io/projected/c2f21978-13ea-4441-ba13-2be2beec2f0a-kube-api-access-z657f\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-kvfkt\" (UID: \"c2f21978-13ea-4441-ba13-2be2beec2f0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.406678 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hwgl\" (UniqueName: \"kubernetes.io/projected/35b061e4-ec9c-46e3-828c-d787922370f9-kube-api-access-6hwgl\") pod 
\"placement-operator-controller-manager-5784578c99-6pqsg\" (UID: \"35b061e4-ec9c-46e3-828c-d787922370f9\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-6pqsg" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.406696 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-kvfkt\" (UID: \"c2f21978-13ea-4441-ba13-2be2beec2f0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.406732 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6ttt\" (UniqueName: \"kubernetes.io/projected/0f81862c-c403-445b-8030-083e914d31a7-kube-api-access-h6ttt\") pod \"octavia-operator-controller-manager-5b9f45d989-8gwqk\" (UID: \"0f81862c-c403-445b-8030-083e914d31a7\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8gwqk" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.406752 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2bts\" (UniqueName: \"kubernetes.io/projected/3e3f09b9-2108-4341-9a51-6efee784ca0e-kube-api-access-j2bts\") pod \"nova-operator-controller-manager-5d488d59fb-8s5l4\" (UID: \"3e3f09b9-2108-4341-9a51-6efee784ca0e\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-8s5l4" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.406804 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rknsw\" (UniqueName: \"kubernetes.io/projected/eb7e1883-c95a-4d25-894d-be127f5d4cf3-kube-api-access-rknsw\") pod \"swift-operator-controller-manager-c674c5965-4jtxc\" (UID: \"eb7e1883-c95a-4d25-894d-be127f5d4cf3\") " 
pod="openstack-operators/swift-operator-controller-manager-c674c5965-4jtxc" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.469509 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2bts\" (UniqueName: \"kubernetes.io/projected/3e3f09b9-2108-4341-9a51-6efee784ca0e-kube-api-access-j2bts\") pod \"nova-operator-controller-manager-5d488d59fb-8s5l4\" (UID: \"3e3f09b9-2108-4341-9a51-6efee784ca0e\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-8s5l4" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.471946 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fmtx9" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.472496 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-54tw8"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.473415 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-54tw8" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.473592 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6ttt\" (UniqueName: \"kubernetes.io/projected/0f81862c-c403-445b-8030-083e914d31a7-kube-api-access-h6ttt\") pod \"octavia-operator-controller-manager-5b9f45d989-8gwqk\" (UID: \"0f81862c-c403-445b-8030-083e914d31a7\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8gwqk" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.485020 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-fq9tl" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.521766 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-54tw8"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.527954 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8m2t\" (UniqueName: \"kubernetes.io/projected/cea62bda-461f-4bb3-870b-51b767dd2585-kube-api-access-w8m2t\") pod \"ovn-operator-controller-manager-884679f54-bttcs\" (UID: \"cea62bda-461f-4bb3-870b-51b767dd2585\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-bttcs" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.528662 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56pxh\" (UniqueName: \"kubernetes.io/projected/5d879770-dc2d-4c14-a5e1-c80879235d96-kube-api-access-56pxh\") pod \"test-operator-controller-manager-5c5cb9c4d7-zlwf9\" (UID: \"5d879770-dc2d-4c14-a5e1-c80879235d96\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zlwf9" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.528800 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-z657f\" (UniqueName: \"kubernetes.io/projected/c2f21978-13ea-4441-ba13-2be2beec2f0a-kube-api-access-z657f\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-kvfkt\" (UID: \"c2f21978-13ea-4441-ba13-2be2beec2f0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.528833 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dng5j\" (UniqueName: \"kubernetes.io/projected/f8d91ffa-2ac3-4935-95bf-45f6ac41e030-kube-api-access-dng5j\") pod \"telemetry-operator-controller-manager-549b96fcbd-bklr6\" (UID: \"f8d91ffa-2ac3-4935-95bf-45f6ac41e030\") " pod="openstack-operators/telemetry-operator-controller-manager-549b96fcbd-bklr6" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.528885 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hwgl\" (UniqueName: \"kubernetes.io/projected/35b061e4-ec9c-46e3-828c-d787922370f9-kube-api-access-6hwgl\") pod \"placement-operator-controller-manager-5784578c99-6pqsg\" (UID: \"35b061e4-ec9c-46e3-828c-d787922370f9\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-6pqsg" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.528903 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-kvfkt\" (UID: \"c2f21978-13ea-4441-ba13-2be2beec2f0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.529073 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rknsw\" (UniqueName: 
\"kubernetes.io/projected/eb7e1883-c95a-4d25-894d-be127f5d4cf3-kube-api-access-rknsw\") pod \"swift-operator-controller-manager-c674c5965-4jtxc\" (UID: \"eb7e1883-c95a-4d25-894d-be127f5d4cf3\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-4jtxc" Mar 17 00:41:35 crc kubenswrapper[4755]: E0317 00:41:35.529494 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 00:41:35 crc kubenswrapper[4755]: E0317 00:41:35.529643 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert podName:c2f21978-13ea-4441-ba13-2be2beec2f0a nodeName:}" failed. No retries permitted until 2026-03-17 00:41:36.029623248 +0000 UTC m=+1170.789075531 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" (UID: "c2f21978-13ea-4441-ba13-2be2beec2f0a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.545154 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8m2t\" (UniqueName: \"kubernetes.io/projected/cea62bda-461f-4bb3-870b-51b767dd2585-kube-api-access-w8m2t\") pod \"ovn-operator-controller-manager-884679f54-bttcs\" (UID: \"cea62bda-461f-4bb3-870b-51b767dd2585\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-bttcs" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.546265 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-csd45" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.547431 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bcm7p" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.566362 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.568487 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.581932 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hwgl\" (UniqueName: \"kubernetes.io/projected/35b061e4-ec9c-46e3-828c-d787922370f9-kube-api-access-6hwgl\") pod \"placement-operator-controller-manager-5784578c99-6pqsg\" (UID: \"35b061e4-ec9c-46e3-828c-d787922370f9\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-6pqsg" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.582918 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.583114 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-s5cz2" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.583213 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.591032 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-8s5l4" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.598192 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z657f\" (UniqueName: \"kubernetes.io/projected/c2f21978-13ea-4441-ba13-2be2beec2f0a-kube-api-access-z657f\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-kvfkt\" (UID: \"c2f21978-13ea-4441-ba13-2be2beec2f0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.598812 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rknsw\" (UniqueName: \"kubernetes.io/projected/eb7e1883-c95a-4d25-894d-be127f5d4cf3-kube-api-access-rknsw\") pod \"swift-operator-controller-manager-c674c5965-4jtxc\" (UID: \"eb7e1883-c95a-4d25-894d-be127f5d4cf3\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-4jtxc" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.600261 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.623546 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8gwqk" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.652921 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8mqj\" (UniqueName: \"kubernetes.io/projected/599a37c9-2c1a-4b46-8cfd-1e8c5ea709a4-kube-api-access-g8mqj\") pod \"watcher-operator-controller-manager-6c4d75f7f9-54tw8\" (UID: \"599a37c9-2c1a-4b46-8cfd-1e8c5ea709a4\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-54tw8" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.652988 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.653015 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.653169 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56pxh\" (UniqueName: \"kubernetes.io/projected/5d879770-dc2d-4c14-a5e1-c80879235d96-kube-api-access-56pxh\") pod \"test-operator-controller-manager-5c5cb9c4d7-zlwf9\" (UID: \"5d879770-dc2d-4c14-a5e1-c80879235d96\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zlwf9" Mar 17 00:41:35 crc 
kubenswrapper[4755]: I0317 00:41:35.653201 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skrhk\" (UniqueName: \"kubernetes.io/projected/eeea77af-84df-4778-8fe7-ddde0c1cda76-kube-api-access-skrhk\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.653233 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dng5j\" (UniqueName: \"kubernetes.io/projected/f8d91ffa-2ac3-4935-95bf-45f6ac41e030-kube-api-access-dng5j\") pod \"telemetry-operator-controller-manager-549b96fcbd-bklr6\" (UID: \"f8d91ffa-2ac3-4935-95bf-45f6ac41e030\") " pod="openstack-operators/telemetry-operator-controller-manager-549b96fcbd-bklr6" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.693128 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-bttcs" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.703547 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dng5j\" (UniqueName: \"kubernetes.io/projected/f8d91ffa-2ac3-4935-95bf-45f6ac41e030-kube-api-access-dng5j\") pod \"telemetry-operator-controller-manager-549b96fcbd-bklr6\" (UID: \"f8d91ffa-2ac3-4935-95bf-45f6ac41e030\") " pod="openstack-operators/telemetry-operator-controller-manager-549b96fcbd-bklr6" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.735082 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-blgfr"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.737449 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-blgfr" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.745460 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-vfs9z" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.751646 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56pxh\" (UniqueName: \"kubernetes.io/projected/5d879770-dc2d-4c14-a5e1-c80879235d96-kube-api-access-56pxh\") pod \"test-operator-controller-manager-5c5cb9c4d7-zlwf9\" (UID: \"5d879770-dc2d-4c14-a5e1-c80879235d96\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zlwf9" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.758547 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8mqj\" (UniqueName: \"kubernetes.io/projected/599a37c9-2c1a-4b46-8cfd-1e8c5ea709a4-kube-api-access-g8mqj\") pod \"watcher-operator-controller-manager-6c4d75f7f9-54tw8\" (UID: \"599a37c9-2c1a-4b46-8cfd-1e8c5ea709a4\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-54tw8" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.758588 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.758617 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: 
\"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.758673 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert\") pod \"infra-operator-controller-manager-7b9c774f96-hx685\" (UID: \"fea3f6b7-c840-4795-8ca2-9dba15a49df1\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.758732 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skrhk\" (UniqueName: \"kubernetes.io/projected/eeea77af-84df-4778-8fe7-ddde0c1cda76-kube-api-access-skrhk\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:35 crc kubenswrapper[4755]: E0317 00:41:35.759246 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 00:41:35 crc kubenswrapper[4755]: E0317 00:41:35.759306 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs podName:eeea77af-84df-4778-8fe7-ddde0c1cda76 nodeName:}" failed. No retries permitted until 2026-03-17 00:41:36.259286466 +0000 UTC m=+1171.018738749 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs") pod "openstack-operator-controller-manager-6d474745d9-7q6lg" (UID: "eeea77af-84df-4778-8fe7-ddde0c1cda76") : secret "webhook-server-cert" not found Mar 17 00:41:35 crc kubenswrapper[4755]: E0317 00:41:35.759528 4755 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 17 00:41:35 crc kubenswrapper[4755]: E0317 00:41:35.759571 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert podName:fea3f6b7-c840-4795-8ca2-9dba15a49df1 nodeName:}" failed. No retries permitted until 2026-03-17 00:41:36.759559073 +0000 UTC m=+1171.519011356 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert") pod "infra-operator-controller-manager-7b9c774f96-hx685" (UID: "fea3f6b7-c840-4795-8ca2-9dba15a49df1") : secret "infra-operator-webhook-server-cert" not found Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.785847 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6pqsg" Mar 17 00:41:35 crc kubenswrapper[4755]: E0317 00:41:35.789554 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 17 00:41:35 crc kubenswrapper[4755]: E0317 00:41:35.789649 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs podName:eeea77af-84df-4778-8fe7-ddde0c1cda76 nodeName:}" failed. No retries permitted until 2026-03-17 00:41:36.28963018 +0000 UTC m=+1171.049082453 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs") pod "openstack-operator-controller-manager-6d474745d9-7q6lg" (UID: "eeea77af-84df-4778-8fe7-ddde0c1cda76") : secret "metrics-server-cert" not found Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.795679 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-blgfr"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.797359 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skrhk\" (UniqueName: \"kubernetes.io/projected/eeea77af-84df-4778-8fe7-ddde0c1cda76-kube-api-access-skrhk\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.817642 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8mqj\" (UniqueName: \"kubernetes.io/projected/599a37c9-2c1a-4b46-8cfd-1e8c5ea709a4-kube-api-access-g8mqj\") pod \"watcher-operator-controller-manager-6c4d75f7f9-54tw8\" (UID: \"599a37c9-2c1a-4b46-8cfd-1e8c5ea709a4\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-54tw8" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.838825 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-r84f9"] Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.840596 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4jtxc" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.861559 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8qql\" (UniqueName: \"kubernetes.io/projected/2d644d3f-351b-49ae-b1d6-c5ee0482ca29-kube-api-access-m8qql\") pod \"rabbitmq-cluster-operator-manager-668c99d594-blgfr\" (UID: \"2d644d3f-351b-49ae-b1d6-c5ee0482ca29\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-blgfr" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.890071 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-549b96fcbd-bklr6" Mar 17 00:41:35 crc kubenswrapper[4755]: W0317 00:41:35.890517 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3adfd998_aade_4343_8952_50b0eba8b510.slice/crio-755ad3419b85f1032f42bc006f65ba7abcdee2bce1616915cb770d7e7e018538 WatchSource:0}: Error finding container 755ad3419b85f1032f42bc006f65ba7abcdee2bce1616915cb770d7e7e018538: Status 404 returned error can't find the container with id 755ad3419b85f1032f42bc006f65ba7abcdee2bce1616915cb770d7e7e018538 Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.938490 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zlwf9" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.964256 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8qql\" (UniqueName: \"kubernetes.io/projected/2d644d3f-351b-49ae-b1d6-c5ee0482ca29-kube-api-access-m8qql\") pod \"rabbitmq-cluster-operator-manager-668c99d594-blgfr\" (UID: \"2d644d3f-351b-49ae-b1d6-c5ee0482ca29\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-blgfr" Mar 17 00:41:35 crc kubenswrapper[4755]: I0317 00:41:35.991651 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8qql\" (UniqueName: \"kubernetes.io/projected/2d644d3f-351b-49ae-b1d6-c5ee0482ca29-kube-api-access-m8qql\") pod \"rabbitmq-cluster-operator-manager-668c99d594-blgfr\" (UID: \"2d644d3f-351b-49ae-b1d6-c5ee0482ca29\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-blgfr" Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.015852 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-c9nzt"] Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.022243 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-54tw8" Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.065624 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-kvfkt\" (UID: \"c2f21978-13ea-4441-ba13-2be2beec2f0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" Mar 17 00:41:36 crc kubenswrapper[4755]: E0317 00:41:36.065779 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 00:41:36 crc kubenswrapper[4755]: E0317 00:41:36.065823 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert podName:c2f21978-13ea-4441-ba13-2be2beec2f0a nodeName:}" failed. No retries permitted until 2026-03-17 00:41:37.065808695 +0000 UTC m=+1171.825260968 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" (UID: "c2f21978-13ea-4441-ba13-2be2beec2f0a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.103073 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-blgfr" Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.141549 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-ztp6j"] Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.153805 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-x7lgv"] Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.164423 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-8sd9w"] Mar 17 00:41:36 crc kubenswrapper[4755]: W0317 00:41:36.190263 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda67ee100_af6d_492d_9a50_40fa8c59256b.slice/crio-fe73bf0b84bc028adc8b91e08c13ca060dfe517578e0d648505eb21938300071 WatchSource:0}: Error finding container fe73bf0b84bc028adc8b91e08c13ca060dfe517578e0d648505eb21938300071: Status 404 returned error can't find the container with id fe73bf0b84bc028adc8b91e08c13ca060dfe517578e0d648505eb21938300071 Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.269014 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:36 crc kubenswrapper[4755]: E0317 00:41:36.269219 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 00:41:36 crc kubenswrapper[4755]: E0317 00:41:36.269292 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs podName:eeea77af-84df-4778-8fe7-ddde0c1cda76 nodeName:}" failed. No retries permitted until 2026-03-17 00:41:37.269274063 +0000 UTC m=+1172.028726346 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs") pod "openstack-operator-controller-manager-6d474745d9-7q6lg" (UID: "eeea77af-84df-4778-8fe7-ddde0c1cda76") : secret "webhook-server-cert" not found Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.298398 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-kkr5x"] Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.370373 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:36 crc kubenswrapper[4755]: E0317 00:41:36.372292 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 17 00:41:36 crc kubenswrapper[4755]: E0317 00:41:36.372342 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs podName:eeea77af-84df-4778-8fe7-ddde0c1cda76 nodeName:}" failed. No retries permitted until 2026-03-17 00:41:37.372327763 +0000 UTC m=+1172.131780036 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs") pod "openstack-operator-controller-manager-6d474745d9-7q6lg" (UID: "eeea77af-84df-4778-8fe7-ddde0c1cda76") : secret "metrics-server-cert" not found Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.589355 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-zg9n4"] Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.601155 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-fmtx9"] Mar 17 00:41:36 crc kubenswrapper[4755]: W0317 00:41:36.619802 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00c5d701_8e74_44a0_9880_257001cb0062.slice/crio-6d338ec3a27c71ca2742f7bbe9c78e68af84073ef2888dc892b1ff4cb60a3f27 WatchSource:0}: Error finding container 6d338ec3a27c71ca2742f7bbe9c78e68af84073ef2888dc892b1ff4cb60a3f27: Status 404 returned error can't find the container with id 6d338ec3a27c71ca2742f7bbe9c78e68af84073ef2888dc892b1ff4cb60a3f27 Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.626731 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-wvrth"] Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.708240 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-csd45"] Mar 17 00:41:36 crc kubenswrapper[4755]: W0317 00:41:36.714670 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda95d67b8_819c_481e_9e68_87276454b88a.slice/crio-2fd94f284a837b5e00c39417d77a946afb5ddb7ddc23dba8fa7303dc18b0c874 WatchSource:0}: Error finding container 
2fd94f284a837b5e00c39417d77a946afb5ddb7ddc23dba8fa7303dc18b0c874: Status 404 returned error can't find the container with id 2fd94f284a837b5e00c39417d77a946afb5ddb7ddc23dba8fa7303dc18b0c874 Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.782649 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert\") pod \"infra-operator-controller-manager-7b9c774f96-hx685\" (UID: \"fea3f6b7-c840-4795-8ca2-9dba15a49df1\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" Mar 17 00:41:36 crc kubenswrapper[4755]: E0317 00:41:36.782922 4755 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 17 00:41:36 crc kubenswrapper[4755]: E0317 00:41:36.782969 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert podName:fea3f6b7-c840-4795-8ca2-9dba15a49df1 nodeName:}" failed. No retries permitted until 2026-03-17 00:41:38.782953393 +0000 UTC m=+1173.542405676 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert") pod "infra-operator-controller-manager-7b9c774f96-hx685" (UID: "fea3f6b7-c840-4795-8ca2-9dba15a49df1") : secret "infra-operator-webhook-server-cert" not found Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.831146 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-8sd9w" event={"ID":"a67ee100-af6d-492d-9a50-40fa8c59256b","Type":"ContainerStarted","Data":"fe73bf0b84bc028adc8b91e08c13ca060dfe517578e0d648505eb21938300071"} Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.832783 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fmtx9" event={"ID":"00c5d701-8e74-44a0-9880-257001cb0062","Type":"ContainerStarted","Data":"6d338ec3a27c71ca2742f7bbe9c78e68af84073ef2888dc892b1ff4cb60a3f27"} Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.834054 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wvrth" event={"ID":"4fb668bf-a188-428e-b9cc-0f3ff55070fd","Type":"ContainerStarted","Data":"12c2a0c9b2cc4ab3118912fa455669b5ea55b999aa20860fa3c75d30c6a9ca5f"} Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.835073 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-r84f9" event={"ID":"3adfd998-aade-4343-8952-50b0eba8b510","Type":"ContainerStarted","Data":"755ad3419b85f1032f42bc006f65ba7abcdee2bce1616915cb770d7e7e018538"} Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.836360 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-zg9n4" 
event={"ID":"9984519e-49f3-4af4-9c3b-d11af473a940","Type":"ContainerStarted","Data":"e8b119d61bdd1d41f99a47291fc05f0aa0eb0fd53633128a73b15f288d13fa70"} Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.838141 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-csd45" event={"ID":"a95d67b8-819c-481e-9e68-87276454b88a","Type":"ContainerStarted","Data":"2fd94f284a837b5e00c39417d77a946afb5ddb7ddc23dba8fa7303dc18b0c874"} Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.839347 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c9nzt" event={"ID":"b5e695a0-9a52-46f0-8aae-3a4353bb3345","Type":"ContainerStarted","Data":"e812c2f4ef7bfbe82122e2e7c7bb5c47bbd714582bf2211a172077c9061c10db"} Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.840865 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-kkr5x" event={"ID":"ec169260-a79f-4a21-b78f-41fba2f8956e","Type":"ContainerStarted","Data":"cab8cef665d0a33533a0e8e2f7738efb666f177cc5a16fa61f31eeab41399302"} Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.842246 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-x7lgv" event={"ID":"e0809421-a91c-42c6-af2f-c8dc2ae7e856","Type":"ContainerStarted","Data":"f1bd0bfb79a305093b6950808a4da9816df2e3e67fe3bfe2e55077b6f0c23363"} Mar 17 00:41:36 crc kubenswrapper[4755]: I0317 00:41:36.844018 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ztp6j" event={"ID":"13a9a76d-7b33-40eb-a7ec-5e5ff3c27705","Type":"ContainerStarted","Data":"54c3e5b68ddbc87cc164a6ceb1a175591e73ed4b9ff326f82c214d583027a883"} Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.058961 4755 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-blgfr"] Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.074730 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-bttcs"] Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.084636 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-549b96fcbd-bklr6"] Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.088053 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-kvfkt\" (UID: \"c2f21978-13ea-4441-ba13-2be2beec2f0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" Mar 17 00:41:37 crc kubenswrapper[4755]: E0317 00:41:37.088223 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 00:41:37 crc kubenswrapper[4755]: E0317 00:41:37.088296 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert podName:c2f21978-13ea-4441-ba13-2be2beec2f0a nodeName:}" failed. No retries permitted until 2026-03-17 00:41:39.088275199 +0000 UTC m=+1173.847727542 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" (UID: "c2f21978-13ea-4441-ba13-2be2beec2f0a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 00:41:37 crc kubenswrapper[4755]: E0317 00:41:37.113577 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rknsw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-4jtxc_openstack-operators(eb7e1883-c95a-4d25-894d-be127f5d4cf3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.113928 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-8s5l4"] Mar 17 00:41:37 crc kubenswrapper[4755]: E0317 00:41:37.114772 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4jtxc" podUID="eb7e1883-c95a-4d25-894d-be127f5d4cf3" Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.129671 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bcm7p"] Mar 17 00:41:37 crc kubenswrapper[4755]: E0317 00:41:37.131244 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6hwgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-6pqsg_openstack-operators(35b061e4-ec9c-46e3-828c-d787922370f9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 17 00:41:37 crc kubenswrapper[4755]: E0317 00:41:37.131449 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j2bts,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-8s5l4_openstack-operators(3e3f09b9-2108-4341-9a51-6efee784ca0e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 17 00:41:37 crc kubenswrapper[4755]: E0317 00:41:37.132720 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6pqsg" podUID="35b061e4-ec9c-46e3-828c-d787922370f9" Mar 17 00:41:37 crc 
kubenswrapper[4755]: E0317 00:41:37.132779 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-8s5l4" podUID="3e3f09b9-2108-4341-9a51-6efee784ca0e" Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.158433 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-4jtxc"] Mar 17 00:41:37 crc kubenswrapper[4755]: W0317 00:41:37.160854 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d879770_dc2d_4c14_a5e1_c80879235d96.slice/crio-7db3930a0377ed63a28e04f7069c438e2af837e31b86c880bb9802ce97b013cf WatchSource:0}: Error finding container 7db3930a0377ed63a28e04f7069c438e2af837e31b86c880bb9802ce97b013cf: Status 404 returned error can't find the container with id 7db3930a0377ed63a28e04f7069c438e2af837e31b86c880bb9802ce97b013cf Mar 17 00:41:37 crc kubenswrapper[4755]: W0317 00:41:37.206043 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod599a37c9_2c1a_4b46_8cfd_1e8c5ea709a4.slice/crio-69a6a4fb3c7d5920b5c92483777f788d01ab838339f7da6b4bacb3cb16422f86 WatchSource:0}: Error finding container 69a6a4fb3c7d5920b5c92483777f788d01ab838339f7da6b4bacb3cb16422f86: Status 404 returned error can't find the container with id 69a6a4fb3c7d5920b5c92483777f788d01ab838339f7da6b4bacb3cb16422f86 Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.211081 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zlwf9"] Mar 17 00:41:37 crc kubenswrapper[4755]: E0317 00:41:37.213661 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g8mqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-54tw8_openstack-operators(599a37c9-2c1a-4b46-8cfd-1e8c5ea709a4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 17 00:41:37 crc kubenswrapper[4755]: E0317 00:41:37.215662 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-54tw8" podUID="599a37c9-2c1a-4b46-8cfd-1e8c5ea709a4" Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.215721 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-6pqsg"] Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.221853 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-8gwqk"] Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.227405 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-54tw8"] Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.293860 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:37 crc kubenswrapper[4755]: E0317 00:41:37.294010 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 00:41:37 crc kubenswrapper[4755]: E0317 00:41:37.294067 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs podName:eeea77af-84df-4778-8fe7-ddde0c1cda76 nodeName:}" failed. No retries permitted until 2026-03-17 00:41:39.294048521 +0000 UTC m=+1174.053500804 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs") pod "openstack-operator-controller-manager-6d474745d9-7q6lg" (UID: "eeea77af-84df-4778-8fe7-ddde0c1cda76") : secret "webhook-server-cert" not found Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.396172 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:37 crc kubenswrapper[4755]: E0317 00:41:37.396348 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 17 00:41:37 crc kubenswrapper[4755]: E0317 00:41:37.396628 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs podName:eeea77af-84df-4778-8fe7-ddde0c1cda76 nodeName:}" failed. No retries permitted until 2026-03-17 00:41:39.396610148 +0000 UTC m=+1174.156062431 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs") pod "openstack-operator-controller-manager-6d474745d9-7q6lg" (UID: "eeea77af-84df-4778-8fe7-ddde0c1cda76") : secret "metrics-server-cert" not found Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.855998 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-blgfr" event={"ID":"2d644d3f-351b-49ae-b1d6-c5ee0482ca29","Type":"ContainerStarted","Data":"f9824d77ddd56b4f0985b9ba1080a6cecda929418b44e41b95f4187d95747a4f"} Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.857606 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zlwf9" event={"ID":"5d879770-dc2d-4c14-a5e1-c80879235d96","Type":"ContainerStarted","Data":"7db3930a0377ed63a28e04f7069c438e2af837e31b86c880bb9802ce97b013cf"} Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.858595 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-54tw8" event={"ID":"599a37c9-2c1a-4b46-8cfd-1e8c5ea709a4","Type":"ContainerStarted","Data":"69a6a4fb3c7d5920b5c92483777f788d01ab838339f7da6b4bacb3cb16422f86"} Mar 17 00:41:37 crc kubenswrapper[4755]: E0317 00:41:37.860137 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-54tw8" podUID="599a37c9-2c1a-4b46-8cfd-1e8c5ea709a4" Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.861460 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-bttcs" event={"ID":"cea62bda-461f-4bb3-870b-51b767dd2585","Type":"ContainerStarted","Data":"40be7e26f4419c8403acffdb689a4beafb29875cc3bd6f214ac6afa53615e610"} Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.864057 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bcm7p" event={"ID":"a9352105-fdd9-4cf9-b073-89a6eda036ab","Type":"ContainerStarted","Data":"5f974e7dfd8322559baf320f6520e1b290ef643e8ea11d99a156823166f8af97"} Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.866280 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-549b96fcbd-bklr6" event={"ID":"f8d91ffa-2ac3-4935-95bf-45f6ac41e030","Type":"ContainerStarted","Data":"18b61afc6b56c84bd5db47c5166a6de5fba250150378fa3cbe556dd0ced969f6"} Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.867479 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4jtxc" event={"ID":"eb7e1883-c95a-4d25-894d-be127f5d4cf3","Type":"ContainerStarted","Data":"0554b9bc0d0e5923d97806f453f7a15eae8f9853f3505a676de9f2c828fd11f5"} Mar 17 00:41:37 crc kubenswrapper[4755]: E0317 00:41:37.869231 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4jtxc" podUID="eb7e1883-c95a-4d25-894d-be127f5d4cf3" Mar 17 00:41:37 
crc kubenswrapper[4755]: I0317 00:41:37.869574 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6pqsg" event={"ID":"35b061e4-ec9c-46e3-828c-d787922370f9","Type":"ContainerStarted","Data":"4e1aa25d6039c2162401dff98506203ebc073c1fa50ed92c5e1bebe1e40836c2"} Mar 17 00:41:37 crc kubenswrapper[4755]: E0317 00:41:37.870909 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6pqsg" podUID="35b061e4-ec9c-46e3-828c-d787922370f9" Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.871452 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8gwqk" event={"ID":"0f81862c-c403-445b-8030-083e914d31a7","Type":"ContainerStarted","Data":"f7ff3e9ae929f2a89a7278f452c2a81e36f51fe7aa0df90f0d6eaf1c7be312e8"} Mar 17 00:41:37 crc kubenswrapper[4755]: I0317 00:41:37.873145 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-8s5l4" event={"ID":"3e3f09b9-2108-4341-9a51-6efee784ca0e","Type":"ContainerStarted","Data":"263a82c83a377d34335576623b55a3e19f57938d8dc3352daab0565de29b7003"} Mar 17 00:41:37 crc kubenswrapper[4755]: E0317 00:41:37.874857 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-8s5l4" podUID="3e3f09b9-2108-4341-9a51-6efee784ca0e" Mar 17 00:41:38 crc 
kubenswrapper[4755]: I0317 00:41:38.818532 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert\") pod \"infra-operator-controller-manager-7b9c774f96-hx685\" (UID: \"fea3f6b7-c840-4795-8ca2-9dba15a49df1\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" Mar 17 00:41:38 crc kubenswrapper[4755]: E0317 00:41:38.818764 4755 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 17 00:41:38 crc kubenswrapper[4755]: E0317 00:41:38.818867 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert podName:fea3f6b7-c840-4795-8ca2-9dba15a49df1 nodeName:}" failed. No retries permitted until 2026-03-17 00:41:42.818844959 +0000 UTC m=+1177.578297242 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert") pod "infra-operator-controller-manager-7b9c774f96-hx685" (UID: "fea3f6b7-c840-4795-8ca2-9dba15a49df1") : secret "infra-operator-webhook-server-cert" not found Mar 17 00:41:38 crc kubenswrapper[4755]: E0317 00:41:38.882665 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4jtxc" podUID="eb7e1883-c95a-4d25-894d-be127f5d4cf3" Mar 17 00:41:38 crc kubenswrapper[4755]: E0317 00:41:38.882665 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6pqsg" podUID="35b061e4-ec9c-46e3-828c-d787922370f9" Mar 17 00:41:38 crc kubenswrapper[4755]: E0317 00:41:38.883636 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-8s5l4" podUID="3e3f09b9-2108-4341-9a51-6efee784ca0e" Mar 17 00:41:38 crc kubenswrapper[4755]: E0317 00:41:38.883873 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-54tw8" podUID="599a37c9-2c1a-4b46-8cfd-1e8c5ea709a4" Mar 17 00:41:39 crc kubenswrapper[4755]: I0317 00:41:39.123775 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-kvfkt\" (UID: \"c2f21978-13ea-4441-ba13-2be2beec2f0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" Mar 17 00:41:39 crc kubenswrapper[4755]: E0317 00:41:39.124220 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 00:41:39 crc kubenswrapper[4755]: E0317 00:41:39.124496 4755 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert podName:c2f21978-13ea-4441-ba13-2be2beec2f0a nodeName:}" failed. No retries permitted until 2026-03-17 00:41:43.124469863 +0000 UTC m=+1177.883922166 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" (UID: "c2f21978-13ea-4441-ba13-2be2beec2f0a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 00:41:39 crc kubenswrapper[4755]: I0317 00:41:39.329677 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:39 crc kubenswrapper[4755]: E0317 00:41:39.329941 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 00:41:39 crc kubenswrapper[4755]: E0317 00:41:39.330023 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs podName:eeea77af-84df-4778-8fe7-ddde0c1cda76 nodeName:}" failed. No retries permitted until 2026-03-17 00:41:43.329987339 +0000 UTC m=+1178.089439622 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs") pod "openstack-operator-controller-manager-6d474745d9-7q6lg" (UID: "eeea77af-84df-4778-8fe7-ddde0c1cda76") : secret "webhook-server-cert" not found Mar 17 00:41:39 crc kubenswrapper[4755]: I0317 00:41:39.431790 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:39 crc kubenswrapper[4755]: E0317 00:41:39.432053 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 17 00:41:39 crc kubenswrapper[4755]: E0317 00:41:39.432101 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs podName:eeea77af-84df-4778-8fe7-ddde0c1cda76 nodeName:}" failed. No retries permitted until 2026-03-17 00:41:43.432087633 +0000 UTC m=+1178.191539916 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs") pod "openstack-operator-controller-manager-6d474745d9-7q6lg" (UID: "eeea77af-84df-4778-8fe7-ddde0c1cda76") : secret "metrics-server-cert" not found Mar 17 00:41:42 crc kubenswrapper[4755]: I0317 00:41:42.903118 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert\") pod \"infra-operator-controller-manager-7b9c774f96-hx685\" (UID: \"fea3f6b7-c840-4795-8ca2-9dba15a49df1\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" Mar 17 00:41:42 crc kubenswrapper[4755]: E0317 00:41:42.903340 4755 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 17 00:41:42 crc kubenswrapper[4755]: E0317 00:41:42.903646 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert podName:fea3f6b7-c840-4795-8ca2-9dba15a49df1 nodeName:}" failed. No retries permitted until 2026-03-17 00:41:50.903627392 +0000 UTC m=+1185.663079675 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert") pod "infra-operator-controller-manager-7b9c774f96-hx685" (UID: "fea3f6b7-c840-4795-8ca2-9dba15a49df1") : secret "infra-operator-webhook-server-cert" not found Mar 17 00:41:43 crc kubenswrapper[4755]: I0317 00:41:43.208107 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-kvfkt\" (UID: \"c2f21978-13ea-4441-ba13-2be2beec2f0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" Mar 17 00:41:43 crc kubenswrapper[4755]: E0317 00:41:43.208366 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 00:41:43 crc kubenswrapper[4755]: E0317 00:41:43.208423 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert podName:c2f21978-13ea-4441-ba13-2be2beec2f0a nodeName:}" failed. No retries permitted until 2026-03-17 00:41:51.208406253 +0000 UTC m=+1185.967858536 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" (UID: "c2f21978-13ea-4441-ba13-2be2beec2f0a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 00:41:43 crc kubenswrapper[4755]: I0317 00:41:43.410494 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:43 crc kubenswrapper[4755]: E0317 00:41:43.410690 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 00:41:43 crc kubenswrapper[4755]: E0317 00:41:43.410789 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs podName:eeea77af-84df-4778-8fe7-ddde0c1cda76 nodeName:}" failed. No retries permitted until 2026-03-17 00:41:51.41076535 +0000 UTC m=+1186.170217623 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs") pod "openstack-operator-controller-manager-6d474745d9-7q6lg" (UID: "eeea77af-84df-4778-8fe7-ddde0c1cda76") : secret "webhook-server-cert" not found Mar 17 00:41:43 crc kubenswrapper[4755]: I0317 00:41:43.511984 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:43 crc kubenswrapper[4755]: E0317 00:41:43.512145 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 17 00:41:43 crc kubenswrapper[4755]: E0317 00:41:43.512238 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs podName:eeea77af-84df-4778-8fe7-ddde0c1cda76 nodeName:}" failed. No retries permitted until 2026-03-17 00:41:51.512221307 +0000 UTC m=+1186.271673590 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs") pod "openstack-operator-controller-manager-6d474745d9-7q6lg" (UID: "eeea77af-84df-4778-8fe7-ddde0c1cda76") : secret "metrics-server-cert" not found Mar 17 00:41:50 crc kubenswrapper[4755]: E0317 00:41:50.869178 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da" Mar 17 00:41:50 crc kubenswrapper[4755]: E0317 00:41:50.869817 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-krlwh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-55f864c847-kkr5x_openstack-operators(ec169260-a79f-4a21-b78f-41fba2f8956e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 00:41:50 crc kubenswrapper[4755]: E0317 00:41:50.871016 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-kkr5x" podUID="ec169260-a79f-4a21-b78f-41fba2f8956e" Mar 17 00:41:50 crc kubenswrapper[4755]: I0317 00:41:50.937953 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert\") pod 
\"infra-operator-controller-manager-7b9c774f96-hx685\" (UID: \"fea3f6b7-c840-4795-8ca2-9dba15a49df1\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" Mar 17 00:41:50 crc kubenswrapper[4755]: E0317 00:41:50.938083 4755 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 17 00:41:50 crc kubenswrapper[4755]: E0317 00:41:50.938144 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert podName:fea3f6b7-c840-4795-8ca2-9dba15a49df1 nodeName:}" failed. No retries permitted until 2026-03-17 00:42:06.938128019 +0000 UTC m=+1201.697580302 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert") pod "infra-operator-controller-manager-7b9c774f96-hx685" (UID: "fea3f6b7-c840-4795-8ca2-9dba15a49df1") : secret "infra-operator-webhook-server-cert" not found Mar 17 00:41:51 crc kubenswrapper[4755]: E0317 00:41:51.019356 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-kkr5x" podUID="ec169260-a79f-4a21-b78f-41fba2f8956e" Mar 17 00:41:51 crc kubenswrapper[4755]: I0317 00:41:51.251395 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-kvfkt\" (UID: \"c2f21978-13ea-4441-ba13-2be2beec2f0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" Mar 17 00:41:51 crc 
kubenswrapper[4755]: E0317 00:41:51.251614 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 00:41:51 crc kubenswrapper[4755]: E0317 00:41:51.252320 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert podName:c2f21978-13ea-4441-ba13-2be2beec2f0a nodeName:}" failed. No retries permitted until 2026-03-17 00:42:07.25229833 +0000 UTC m=+1202.011750613 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" (UID: "c2f21978-13ea-4441-ba13-2be2beec2f0a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 00:41:51 crc kubenswrapper[4755]: I0317 00:41:51.457091 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:51 crc kubenswrapper[4755]: E0317 00:41:51.457295 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 00:41:51 crc kubenswrapper[4755]: E0317 00:41:51.457379 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs podName:eeea77af-84df-4778-8fe7-ddde0c1cda76 nodeName:}" failed. No retries permitted until 2026-03-17 00:42:07.457355502 +0000 UTC m=+1202.216807785 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs") pod "openstack-operator-controller-manager-6d474745d9-7q6lg" (UID: "eeea77af-84df-4778-8fe7-ddde0c1cda76") : secret "webhook-server-cert" not found Mar 17 00:41:51 crc kubenswrapper[4755]: I0317 00:41:51.558726 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:41:51 crc kubenswrapper[4755]: E0317 00:41:51.558913 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 17 00:41:51 crc kubenswrapper[4755]: E0317 00:41:51.559006 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs podName:eeea77af-84df-4778-8fe7-ddde0c1cda76 nodeName:}" failed. No retries permitted until 2026-03-17 00:42:07.558981414 +0000 UTC m=+1202.318433697 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs") pod "openstack-operator-controller-manager-6d474745d9-7q6lg" (UID: "eeea77af-84df-4778-8fe7-ddde0c1cda76") : secret "metrics-server-cert" not found Mar 17 00:41:52 crc kubenswrapper[4755]: I0317 00:41:52.023448 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ztp6j" event={"ID":"13a9a76d-7b33-40eb-a7ec-5e5ff3c27705","Type":"ContainerStarted","Data":"9daa1d76f684b9016b3e079cc4436f2b0cb4382e745ebe225ab29aad76c3ea7e"} Mar 17 00:41:52 crc kubenswrapper[4755]: I0317 00:41:52.024380 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ztp6j" Mar 17 00:41:52 crc kubenswrapper[4755]: I0317 00:41:52.040656 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ztp6j" podStartSLOduration=8.017029965 podStartE2EDuration="18.040638402s" podCreationTimestamp="2026-03-17 00:41:34 +0000 UTC" firstStartedPulling="2026-03-17 00:41:36.317052044 +0000 UTC m=+1171.076504327" lastFinishedPulling="2026-03-17 00:41:46.340660481 +0000 UTC m=+1181.100112764" observedRunningTime="2026-03-17 00:41:52.039664945 +0000 UTC m=+1186.799117228" watchObservedRunningTime="2026-03-17 00:41:52.040638402 +0000 UTC m=+1186.800090685" Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.031633 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c9nzt" event={"ID":"b5e695a0-9a52-46f0-8aae-3a4353bb3345","Type":"ContainerStarted","Data":"5e8ffff11674531be0e8df3fd6e6282569dd62f6d63bd5f890f949da3e7ba33a"} Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.032134 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c9nzt" Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.033192 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bcm7p" event={"ID":"a9352105-fdd9-4cf9-b073-89a6eda036ab","Type":"ContainerStarted","Data":"0c9a0489a5b28cb70958445e343b99edfc6d3211a36b163a1f1e19f28c40fb4c"} Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.033344 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bcm7p" Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.035529 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-blgfr" event={"ID":"2d644d3f-351b-49ae-b1d6-c5ee0482ca29","Type":"ContainerStarted","Data":"cfdea5403ab85e66b614dbaeb921d269fc43ee30586edc0234b70afe22274c93"} Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.036956 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zlwf9" event={"ID":"5d879770-dc2d-4c14-a5e1-c80879235d96","Type":"ContainerStarted","Data":"097cfa8c24b7b47de5eecbfc3d0adfc96916842c63bc4c6efe6d4406fa92401a"} Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.037029 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zlwf9" Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.038344 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-r84f9" event={"ID":"3adfd998-aade-4343-8952-50b0eba8b510","Type":"ContainerStarted","Data":"c53ceec8db90fe9c8ab9676a172989c093184ac0be2bdef831e4cc5c4f561999"} Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.038387 4755 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-r84f9" Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.049244 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-csd45" event={"ID":"a95d67b8-819c-481e-9e68-87276454b88a","Type":"ContainerStarted","Data":"4b7d456f78a15e0b3758f20a422ad2ff36702860a378fceb7faf7a9048c7eb0a"} Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.050106 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-csd45" Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.051706 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-8sd9w" event={"ID":"a67ee100-af6d-492d-9a50-40fa8c59256b","Type":"ContainerStarted","Data":"a52064bb9100b2d45c24498366e3d966ee049ebb24a69af27ed04a877ccf6678"} Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.052223 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-8sd9w" Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.053396 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c9nzt" podStartSLOduration=3.506241262 podStartE2EDuration="19.053382645s" podCreationTimestamp="2026-03-17 00:41:34 +0000 UTC" firstStartedPulling="2026-03-17 00:41:36.071849883 +0000 UTC m=+1170.831302166" lastFinishedPulling="2026-03-17 00:41:51.618991266 +0000 UTC m=+1186.378443549" observedRunningTime="2026-03-17 00:41:53.044477587 +0000 UTC m=+1187.803929890" watchObservedRunningTime="2026-03-17 00:41:53.053382645 +0000 UTC m=+1187.812834928" Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.057927 4755 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fmtx9" event={"ID":"00c5d701-8e74-44a0-9880-257001cb0062","Type":"ContainerStarted","Data":"4e79233ffb8e831f2175ae4fec770b9ee71123d76bdd2aefd95ca35cd565a784"} Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.057964 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fmtx9" Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.069863 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-blgfr" podStartSLOduration=3.540751132 podStartE2EDuration="18.069843594s" podCreationTimestamp="2026-03-17 00:41:35 +0000 UTC" firstStartedPulling="2026-03-17 00:41:37.094498632 +0000 UTC m=+1171.853950915" lastFinishedPulling="2026-03-17 00:41:51.623591104 +0000 UTC m=+1186.383043377" observedRunningTime="2026-03-17 00:41:53.059699371 +0000 UTC m=+1187.819151654" watchObservedRunningTime="2026-03-17 00:41:53.069843594 +0000 UTC m=+1187.829295887" Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.085910 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zlwf9" podStartSLOduration=3.636677974 podStartE2EDuration="18.0858955s" podCreationTimestamp="2026-03-17 00:41:35 +0000 UTC" firstStartedPulling="2026-03-17 00:41:37.169244035 +0000 UTC m=+1171.928696318" lastFinishedPulling="2026-03-17 00:41:51.618461561 +0000 UTC m=+1186.377913844" observedRunningTime="2026-03-17 00:41:53.080937293 +0000 UTC m=+1187.840389576" watchObservedRunningTime="2026-03-17 00:41:53.0858955 +0000 UTC m=+1187.845347783" Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.104786 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bcm7p" 
podStartSLOduration=4.585976291 podStartE2EDuration="19.104767137s" podCreationTimestamp="2026-03-17 00:41:34 +0000 UTC" firstStartedPulling="2026-03-17 00:41:37.099855651 +0000 UTC m=+1171.859307934" lastFinishedPulling="2026-03-17 00:41:51.618646497 +0000 UTC m=+1186.378098780" observedRunningTime="2026-03-17 00:41:53.098795401 +0000 UTC m=+1187.858247684" watchObservedRunningTime="2026-03-17 00:41:53.104767137 +0000 UTC m=+1187.864219420" Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.119420 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-r84f9" podStartSLOduration=6.383877859 podStartE2EDuration="19.119402944s" podCreationTimestamp="2026-03-17 00:41:34 +0000 UTC" firstStartedPulling="2026-03-17 00:41:35.941154793 +0000 UTC m=+1170.700607076" lastFinishedPulling="2026-03-17 00:41:48.676679878 +0000 UTC m=+1183.436132161" observedRunningTime="2026-03-17 00:41:53.116522554 +0000 UTC m=+1187.875974837" watchObservedRunningTime="2026-03-17 00:41:53.119402944 +0000 UTC m=+1187.878855227" Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.138978 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-csd45" podStartSLOduration=4.246263438 podStartE2EDuration="19.138958819s" podCreationTimestamp="2026-03-17 00:41:34 +0000 UTC" firstStartedPulling="2026-03-17 00:41:36.716965445 +0000 UTC m=+1171.476417728" lastFinishedPulling="2026-03-17 00:41:51.609660826 +0000 UTC m=+1186.369113109" observedRunningTime="2026-03-17 00:41:53.132784587 +0000 UTC m=+1187.892236870" watchObservedRunningTime="2026-03-17 00:41:53.138958819 +0000 UTC m=+1187.898411102" Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.150697 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fmtx9" 
podStartSLOduration=4.244465998 podStartE2EDuration="19.150682956s" podCreationTimestamp="2026-03-17 00:41:34 +0000 UTC" firstStartedPulling="2026-03-17 00:41:36.633015266 +0000 UTC m=+1171.392467549" lastFinishedPulling="2026-03-17 00:41:51.539232214 +0000 UTC m=+1186.298684507" observedRunningTime="2026-03-17 00:41:53.146596412 +0000 UTC m=+1187.906048695" watchObservedRunningTime="2026-03-17 00:41:53.150682956 +0000 UTC m=+1187.910135229" Mar 17 00:41:53 crc kubenswrapper[4755]: I0317 00:41:53.166899 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-8sd9w" podStartSLOduration=9.020268012 podStartE2EDuration="19.166870346s" podCreationTimestamp="2026-03-17 00:41:34 +0000 UTC" firstStartedPulling="2026-03-17 00:41:36.193752279 +0000 UTC m=+1170.953204562" lastFinishedPulling="2026-03-17 00:41:46.340354613 +0000 UTC m=+1181.099806896" observedRunningTime="2026-03-17 00:41:53.163522964 +0000 UTC m=+1187.922975247" watchObservedRunningTime="2026-03-17 00:41:53.166870346 +0000 UTC m=+1187.926322629" Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.072358 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wvrth" event={"ID":"4fb668bf-a188-428e-b9cc-0f3ff55070fd","Type":"ContainerStarted","Data":"274be8e4ea6910a426442567c8c81f302abcd77def276d8b59cef4735c974316"} Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.072656 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wvrth" Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.074042 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-549b96fcbd-bklr6" 
event={"ID":"f8d91ffa-2ac3-4935-95bf-45f6ac41e030","Type":"ContainerStarted","Data":"7b37cb9a98d2bdb45911a38f589ae5bfefc2ef1622660e335a7c743baee3e4ba"} Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.074805 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-549b96fcbd-bklr6" Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.075831 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8gwqk" event={"ID":"0f81862c-c403-445b-8030-083e914d31a7","Type":"ContainerStarted","Data":"5baf649c12e3b746bc651033ba518b80a3bbb55ac15b34cd2115474525a8b364"} Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.076239 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8gwqk" Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.077680 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-8s5l4" event={"ID":"3e3f09b9-2108-4341-9a51-6efee784ca0e","Type":"ContainerStarted","Data":"634fc9dce865f92e8c20182ef76930e447c2bc2761f2d0935dea7d7097cd83b7"} Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.078058 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-8s5l4" Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.080568 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-x7lgv" event={"ID":"e0809421-a91c-42c6-af2f-c8dc2ae7e856","Type":"ContainerStarted","Data":"a7e7032d458d66b9338309b96ff97913b43a82b84caf960c5a6a636fb8350b9b"} Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.080966 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-x7lgv" Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.114697 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-zg9n4" event={"ID":"9984519e-49f3-4af4-9c3b-d11af473a940","Type":"ContainerStarted","Data":"110d2eae3ba3534f341a1d028c8db475386fd0112653f86df2d617c361a1d36f"} Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.115413 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-zg9n4" Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.130636 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-bttcs" event={"ID":"cea62bda-461f-4bb3-870b-51b767dd2585","Type":"ContainerStarted","Data":"36eb9e10516a094694fad320ea5bd793ded94e26be672f91a37ce8f907984bb6"} Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.130673 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-bttcs" Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.178877 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-x7lgv" podStartSLOduration=5.411643455 podStartE2EDuration="20.178862019s" podCreationTimestamp="2026-03-17 00:41:34 +0000 UTC" firstStartedPulling="2026-03-17 00:41:36.179230855 +0000 UTC m=+1170.938683138" lastFinishedPulling="2026-03-17 00:41:50.946449419 +0000 UTC m=+1185.705901702" observedRunningTime="2026-03-17 00:41:54.16777002 +0000 UTC m=+1188.927222303" watchObservedRunningTime="2026-03-17 00:41:54.178862019 +0000 UTC m=+1188.938314302" Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.181077 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wvrth" podStartSLOduration=5.04395754 podStartE2EDuration="20.181072391s" podCreationTimestamp="2026-03-17 00:41:34 +0000 UTC" firstStartedPulling="2026-03-17 00:41:36.64249123 +0000 UTC m=+1171.401943513" lastFinishedPulling="2026-03-17 00:41:51.779606081 +0000 UTC m=+1186.539058364" observedRunningTime="2026-03-17 00:41:54.143847143 +0000 UTC m=+1188.903299426" watchObservedRunningTime="2026-03-17 00:41:54.181072391 +0000 UTC m=+1188.940524674" Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.221523 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-bttcs" podStartSLOduration=4.721515206 podStartE2EDuration="19.221509067s" podCreationTimestamp="2026-03-17 00:41:35 +0000 UTC" firstStartedPulling="2026-03-17 00:41:37.108167213 +0000 UTC m=+1171.867619486" lastFinishedPulling="2026-03-17 00:41:51.608161064 +0000 UTC m=+1186.367613347" observedRunningTime="2026-03-17 00:41:54.214727588 +0000 UTC m=+1188.974179871" watchObservedRunningTime="2026-03-17 00:41:54.221509067 +0000 UTC m=+1188.980961350" Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.280914 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-8s5l4" podStartSLOduration=3.990944816 podStartE2EDuration="20.280896852s" podCreationTimestamp="2026-03-17 00:41:34 +0000 UTC" firstStartedPulling="2026-03-17 00:41:37.131276807 +0000 UTC m=+1171.890729090" lastFinishedPulling="2026-03-17 00:41:53.421228843 +0000 UTC m=+1188.180681126" observedRunningTime="2026-03-17 00:41:54.2758188 +0000 UTC m=+1189.035271083" watchObservedRunningTime="2026-03-17 00:41:54.280896852 +0000 UTC m=+1189.040349135" Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.353914 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8gwqk" podStartSLOduration=4.840282594 podStartE2EDuration="19.353883625s" podCreationTimestamp="2026-03-17 00:41:35 +0000 UTC" firstStartedPulling="2026-03-17 00:41:37.104309325 +0000 UTC m=+1171.863761608" lastFinishedPulling="2026-03-17 00:41:51.617910356 +0000 UTC m=+1186.377362639" observedRunningTime="2026-03-17 00:41:54.310561998 +0000 UTC m=+1189.070014281" watchObservedRunningTime="2026-03-17 00:41:54.353883625 +0000 UTC m=+1189.113335908" Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.374196 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-549b96fcbd-bklr6" podStartSLOduration=4.754384591 podStartE2EDuration="19.3741752s" podCreationTimestamp="2026-03-17 00:41:35 +0000 UTC" firstStartedPulling="2026-03-17 00:41:37.090277824 +0000 UTC m=+1171.849730107" lastFinishedPulling="2026-03-17 00:41:51.710068433 +0000 UTC m=+1186.469520716" observedRunningTime="2026-03-17 00:41:54.355639843 +0000 UTC m=+1189.115092126" watchObservedRunningTime="2026-03-17 00:41:54.3741752 +0000 UTC m=+1189.133627483" Mar 17 00:41:54 crc kubenswrapper[4755]: I0317 00:41:54.398261 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-zg9n4" podStartSLOduration=5.394857446 podStartE2EDuration="20.398246051s" podCreationTimestamp="2026-03-17 00:41:34 +0000 UTC" firstStartedPulling="2026-03-17 00:41:36.606154647 +0000 UTC m=+1171.365606930" lastFinishedPulling="2026-03-17 00:41:51.609543252 +0000 UTC m=+1186.368995535" observedRunningTime="2026-03-17 00:41:54.396130721 +0000 UTC m=+1189.155583024" watchObservedRunningTime="2026-03-17 00:41:54.398246051 +0000 UTC m=+1189.157698324" Mar 17 00:41:58 crc kubenswrapper[4755]: I0317 00:41:58.169002 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-54tw8" event={"ID":"599a37c9-2c1a-4b46-8cfd-1e8c5ea709a4","Type":"ContainerStarted","Data":"e5bb49677923204499e26d8d79cc48d37c37fa9ac043793f79ca21ce77df594d"} Mar 17 00:41:58 crc kubenswrapper[4755]: I0317 00:41:58.170714 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-54tw8" Mar 17 00:41:58 crc kubenswrapper[4755]: I0317 00:41:58.174127 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4jtxc" event={"ID":"eb7e1883-c95a-4d25-894d-be127f5d4cf3","Type":"ContainerStarted","Data":"2bc8b46658168e8d8782d1ff9ea5d2c2efefdcea0d7c7642280216b82228867c"} Mar 17 00:41:58 crc kubenswrapper[4755]: I0317 00:41:58.174369 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4jtxc" Mar 17 00:41:58 crc kubenswrapper[4755]: I0317 00:41:58.175510 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6pqsg" event={"ID":"35b061e4-ec9c-46e3-828c-d787922370f9","Type":"ContainerStarted","Data":"ce265058825688daccbacbbefb5b0f63c0281449cc974bff72d7f2d42e97ad92"} Mar 17 00:41:58 crc kubenswrapper[4755]: I0317 00:41:58.175668 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6pqsg" Mar 17 00:41:58 crc kubenswrapper[4755]: I0317 00:41:58.191271 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-54tw8" podStartSLOduration=3.114831276 podStartE2EDuration="23.191252867s" podCreationTimestamp="2026-03-17 00:41:35 +0000 UTC" firstStartedPulling="2026-03-17 00:41:37.213518607 +0000 UTC m=+1171.972970900" 
lastFinishedPulling="2026-03-17 00:41:57.289940208 +0000 UTC m=+1192.049392491" observedRunningTime="2026-03-17 00:41:58.184658833 +0000 UTC m=+1192.944111136" watchObservedRunningTime="2026-03-17 00:41:58.191252867 +0000 UTC m=+1192.950705150" Mar 17 00:41:58 crc kubenswrapper[4755]: I0317 00:41:58.201671 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6pqsg" podStartSLOduration=3.040659769 podStartE2EDuration="23.201652756s" podCreationTimestamp="2026-03-17 00:41:35 +0000 UTC" firstStartedPulling="2026-03-17 00:41:37.131091351 +0000 UTC m=+1171.890543624" lastFinishedPulling="2026-03-17 00:41:57.292084318 +0000 UTC m=+1192.051536611" observedRunningTime="2026-03-17 00:41:58.200583866 +0000 UTC m=+1192.960036149" watchObservedRunningTime="2026-03-17 00:41:58.201652756 +0000 UTC m=+1192.961105039" Mar 17 00:41:58 crc kubenswrapper[4755]: I0317 00:41:58.218097 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4jtxc" podStartSLOduration=3.04957538 podStartE2EDuration="23.218082504s" podCreationTimestamp="2026-03-17 00:41:35 +0000 UTC" firstStartedPulling="2026-03-17 00:41:37.113392179 +0000 UTC m=+1171.872844462" lastFinishedPulling="2026-03-17 00:41:57.281899303 +0000 UTC m=+1192.041351586" observedRunningTime="2026-03-17 00:41:58.214169575 +0000 UTC m=+1192.973621858" watchObservedRunningTime="2026-03-17 00:41:58.218082504 +0000 UTC m=+1192.977534787" Mar 17 00:41:58 crc kubenswrapper[4755]: I0317 00:41:58.665662 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:41:58 crc kubenswrapper[4755]: I0317 00:41:58.666121 4755 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:42:00 crc kubenswrapper[4755]: I0317 00:42:00.165667 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561802-thhsv"] Mar 17 00:42:00 crc kubenswrapper[4755]: I0317 00:42:00.167509 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561802-thhsv" Mar 17 00:42:00 crc kubenswrapper[4755]: I0317 00:42:00.170583 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 00:42:00 crc kubenswrapper[4755]: I0317 00:42:00.170635 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 00:42:00 crc kubenswrapper[4755]: I0317 00:42:00.170702 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 00:42:00 crc kubenswrapper[4755]: I0317 00:42:00.191317 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561802-thhsv"] Mar 17 00:42:00 crc kubenswrapper[4755]: I0317 00:42:00.244468 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dpdz\" (UniqueName: \"kubernetes.io/projected/de73bb77-da64-4a58-bf81-d617192672f2-kube-api-access-6dpdz\") pod \"auto-csr-approver-29561802-thhsv\" (UID: \"de73bb77-da64-4a58-bf81-d617192672f2\") " pod="openshift-infra/auto-csr-approver-29561802-thhsv" Mar 17 00:42:00 crc kubenswrapper[4755]: I0317 00:42:00.346999 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dpdz\" 
(UniqueName: \"kubernetes.io/projected/de73bb77-da64-4a58-bf81-d617192672f2-kube-api-access-6dpdz\") pod \"auto-csr-approver-29561802-thhsv\" (UID: \"de73bb77-da64-4a58-bf81-d617192672f2\") " pod="openshift-infra/auto-csr-approver-29561802-thhsv" Mar 17 00:42:00 crc kubenswrapper[4755]: I0317 00:42:00.379586 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dpdz\" (UniqueName: \"kubernetes.io/projected/de73bb77-da64-4a58-bf81-d617192672f2-kube-api-access-6dpdz\") pod \"auto-csr-approver-29561802-thhsv\" (UID: \"de73bb77-da64-4a58-bf81-d617192672f2\") " pod="openshift-infra/auto-csr-approver-29561802-thhsv" Mar 17 00:42:00 crc kubenswrapper[4755]: I0317 00:42:00.491617 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561802-thhsv" Mar 17 00:42:01 crc kubenswrapper[4755]: I0317 00:42:01.025715 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561802-thhsv"] Mar 17 00:42:01 crc kubenswrapper[4755]: I0317 00:42:01.198516 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561802-thhsv" event={"ID":"de73bb77-da64-4a58-bf81-d617192672f2","Type":"ContainerStarted","Data":"c218eb46cb9e60a6848513c2c0f2f21b5f0c27231e6c4210ea93686794bc1af7"} Mar 17 00:42:05 crc kubenswrapper[4755]: I0317 00:42:05.107980 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-r84f9" Mar 17 00:42:05 crc kubenswrapper[4755]: I0317 00:42:05.147060 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c9nzt" Mar 17 00:42:05 crc kubenswrapper[4755]: I0317 00:42:05.183829 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-x7lgv" Mar 
17 00:42:05 crc kubenswrapper[4755]: I0317 00:42:05.194902 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ztp6j" Mar 17 00:42:05 crc kubenswrapper[4755]: I0317 00:42:05.213795 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-8sd9w" Mar 17 00:42:05 crc kubenswrapper[4755]: I0317 00:42:05.262888 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-zg9n4" Mar 17 00:42:05 crc kubenswrapper[4755]: I0317 00:42:05.326161 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wvrth" Mar 17 00:42:05 crc kubenswrapper[4755]: I0317 00:42:05.475092 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-fmtx9" Mar 17 00:42:05 crc kubenswrapper[4755]: I0317 00:42:05.549056 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-csd45" Mar 17 00:42:05 crc kubenswrapper[4755]: I0317 00:42:05.549504 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bcm7p" Mar 17 00:42:05 crc kubenswrapper[4755]: I0317 00:42:05.593358 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-8s5l4" Mar 17 00:42:05 crc kubenswrapper[4755]: I0317 00:42:05.627303 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8gwqk" Mar 17 00:42:05 crc kubenswrapper[4755]: I0317 00:42:05.699629 
4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-bttcs" Mar 17 00:42:05 crc kubenswrapper[4755]: I0317 00:42:05.788981 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6pqsg" Mar 17 00:42:05 crc kubenswrapper[4755]: I0317 00:42:05.845732 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4jtxc" Mar 17 00:42:05 crc kubenswrapper[4755]: I0317 00:42:05.893834 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-549b96fcbd-bklr6" Mar 17 00:42:05 crc kubenswrapper[4755]: I0317 00:42:05.942861 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zlwf9" Mar 17 00:42:06 crc kubenswrapper[4755]: I0317 00:42:06.024712 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-54tw8" Mar 17 00:42:06 crc kubenswrapper[4755]: I0317 00:42:06.975285 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert\") pod \"infra-operator-controller-manager-7b9c774f96-hx685\" (UID: \"fea3f6b7-c840-4795-8ca2-9dba15a49df1\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" Mar 17 00:42:06 crc kubenswrapper[4755]: I0317 00:42:06.982106 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fea3f6b7-c840-4795-8ca2-9dba15a49df1-cert\") pod \"infra-operator-controller-manager-7b9c774f96-hx685\" (UID: \"fea3f6b7-c840-4795-8ca2-9dba15a49df1\") " 
pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" Mar 17 00:42:07 crc kubenswrapper[4755]: I0317 00:42:07.034493 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-xs8pf" Mar 17 00:42:07 crc kubenswrapper[4755]: I0317 00:42:07.042998 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" Mar 17 00:42:07 crc kubenswrapper[4755]: I0317 00:42:07.283654 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-kvfkt\" (UID: \"c2f21978-13ea-4441-ba13-2be2beec2f0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" Mar 17 00:42:07 crc kubenswrapper[4755]: I0317 00:42:07.292184 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2f21978-13ea-4441-ba13-2be2beec2f0a-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-kvfkt\" (UID: \"c2f21978-13ea-4441-ba13-2be2beec2f0a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" Mar 17 00:42:07 crc kubenswrapper[4755]: I0317 00:42:07.446875 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-nng82" Mar 17 00:42:07 crc kubenswrapper[4755]: I0317 00:42:07.455739 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" Mar 17 00:42:07 crc kubenswrapper[4755]: I0317 00:42:07.488365 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:42:07 crc kubenswrapper[4755]: I0317 00:42:07.492590 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-webhook-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:42:07 crc kubenswrapper[4755]: I0317 00:42:07.589888 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:42:07 crc kubenswrapper[4755]: I0317 00:42:07.595976 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eeea77af-84df-4778-8fe7-ddde0c1cda76-metrics-certs\") pod \"openstack-operator-controller-manager-6d474745d9-7q6lg\" (UID: \"eeea77af-84df-4778-8fe7-ddde0c1cda76\") " pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:42:07 crc kubenswrapper[4755]: I0317 00:42:07.892112 4755 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-s5cz2" Mar 17 00:42:07 crc kubenswrapper[4755]: I0317 00:42:07.897831 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:42:08 crc kubenswrapper[4755]: I0317 00:42:08.460764 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt"] Mar 17 00:42:08 crc kubenswrapper[4755]: W0317 00:42:08.464800 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2f21978_13ea_4441_ba13_2be2beec2f0a.slice/crio-9af5688484f815d9adecd01dec71b30b61ad3b3a99b577ddebf5ed23e7cd7bdb WatchSource:0}: Error finding container 9af5688484f815d9adecd01dec71b30b61ad3b3a99b577ddebf5ed23e7cd7bdb: Status 404 returned error can't find the container with id 9af5688484f815d9adecd01dec71b30b61ad3b3a99b577ddebf5ed23e7cd7bdb Mar 17 00:42:08 crc kubenswrapper[4755]: I0317 00:42:08.518137 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685"] Mar 17 00:42:08 crc kubenswrapper[4755]: W0317 00:42:08.524230 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfea3f6b7_c840_4795_8ca2_9dba15a49df1.slice/crio-d8d16388817e64553a8ca506e6c188ea431a14b622908dbaac1eb4d23acbb280 WatchSource:0}: Error finding container d8d16388817e64553a8ca506e6c188ea431a14b622908dbaac1eb4d23acbb280: Status 404 returned error can't find the container with id d8d16388817e64553a8ca506e6c188ea431a14b622908dbaac1eb4d23acbb280 Mar 17 00:42:08 crc kubenswrapper[4755]: I0317 00:42:08.544713 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg"] Mar 17 00:42:08 crc kubenswrapper[4755]: W0317 00:42:08.551376 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeea77af_84df_4778_8fe7_ddde0c1cda76.slice/crio-2510d7d24ddae1793a94defd75721107d48ec2f1ae2822b67b4c0844410b55fe WatchSource:0}: Error finding container 2510d7d24ddae1793a94defd75721107d48ec2f1ae2822b67b4c0844410b55fe: Status 404 returned error can't find the container with id 2510d7d24ddae1793a94defd75721107d48ec2f1ae2822b67b4c0844410b55fe Mar 17 00:42:09 crc kubenswrapper[4755]: I0317 00:42:09.276198 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" event={"ID":"eeea77af-84df-4778-8fe7-ddde0c1cda76","Type":"ContainerStarted","Data":"2510d7d24ddae1793a94defd75721107d48ec2f1ae2822b67b4c0844410b55fe"} Mar 17 00:42:09 crc kubenswrapper[4755]: I0317 00:42:09.279041 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" event={"ID":"fea3f6b7-c840-4795-8ca2-9dba15a49df1","Type":"ContainerStarted","Data":"d8d16388817e64553a8ca506e6c188ea431a14b622908dbaac1eb4d23acbb280"} Mar 17 00:42:09 crc kubenswrapper[4755]: I0317 00:42:09.281066 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" event={"ID":"c2f21978-13ea-4441-ba13-2be2beec2f0a","Type":"ContainerStarted","Data":"9af5688484f815d9adecd01dec71b30b61ad3b3a99b577ddebf5ed23e7cd7bdb"} Mar 17 00:42:10 crc kubenswrapper[4755]: I0317 00:42:10.297072 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" 
event={"ID":"eeea77af-84df-4778-8fe7-ddde0c1cda76","Type":"ContainerStarted","Data":"fe8531f23f9529573536c1e7c314db8cc75197d6e1ebb1d7c0271db810900e1b"} Mar 17 00:42:10 crc kubenswrapper[4755]: I0317 00:42:10.297405 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:42:10 crc kubenswrapper[4755]: I0317 00:42:10.336395 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" podStartSLOduration=35.336315596 podStartE2EDuration="35.336315596s" podCreationTimestamp="2026-03-17 00:41:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:42:10.331055849 +0000 UTC m=+1205.090508172" watchObservedRunningTime="2026-03-17 00:42:10.336315596 +0000 UTC m=+1205.095767899" Mar 17 00:42:11 crc kubenswrapper[4755]: I0317 00:42:11.306207 4755 generic.go:334] "Generic (PLEG): container finished" podID="de73bb77-da64-4a58-bf81-d617192672f2" containerID="e9fea430d485f9b2eaa3aee2eb7b8e070f9bea45ff99708e98a7401ea0e25d3a" exitCode=0 Mar 17 00:42:11 crc kubenswrapper[4755]: I0317 00:42:11.306390 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561802-thhsv" event={"ID":"de73bb77-da64-4a58-bf81-d617192672f2","Type":"ContainerDied","Data":"e9fea430d485f9b2eaa3aee2eb7b8e070f9bea45ff99708e98a7401ea0e25d3a"} Mar 17 00:42:11 crc kubenswrapper[4755]: I0317 00:42:11.314405 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-kkr5x" event={"ID":"ec169260-a79f-4a21-b78f-41fba2f8956e","Type":"ContainerStarted","Data":"5272c88c127e6f1d00a8d96252848e9065e65bc4d7280f88ae2975bf23737483"} Mar 17 00:42:11 crc kubenswrapper[4755]: I0317 00:42:11.314965 4755 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-kkr5x" Mar 17 00:42:11 crc kubenswrapper[4755]: I0317 00:42:11.344881 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-kkr5x" podStartSLOduration=2.992704595 podStartE2EDuration="37.344861481s" podCreationTimestamp="2026-03-17 00:41:34 +0000 UTC" firstStartedPulling="2026-03-17 00:41:36.334273063 +0000 UTC m=+1171.093725336" lastFinishedPulling="2026-03-17 00:42:10.686429929 +0000 UTC m=+1205.445882222" observedRunningTime="2026-03-17 00:42:11.339557594 +0000 UTC m=+1206.099009897" watchObservedRunningTime="2026-03-17 00:42:11.344861481 +0000 UTC m=+1206.104313764" Mar 17 00:42:13 crc kubenswrapper[4755]: I0317 00:42:13.182400 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561802-thhsv" Mar 17 00:42:13 crc kubenswrapper[4755]: I0317 00:42:13.296505 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dpdz\" (UniqueName: \"kubernetes.io/projected/de73bb77-da64-4a58-bf81-d617192672f2-kube-api-access-6dpdz\") pod \"de73bb77-da64-4a58-bf81-d617192672f2\" (UID: \"de73bb77-da64-4a58-bf81-d617192672f2\") " Mar 17 00:42:13 crc kubenswrapper[4755]: I0317 00:42:13.305471 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de73bb77-da64-4a58-bf81-d617192672f2-kube-api-access-6dpdz" (OuterVolumeSpecName: "kube-api-access-6dpdz") pod "de73bb77-da64-4a58-bf81-d617192672f2" (UID: "de73bb77-da64-4a58-bf81-d617192672f2"). InnerVolumeSpecName "kube-api-access-6dpdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:42:13 crc kubenswrapper[4755]: I0317 00:42:13.336660 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561802-thhsv" event={"ID":"de73bb77-da64-4a58-bf81-d617192672f2","Type":"ContainerDied","Data":"c218eb46cb9e60a6848513c2c0f2f21b5f0c27231e6c4210ea93686794bc1af7"} Mar 17 00:42:13 crc kubenswrapper[4755]: I0317 00:42:13.336710 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c218eb46cb9e60a6848513c2c0f2f21b5f0c27231e6c4210ea93686794bc1af7" Mar 17 00:42:13 crc kubenswrapper[4755]: I0317 00:42:13.336774 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561802-thhsv" Mar 17 00:42:13 crc kubenswrapper[4755]: I0317 00:42:13.398911 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dpdz\" (UniqueName: \"kubernetes.io/projected/de73bb77-da64-4a58-bf81-d617192672f2-kube-api-access-6dpdz\") on node \"crc\" DevicePath \"\"" Mar 17 00:42:14 crc kubenswrapper[4755]: I0317 00:42:14.260919 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561796-jpjbh"] Mar 17 00:42:14 crc kubenswrapper[4755]: I0317 00:42:14.269510 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561796-jpjbh"] Mar 17 00:42:14 crc kubenswrapper[4755]: I0317 00:42:14.344988 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" event={"ID":"c2f21978-13ea-4441-ba13-2be2beec2f0a","Type":"ContainerStarted","Data":"480efb8ce9a283ccc63cc0ec7924ab9bb057f651002cc4b74f8a9784c665d3e9"} Mar 17 00:42:14 crc kubenswrapper[4755]: I0317 00:42:14.345140 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" Mar 17 00:42:14 crc kubenswrapper[4755]: I0317 00:42:14.346526 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" event={"ID":"fea3f6b7-c840-4795-8ca2-9dba15a49df1","Type":"ContainerStarted","Data":"cf1ed9d9009e143a286b4720f35c81ab0749b5d4babd3ee7a3154685e4858697"} Mar 17 00:42:14 crc kubenswrapper[4755]: I0317 00:42:14.346682 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" Mar 17 00:42:14 crc kubenswrapper[4755]: I0317 00:42:14.393934 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" podStartSLOduration=34.713961329 podStartE2EDuration="39.393904823s" podCreationTimestamp="2026-03-17 00:41:35 +0000 UTC" firstStartedPulling="2026-03-17 00:42:08.467082692 +0000 UTC m=+1203.226534975" lastFinishedPulling="2026-03-17 00:42:13.147026166 +0000 UTC m=+1207.906478469" observedRunningTime="2026-03-17 00:42:14.38159657 +0000 UTC m=+1209.141048873" watchObservedRunningTime="2026-03-17 00:42:14.393904823 +0000 UTC m=+1209.153357136" Mar 17 00:42:14 crc kubenswrapper[4755]: I0317 00:42:14.402449 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" podStartSLOduration=35.792050094 podStartE2EDuration="40.40241579s" podCreationTimestamp="2026-03-17 00:41:34 +0000 UTC" firstStartedPulling="2026-03-17 00:42:08.526986721 +0000 UTC m=+1203.286439004" lastFinishedPulling="2026-03-17 00:42:13.137352407 +0000 UTC m=+1207.896804700" observedRunningTime="2026-03-17 00:42:14.399031066 +0000 UTC m=+1209.158483349" watchObservedRunningTime="2026-03-17 00:42:14.40241579 +0000 UTC m=+1209.161868073" Mar 17 00:42:15 crc 
kubenswrapper[4755]: I0317 00:42:15.217507 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-kkr5x" Mar 17 00:42:16 crc kubenswrapper[4755]: I0317 00:42:16.262067 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31c22c06-348a-44e8-9c8b-0e995aa82739" path="/var/lib/kubelet/pods/31c22c06-348a-44e8-9c8b-0e995aa82739/volumes" Mar 17 00:42:17 crc kubenswrapper[4755]: I0317 00:42:17.905906 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6d474745d9-7q6lg" Mar 17 00:42:23 crc kubenswrapper[4755]: I0317 00:42:23.217373 4755 scope.go:117] "RemoveContainer" containerID="1005d371d254a562b58cfa0a91ef63c86197b144c9a8107e560d9391cbffdf93" Mar 17 00:42:27 crc kubenswrapper[4755]: I0317 00:42:27.051282 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-hx685" Mar 17 00:42:27 crc kubenswrapper[4755]: I0317 00:42:27.465319 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-kvfkt" Mar 17 00:42:28 crc kubenswrapper[4755]: I0317 00:42:28.665772 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:42:28 crc kubenswrapper[4755]: I0317 00:42:28.665865 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.124798 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lggjd"] Mar 17 00:42:46 crc kubenswrapper[4755]: E0317 00:42:46.128847 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de73bb77-da64-4a58-bf81-d617192672f2" containerName="oc" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.128888 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="de73bb77-da64-4a58-bf81-d617192672f2" containerName="oc" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.129197 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="de73bb77-da64-4a58-bf81-d617192672f2" containerName="oc" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.130303 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lggjd" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.138782 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.139229 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-dck64" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.139496 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.139700 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.151396 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lggjd"] Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.191294 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sbrv2"] Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.193653 
4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sbrv2" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.197556 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.201305 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sbrv2"] Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.301197 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmw7v\" (UniqueName: \"kubernetes.io/projected/d79db2fa-5f14-408a-8823-db5237d68bd0-kube-api-access-tmw7v\") pod \"dnsmasq-dns-675f4bcbfc-lggjd\" (UID: \"d79db2fa-5f14-408a-8823-db5237d68bd0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lggjd" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.301259 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sbrv2\" (UID: \"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sbrv2" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.301293 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h646z\" (UniqueName: \"kubernetes.io/projected/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-kube-api-access-h646z\") pod \"dnsmasq-dns-78dd6ddcc-sbrv2\" (UID: \"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sbrv2" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.301564 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d79db2fa-5f14-408a-8823-db5237d68bd0-config\") pod \"dnsmasq-dns-675f4bcbfc-lggjd\" (UID: 
\"d79db2fa-5f14-408a-8823-db5237d68bd0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lggjd" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.301669 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-config\") pod \"dnsmasq-dns-78dd6ddcc-sbrv2\" (UID: \"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sbrv2" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.403659 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d79db2fa-5f14-408a-8823-db5237d68bd0-config\") pod \"dnsmasq-dns-675f4bcbfc-lggjd\" (UID: \"d79db2fa-5f14-408a-8823-db5237d68bd0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lggjd" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.403722 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-config\") pod \"dnsmasq-dns-78dd6ddcc-sbrv2\" (UID: \"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sbrv2" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.403788 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmw7v\" (UniqueName: \"kubernetes.io/projected/d79db2fa-5f14-408a-8823-db5237d68bd0-kube-api-access-tmw7v\") pod \"dnsmasq-dns-675f4bcbfc-lggjd\" (UID: \"d79db2fa-5f14-408a-8823-db5237d68bd0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lggjd" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.403819 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sbrv2\" (UID: \"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sbrv2" 
Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.403852 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h646z\" (UniqueName: \"kubernetes.io/projected/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-kube-api-access-h646z\") pod \"dnsmasq-dns-78dd6ddcc-sbrv2\" (UID: \"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sbrv2" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.404823 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sbrv2\" (UID: \"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sbrv2" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.404888 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-config\") pod \"dnsmasq-dns-78dd6ddcc-sbrv2\" (UID: \"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sbrv2" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.405324 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d79db2fa-5f14-408a-8823-db5237d68bd0-config\") pod \"dnsmasq-dns-675f4bcbfc-lggjd\" (UID: \"d79db2fa-5f14-408a-8823-db5237d68bd0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lggjd" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.421730 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmw7v\" (UniqueName: \"kubernetes.io/projected/d79db2fa-5f14-408a-8823-db5237d68bd0-kube-api-access-tmw7v\") pod \"dnsmasq-dns-675f4bcbfc-lggjd\" (UID: \"d79db2fa-5f14-408a-8823-db5237d68bd0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lggjd" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.423246 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h646z\" (UniqueName: \"kubernetes.io/projected/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-kube-api-access-h646z\") pod \"dnsmasq-dns-78dd6ddcc-sbrv2\" (UID: \"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sbrv2" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.452021 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lggjd" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.507307 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sbrv2" Mar 17 00:42:46 crc kubenswrapper[4755]: I0317 00:42:46.932817 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lggjd"] Mar 17 00:42:47 crc kubenswrapper[4755]: I0317 00:42:47.002133 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sbrv2"] Mar 17 00:42:47 crc kubenswrapper[4755]: W0317 00:42:47.007162 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ddc3f28_11c2_43c1_8b0d_1db18fadcde6.slice/crio-b2fb95a4b0029dcc831af1a7b778b69e6918c455e37cd179848e2fefcb966357 WatchSource:0}: Error finding container b2fb95a4b0029dcc831af1a7b778b69e6918c455e37cd179848e2fefcb966357: Status 404 returned error can't find the container with id b2fb95a4b0029dcc831af1a7b778b69e6918c455e37cd179848e2fefcb966357 Mar 17 00:42:47 crc kubenswrapper[4755]: I0317 00:42:47.699788 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sbrv2" event={"ID":"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6","Type":"ContainerStarted","Data":"b2fb95a4b0029dcc831af1a7b778b69e6918c455e37cd179848e2fefcb966357"} Mar 17 00:42:47 crc kubenswrapper[4755]: I0317 00:42:47.703043 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-675f4bcbfc-lggjd" event={"ID":"d79db2fa-5f14-408a-8823-db5237d68bd0","Type":"ContainerStarted","Data":"30f2d6dcea56d8d79aa49f85e78d3d83f34fb9efc66b59cf41fac46c545bd090"} Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.019355 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lggjd"] Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.055904 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h5l8n"] Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.058032 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.073187 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h5l8n"] Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.151271 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bd6657d-d56c-45e7-92cb-ece9a3890c85-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-h5l8n\" (UID: \"6bd6657d-d56c-45e7-92cb-ece9a3890c85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.151341 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd6657d-d56c-45e7-92cb-ece9a3890c85-config\") pod \"dnsmasq-dns-5ccc8479f9-h5l8n\" (UID: \"6bd6657d-d56c-45e7-92cb-ece9a3890c85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.151638 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbn8c\" (UniqueName: \"kubernetes.io/projected/6bd6657d-d56c-45e7-92cb-ece9a3890c85-kube-api-access-lbn8c\") pod \"dnsmasq-dns-5ccc8479f9-h5l8n\" (UID: 
\"6bd6657d-d56c-45e7-92cb-ece9a3890c85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.254233 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bd6657d-d56c-45e7-92cb-ece9a3890c85-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-h5l8n\" (UID: \"6bd6657d-d56c-45e7-92cb-ece9a3890c85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.254273 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd6657d-d56c-45e7-92cb-ece9a3890c85-config\") pod \"dnsmasq-dns-5ccc8479f9-h5l8n\" (UID: \"6bd6657d-d56c-45e7-92cb-ece9a3890c85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.254320 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbn8c\" (UniqueName: \"kubernetes.io/projected/6bd6657d-d56c-45e7-92cb-ece9a3890c85-kube-api-access-lbn8c\") pod \"dnsmasq-dns-5ccc8479f9-h5l8n\" (UID: \"6bd6657d-d56c-45e7-92cb-ece9a3890c85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.255846 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd6657d-d56c-45e7-92cb-ece9a3890c85-config\") pod \"dnsmasq-dns-5ccc8479f9-h5l8n\" (UID: \"6bd6657d-d56c-45e7-92cb-ece9a3890c85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.257175 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bd6657d-d56c-45e7-92cb-ece9a3890c85-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-h5l8n\" (UID: \"6bd6657d-d56c-45e7-92cb-ece9a3890c85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" Mar 17 00:42:49 crc 
kubenswrapper[4755]: I0317 00:42:49.274484 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sbrv2"] Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.293607 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbn8c\" (UniqueName: \"kubernetes.io/projected/6bd6657d-d56c-45e7-92cb-ece9a3890c85-kube-api-access-lbn8c\") pod \"dnsmasq-dns-5ccc8479f9-h5l8n\" (UID: \"6bd6657d-d56c-45e7-92cb-ece9a3890c85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.299986 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-26pzj"] Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.301333 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.311115 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-26pzj"] Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.379855 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.459184 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-config\") pod \"dnsmasq-dns-57d769cc4f-26pzj\" (UID: \"6d55848a-c15d-4ed3-899b-bfcbb45f13ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.459243 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxnz7\" (UniqueName: \"kubernetes.io/projected/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-kube-api-access-hxnz7\") pod \"dnsmasq-dns-57d769cc4f-26pzj\" (UID: \"6d55848a-c15d-4ed3-899b-bfcbb45f13ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.459297 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-26pzj\" (UID: \"6d55848a-c15d-4ed3-899b-bfcbb45f13ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.571558 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-config\") pod \"dnsmasq-dns-57d769cc4f-26pzj\" (UID: \"6d55848a-c15d-4ed3-899b-bfcbb45f13ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.571623 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxnz7\" (UniqueName: \"kubernetes.io/projected/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-kube-api-access-hxnz7\") pod \"dnsmasq-dns-57d769cc4f-26pzj\" (UID: 
\"6d55848a-c15d-4ed3-899b-bfcbb45f13ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.571692 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-26pzj\" (UID: \"6d55848a-c15d-4ed3-899b-bfcbb45f13ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.572526 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-26pzj\" (UID: \"6d55848a-c15d-4ed3-899b-bfcbb45f13ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.572522 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-config\") pod \"dnsmasq-dns-57d769cc4f-26pzj\" (UID: \"6d55848a-c15d-4ed3-899b-bfcbb45f13ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.591987 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxnz7\" (UniqueName: \"kubernetes.io/projected/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-kube-api-access-hxnz7\") pod \"dnsmasq-dns-57d769cc4f-26pzj\" (UID: \"6d55848a-c15d-4ed3-899b-bfcbb45f13ff\") " pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.624357 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" Mar 17 00:42:49 crc kubenswrapper[4755]: I0317 00:42:49.888507 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h5l8n"] Mar 17 00:42:49 crc kubenswrapper[4755]: W0317 00:42:49.906737 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bd6657d_d56c_45e7_92cb_ece9a3890c85.slice/crio-1f44cbbbcd7631fa47615a3031abceed8d764dc492ff85acd51b023a111b1234 WatchSource:0}: Error finding container 1f44cbbbcd7631fa47615a3031abceed8d764dc492ff85acd51b023a111b1234: Status 404 returned error can't find the container with id 1f44cbbbcd7631fa47615a3031abceed8d764dc492ff85acd51b023a111b1234 Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.044951 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-26pzj"] Mar 17 00:42:50 crc kubenswrapper[4755]: W0317 00:42:50.048837 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d55848a_c15d_4ed3_899b_bfcbb45f13ff.slice/crio-3cf638d904151450d51699fcc8934b3283b85a366e4f3c38efbe283204a9e098 WatchSource:0}: Error finding container 3cf638d904151450d51699fcc8934b3283b85a366e4f3c38efbe283204a9e098: Status 404 returned error can't find the container with id 3cf638d904151450d51699fcc8934b3283b85a366e4f3c38efbe283204a9e098 Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.172378 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.181855 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.181865 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.184685 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.184835 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.185073 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.185231 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.185332 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.185489 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xxj7x" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.188108 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.282543 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.282595 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.282627 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.282700 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.282752 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.282776 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.282849 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.282875 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwmvb\" (UniqueName: \"kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-kube-api-access-pwmvb\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.282906 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.282950 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.283054 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.384652 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.384713 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.384808 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.384830 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwmvb\" (UniqueName: \"kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-kube-api-access-pwmvb\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.384852 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.384898 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 
17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.384923 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.384942 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.384958 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.384985 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.385028 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.387221 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.387615 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.387916 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.390194 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.390883 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.394506 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.395088 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.395262 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.410263 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.421086 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwmvb\" (UniqueName: \"kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-kube-api-access-pwmvb\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.438000 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.443476 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.474496 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.476045 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.481244 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.481555 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.481679 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7hx8r" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.481800 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.481961 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.482110 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.488986 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.502938 4755 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.508824 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.595235 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.595296 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.595323 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-server-conf\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.595352 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/890b1d99-1a82-424e-981b-5c8ea1ae26ee-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.595380 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.595409 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbnb7\" (UniqueName: \"kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-kube-api-access-lbnb7\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.595425 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/890b1d99-1a82-424e-981b-5c8ea1ae26ee-pod-info\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.595473 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.595489 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.595513 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.595531 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-config-data\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.696416 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.697089 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.697130 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.697156 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-config-data\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc 
kubenswrapper[4755]: I0317 00:42:50.697259 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.697299 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.697329 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-server-conf\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.697365 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/890b1d99-1a82-424e-981b-5c8ea1ae26ee-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.697408 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.697458 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lbnb7\" (UniqueName: \"kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-kube-api-access-lbnb7\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.697489 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/890b1d99-1a82-424e-981b-5c8ea1ae26ee-pod-info\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.698177 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.698507 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.698797 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.699631 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") device 
mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.699701 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-config-data\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.699725 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-server-conf\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.701901 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/890b1d99-1a82-424e-981b-5c8ea1ae26ee-pod-info\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.714088 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.718007 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/890b1d99-1a82-424e-981b-5c8ea1ae26ee-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.720691 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.721286 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbnb7\" (UniqueName: \"kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-kube-api-access-lbnb7\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.736981 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " pod="openstack/rabbitmq-server-0" Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.777235 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" event={"ID":"6bd6657d-d56c-45e7-92cb-ece9a3890c85","Type":"ContainerStarted","Data":"1f44cbbbcd7631fa47615a3031abceed8d764dc492ff85acd51b023a111b1234"} Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.778784 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" event={"ID":"6d55848a-c15d-4ed3-899b-bfcbb45f13ff","Type":"ContainerStarted","Data":"3cf638d904151450d51699fcc8934b3283b85a366e4f3c38efbe283204a9e098"} Mar 17 00:42:50 crc kubenswrapper[4755]: I0317 00:42:50.807654 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.082320 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.554733 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.556153 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.560406 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.560687 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.561267 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.573389 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-f7wg6" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.574328 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.574745 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.719008 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-config-data-default\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 
00:42:51.719079 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.719107 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8j7r\" (UniqueName: \"kubernetes.io/projected/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-kube-api-access-x8j7r\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.719182 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.719222 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.719254 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.719281 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-kolla-config\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.719314 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.820895 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-config-data-default\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.820970 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.821007 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8j7r\" (UniqueName: \"kubernetes.io/projected/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-kube-api-access-x8j7r\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.821053 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.821096 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.821126 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.821153 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-kolla-config\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.821187 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.822035 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.822735 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.822944 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-kolla-config\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.823338 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.842248 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.878032 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.879632 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.891882 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8j7r\" (UniqueName: \"kubernetes.io/projected/e48be2ab-6e3e-4a75-b47e-e700bd4126f1-kube-api-access-x8j7r\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:51 crc kubenswrapper[4755]: I0317 00:42:51.899491 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"e48be2ab-6e3e-4a75-b47e-e700bd4126f1\") " pod="openstack/openstack-galera-0" Mar 17 00:42:52 crc kubenswrapper[4755]: I0317 00:42:52.203666 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 17 00:42:52 crc kubenswrapper[4755]: I0317 00:42:52.988650 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 17 00:42:52 crc kubenswrapper[4755]: I0317 00:42:52.990409 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.003180 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-6t6jp" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.003592 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.003838 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.004048 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.015472 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.042921 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw6d6\" (UniqueName: \"kubernetes.io/projected/dfea0511-e194-48c8-8795-58d07ada5d4c-kube-api-access-bw6d6\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.042982 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfea0511-e194-48c8-8795-58d07ada5d4c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.043009 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/dfea0511-e194-48c8-8795-58d07ada5d4c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.043051 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.043074 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfea0511-e194-48c8-8795-58d07ada5d4c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.043099 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dfea0511-e194-48c8-8795-58d07ada5d4c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.043151 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dfea0511-e194-48c8-8795-58d07ada5d4c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.043184 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/dfea0511-e194-48c8-8795-58d07ada5d4c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.144671 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dfea0511-e194-48c8-8795-58d07ada5d4c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.144727 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfea0511-e194-48c8-8795-58d07ada5d4c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.144750 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw6d6\" (UniqueName: \"kubernetes.io/projected/dfea0511-e194-48c8-8795-58d07ada5d4c-kube-api-access-bw6d6\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.144794 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfea0511-e194-48c8-8795-58d07ada5d4c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.144813 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/dfea0511-e194-48c8-8795-58d07ada5d4c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.144853 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.144873 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfea0511-e194-48c8-8795-58d07ada5d4c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.144897 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dfea0511-e194-48c8-8795-58d07ada5d4c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.145604 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.145700 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dfea0511-e194-48c8-8795-58d07ada5d4c-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.145945 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dfea0511-e194-48c8-8795-58d07ada5d4c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.146880 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfea0511-e194-48c8-8795-58d07ada5d4c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.148806 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfea0511-e194-48c8-8795-58d07ada5d4c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.149975 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfea0511-e194-48c8-8795-58d07ada5d4c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.161036 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dfea0511-e194-48c8-8795-58d07ada5d4c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " 
pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.165578 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw6d6\" (UniqueName: \"kubernetes.io/projected/dfea0511-e194-48c8-8795-58d07ada5d4c-kube-api-access-bw6d6\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.167094 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"dfea0511-e194-48c8-8795-58d07ada5d4c\") " pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.230008 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.231061 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.232884 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.233456 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-km828" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.233676 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.247944 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.353203 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.354623 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf824bbf-6a94-4505-a9cb-67e9e394f2e1-config-data\") pod \"memcached-0\" (UID: \"cf824bbf-6a94-4505-a9cb-67e9e394f2e1\") " pod="openstack/memcached-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.354694 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf824bbf-6a94-4505-a9cb-67e9e394f2e1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cf824bbf-6a94-4505-a9cb-67e9e394f2e1\") " pod="openstack/memcached-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.354721 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnt47\" (UniqueName: \"kubernetes.io/projected/cf824bbf-6a94-4505-a9cb-67e9e394f2e1-kube-api-access-pnt47\") pod \"memcached-0\" (UID: \"cf824bbf-6a94-4505-a9cb-67e9e394f2e1\") " pod="openstack/memcached-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.354792 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf824bbf-6a94-4505-a9cb-67e9e394f2e1-kolla-config\") pod \"memcached-0\" (UID: \"cf824bbf-6a94-4505-a9cb-67e9e394f2e1\") " pod="openstack/memcached-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.354825 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf824bbf-6a94-4505-a9cb-67e9e394f2e1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cf824bbf-6a94-4505-a9cb-67e9e394f2e1\") " pod="openstack/memcached-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 
00:42:53.455862 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf824bbf-6a94-4505-a9cb-67e9e394f2e1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cf824bbf-6a94-4505-a9cb-67e9e394f2e1\") " pod="openstack/memcached-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.456144 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf824bbf-6a94-4505-a9cb-67e9e394f2e1-config-data\") pod \"memcached-0\" (UID: \"cf824bbf-6a94-4505-a9cb-67e9e394f2e1\") " pod="openstack/memcached-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.456288 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf824bbf-6a94-4505-a9cb-67e9e394f2e1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cf824bbf-6a94-4505-a9cb-67e9e394f2e1\") " pod="openstack/memcached-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.456429 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnt47\" (UniqueName: \"kubernetes.io/projected/cf824bbf-6a94-4505-a9cb-67e9e394f2e1-kube-api-access-pnt47\") pod \"memcached-0\" (UID: \"cf824bbf-6a94-4505-a9cb-67e9e394f2e1\") " pod="openstack/memcached-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.456669 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf824bbf-6a94-4505-a9cb-67e9e394f2e1-kolla-config\") pod \"memcached-0\" (UID: \"cf824bbf-6a94-4505-a9cb-67e9e394f2e1\") " pod="openstack/memcached-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.457188 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf824bbf-6a94-4505-a9cb-67e9e394f2e1-config-data\") pod 
\"memcached-0\" (UID: \"cf824bbf-6a94-4505-a9cb-67e9e394f2e1\") " pod="openstack/memcached-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.457567 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf824bbf-6a94-4505-a9cb-67e9e394f2e1-kolla-config\") pod \"memcached-0\" (UID: \"cf824bbf-6a94-4505-a9cb-67e9e394f2e1\") " pod="openstack/memcached-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.460515 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf824bbf-6a94-4505-a9cb-67e9e394f2e1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cf824bbf-6a94-4505-a9cb-67e9e394f2e1\") " pod="openstack/memcached-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.474431 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf824bbf-6a94-4505-a9cb-67e9e394f2e1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cf824bbf-6a94-4505-a9cb-67e9e394f2e1\") " pod="openstack/memcached-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.476093 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnt47\" (UniqueName: \"kubernetes.io/projected/cf824bbf-6a94-4505-a9cb-67e9e394f2e1-kube-api-access-pnt47\") pod \"memcached-0\" (UID: \"cf824bbf-6a94-4505-a9cb-67e9e394f2e1\") " pod="openstack/memcached-0" Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.549455 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 17 00:42:53 crc kubenswrapper[4755]: W0317 00:42:53.579223 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b3bb7d6_8094_4cef_a05b_6bad26c2d14a.slice/crio-0e0a99034e2a6c444ebe3cb56083f1b11c8af84a6f3f10be481b79351ce70dad WatchSource:0}: Error finding container 0e0a99034e2a6c444ebe3cb56083f1b11c8af84a6f3f10be481b79351ce70dad: Status 404 returned error can't find the container with id 0e0a99034e2a6c444ebe3cb56083f1b11c8af84a6f3f10be481b79351ce70dad Mar 17 00:42:53 crc kubenswrapper[4755]: I0317 00:42:53.815074 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a","Type":"ContainerStarted","Data":"0e0a99034e2a6c444ebe3cb56083f1b11c8af84a6f3f10be481b79351ce70dad"} Mar 17 00:42:55 crc kubenswrapper[4755]: I0317 00:42:55.429904 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 00:42:55 crc kubenswrapper[4755]: I0317 00:42:55.431170 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 17 00:42:55 crc kubenswrapper[4755]: I0317 00:42:55.433545 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-4bt5k" Mar 17 00:42:55 crc kubenswrapper[4755]: I0317 00:42:55.442002 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 00:42:55 crc kubenswrapper[4755]: I0317 00:42:55.502141 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfb2g\" (UniqueName: \"kubernetes.io/projected/cd3ff4b6-bb9e-446c-9221-f80d9ec5a820-kube-api-access-nfb2g\") pod \"kube-state-metrics-0\" (UID: \"cd3ff4b6-bb9e-446c-9221-f80d9ec5a820\") " pod="openstack/kube-state-metrics-0" Mar 17 00:42:55 crc kubenswrapper[4755]: I0317 00:42:55.606739 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfb2g\" (UniqueName: \"kubernetes.io/projected/cd3ff4b6-bb9e-446c-9221-f80d9ec5a820-kube-api-access-nfb2g\") pod \"kube-state-metrics-0\" (UID: \"cd3ff4b6-bb9e-446c-9221-f80d9ec5a820\") " pod="openstack/kube-state-metrics-0" Mar 17 00:42:55 crc kubenswrapper[4755]: I0317 00:42:55.644252 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfb2g\" (UniqueName: \"kubernetes.io/projected/cd3ff4b6-bb9e-446c-9221-f80d9ec5a820-kube-api-access-nfb2g\") pod \"kube-state-metrics-0\" (UID: \"cd3ff4b6-bb9e-446c-9221-f80d9ec5a820\") " pod="openstack/kube-state-metrics-0" Mar 17 00:42:55 crc kubenswrapper[4755]: I0317 00:42:55.789817 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.337782 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-2vdc8"] Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.339122 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-2vdc8" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.342351 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.342536 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-gfkm7" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.364647 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-2vdc8"] Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.428334 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h5tg\" (UniqueName: \"kubernetes.io/projected/96eaea54-65d7-475c-8d91-45ba95bd547a-kube-api-access-7h5tg\") pod \"observability-ui-dashboards-66cbf594b5-2vdc8\" (UID: \"96eaea54-65d7-475c-8d91-45ba95bd547a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-2vdc8" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.428488 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96eaea54-65d7-475c-8d91-45ba95bd547a-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-2vdc8\" (UID: \"96eaea54-65d7-475c-8d91-45ba95bd547a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-2vdc8" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 
00:42:56.529411 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96eaea54-65d7-475c-8d91-45ba95bd547a-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-2vdc8\" (UID: \"96eaea54-65d7-475c-8d91-45ba95bd547a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-2vdc8" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.529883 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h5tg\" (UniqueName: \"kubernetes.io/projected/96eaea54-65d7-475c-8d91-45ba95bd547a-kube-api-access-7h5tg\") pod \"observability-ui-dashboards-66cbf594b5-2vdc8\" (UID: \"96eaea54-65d7-475c-8d91-45ba95bd547a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-2vdc8" Mar 17 00:42:56 crc kubenswrapper[4755]: E0317 00:42:56.529602 4755 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Mar 17 00:42:56 crc kubenswrapper[4755]: E0317 00:42:56.530006 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96eaea54-65d7-475c-8d91-45ba95bd547a-serving-cert podName:96eaea54-65d7-475c-8d91-45ba95bd547a nodeName:}" failed. No retries permitted until 2026-03-17 00:42:57.029984511 +0000 UTC m=+1251.789436904 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/96eaea54-65d7-475c-8d91-45ba95bd547a-serving-cert") pod "observability-ui-dashboards-66cbf594b5-2vdc8" (UID: "96eaea54-65d7-475c-8d91-45ba95bd547a") : secret "observability-ui-dashboards" not found Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.563986 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h5tg\" (UniqueName: \"kubernetes.io/projected/96eaea54-65d7-475c-8d91-45ba95bd547a-kube-api-access-7h5tg\") pod \"observability-ui-dashboards-66cbf594b5-2vdc8\" (UID: \"96eaea54-65d7-475c-8d91-45ba95bd547a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-2vdc8" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.678051 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5674c8b58d-5n224"] Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.680739 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.730508 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5674c8b58d-5n224"] Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.733907 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c977460-63da-4cbb-a903-233e414f6bde-console-oauth-config\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.733948 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c977460-63da-4cbb-a903-233e414f6bde-trusted-ca-bundle\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.733988 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv4vw\" (UniqueName: \"kubernetes.io/projected/8c977460-63da-4cbb-a903-233e414f6bde-kube-api-access-tv4vw\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.734026 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c977460-63da-4cbb-a903-233e414f6bde-console-config\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.734111 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c977460-63da-4cbb-a903-233e414f6bde-console-serving-cert\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.734132 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c977460-63da-4cbb-a903-233e414f6bde-service-ca\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.734172 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c977460-63da-4cbb-a903-233e414f6bde-oauth-serving-cert\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.810327 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.816114 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.819270 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.819319 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.820483 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-kdfvt" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.820605 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.820694 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.820735 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.826544 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.827042 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.830393 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.837238 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c977460-63da-4cbb-a903-233e414f6bde-console-serving-cert\") pod 
\"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.837292 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c977460-63da-4cbb-a903-233e414f6bde-service-ca\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.837355 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c977460-63da-4cbb-a903-233e414f6bde-oauth-serving-cert\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.837409 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c977460-63da-4cbb-a903-233e414f6bde-console-oauth-config\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.837430 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c977460-63da-4cbb-a903-233e414f6bde-trusted-ca-bundle\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.837517 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv4vw\" (UniqueName: \"kubernetes.io/projected/8c977460-63da-4cbb-a903-233e414f6bde-kube-api-access-tv4vw\") pod 
\"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.837568 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c977460-63da-4cbb-a903-233e414f6bde-console-config\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.838905 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c977460-63da-4cbb-a903-233e414f6bde-oauth-serving-cert\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.839384 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c977460-63da-4cbb-a903-233e414f6bde-service-ca\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.839629 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c977460-63da-4cbb-a903-233e414f6bde-trusted-ca-bundle\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.853287 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c977460-63da-4cbb-a903-233e414f6bde-console-serving-cert\") pod \"console-5674c8b58d-5n224\" (UID: 
\"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.855248 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c977460-63da-4cbb-a903-233e414f6bde-console-oauth-config\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.855629 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c977460-63da-4cbb-a903-233e414f6bde-console-config\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.870574 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv4vw\" (UniqueName: \"kubernetes.io/projected/8c977460-63da-4cbb-a903-233e414f6bde-kube-api-access-tv4vw\") pod \"console-5674c8b58d-5n224\" (UID: \"8c977460-63da-4cbb-a903-233e414f6bde\") " pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.939732 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.939783 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e6625cad-73d0-4753-8a77-4d47344b7fad-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.939825 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.939927 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.939959 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e6625cad-73d0-4753-8a77-4d47344b7fad-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.939976 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.940023 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.940059 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.940099 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg5vc\" (UniqueName: \"kubernetes.io/projected/e6625cad-73d0-4753-8a77-4d47344b7fad-kube-api-access-gg5vc\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:56 crc kubenswrapper[4755]: I0317 00:42:56.940117 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-config\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.026825 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.041999 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.042072 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96eaea54-65d7-475c-8d91-45ba95bd547a-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-2vdc8\" (UID: \"96eaea54-65d7-475c-8d91-45ba95bd547a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-2vdc8" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.042096 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.042126 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e6625cad-73d0-4753-8a77-4d47344b7fad-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.042144 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-web-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.042177 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.042212 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.042239 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg5vc\" (UniqueName: \"kubernetes.io/projected/e6625cad-73d0-4753-8a77-4d47344b7fad-kube-api-access-gg5vc\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.042257 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-config\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.042290 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.042307 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e6625cad-73d0-4753-8a77-4d47344b7fad-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.042370 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.043224 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.046238 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.047420 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.050406 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e6625cad-73d0-4753-8a77-4d47344b7fad-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.050728 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-config\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.051106 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e6625cad-73d0-4753-8a77-4d47344b7fad-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.057227 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.057663 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.058041 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96eaea54-65d7-475c-8d91-45ba95bd547a-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-2vdc8\" (UID: \"96eaea54-65d7-475c-8d91-45ba95bd547a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-2vdc8" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.072833 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg5vc\" (UniqueName: \"kubernetes.io/projected/e6625cad-73d0-4753-8a77-4d47344b7fad-kube-api-access-gg5vc\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.076628 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.145616 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 17 00:42:57 crc kubenswrapper[4755]: I0317 00:42:57.267377 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-2vdc8" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.533643 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dvvpc"] Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.534889 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.536857 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rsjft" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.537127 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.537305 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.546313 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-bdvbb"] Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.548417 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.560908 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dvvpc"] Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.578475 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bdvbb"] Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.664720 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.664794 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.664851 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.665601 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a1229170ff8c8c816ffd37af35946e3078d6ce31d139ae04a790479f60fedd5"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.665663 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" 
podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://4a1229170ff8c8c816ffd37af35946e3078d6ce31d139ae04a790479f60fedd5" gracePeriod=600 Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.681174 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-ovn-controller-tls-certs\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.681220 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-var-run\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.681247 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6baf9f03-ea25-4498-9999-2ae741ba0b3a-var-log\") pod \"ovn-controller-ovs-bdvbb\" (UID: \"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.681287 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-var-run-ovn\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.681308 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6baf9f03-ea25-4498-9999-2ae741ba0b3a-scripts\") pod \"ovn-controller-ovs-bdvbb\" (UID: \"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.681339 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-scripts\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.681370 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27rw9\" (UniqueName: \"kubernetes.io/projected/6baf9f03-ea25-4498-9999-2ae741ba0b3a-kube-api-access-27rw9\") pod \"ovn-controller-ovs-bdvbb\" (UID: \"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.681391 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6baf9f03-ea25-4498-9999-2ae741ba0b3a-var-run\") pod \"ovn-controller-ovs-bdvbb\" (UID: \"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.681429 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-var-log-ovn\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.681499 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cddg9\" (UniqueName: 
\"kubernetes.io/projected/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-kube-api-access-cddg9\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.681516 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6baf9f03-ea25-4498-9999-2ae741ba0b3a-etc-ovs\") pod \"ovn-controller-ovs-bdvbb\" (UID: \"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.681579 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-combined-ca-bundle\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.681605 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6baf9f03-ea25-4498-9999-2ae741ba0b3a-var-lib\") pod \"ovn-controller-ovs-bdvbb\" (UID: \"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.783258 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-ovn-controller-tls-certs\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.783307 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-var-run\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.783333 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6baf9f03-ea25-4498-9999-2ae741ba0b3a-var-log\") pod \"ovn-controller-ovs-bdvbb\" (UID: \"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.783357 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-var-run-ovn\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.783378 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6baf9f03-ea25-4498-9999-2ae741ba0b3a-scripts\") pod \"ovn-controller-ovs-bdvbb\" (UID: \"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.783401 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-scripts\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.783426 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27rw9\" (UniqueName: \"kubernetes.io/projected/6baf9f03-ea25-4498-9999-2ae741ba0b3a-kube-api-access-27rw9\") pod \"ovn-controller-ovs-bdvbb\" (UID: 
\"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.783458 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6baf9f03-ea25-4498-9999-2ae741ba0b3a-var-run\") pod \"ovn-controller-ovs-bdvbb\" (UID: \"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.783481 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-var-log-ovn\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.783509 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cddg9\" (UniqueName: \"kubernetes.io/projected/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-kube-api-access-cddg9\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.783523 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6baf9f03-ea25-4498-9999-2ae741ba0b3a-etc-ovs\") pod \"ovn-controller-ovs-bdvbb\" (UID: \"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.783563 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-combined-ca-bundle\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 
00:42:58.783583 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6baf9f03-ea25-4498-9999-2ae741ba0b3a-var-lib\") pod \"ovn-controller-ovs-bdvbb\" (UID: \"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.783816 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6baf9f03-ea25-4498-9999-2ae741ba0b3a-var-log\") pod \"ovn-controller-ovs-bdvbb\" (UID: \"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.783922 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-var-run-ovn\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.784077 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6baf9f03-ea25-4498-9999-2ae741ba0b3a-etc-ovs\") pod \"ovn-controller-ovs-bdvbb\" (UID: \"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.784087 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-var-log-ovn\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.784271 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6baf9f03-ea25-4498-9999-2ae741ba0b3a-var-lib\") pod 
\"ovn-controller-ovs-bdvbb\" (UID: \"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.785668 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6baf9f03-ea25-4498-9999-2ae741ba0b3a-var-run\") pod \"ovn-controller-ovs-bdvbb\" (UID: \"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.785735 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-var-run\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.785933 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6baf9f03-ea25-4498-9999-2ae741ba0b3a-scripts\") pod \"ovn-controller-ovs-bdvbb\" (UID: \"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.786975 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-scripts\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.787764 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-combined-ca-bundle\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.788593 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-ovn-controller-tls-certs\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.799191 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27rw9\" (UniqueName: \"kubernetes.io/projected/6baf9f03-ea25-4498-9999-2ae741ba0b3a-kube-api-access-27rw9\") pod \"ovn-controller-ovs-bdvbb\" (UID: \"6baf9f03-ea25-4498-9999-2ae741ba0b3a\") " pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.805185 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cddg9\" (UniqueName: \"kubernetes.io/projected/a6eae7bd-5007-4389-b4ab-7f296d0fa9ce-kube-api-access-cddg9\") pod \"ovn-controller-dvvpc\" (UID: \"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce\") " pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.874815 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dvvpc" Mar 17 00:42:58 crc kubenswrapper[4755]: I0317 00:42:58.882346 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.170818 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.172900 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.175669 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.176148 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.176649 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.176785 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-np599" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.177002 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.181267 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.309136 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/54c4fe64-c4f8-4e77-9029-946580816bf7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.309203 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c4fe64-c4f8-4e77-9029-946580816bf7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.309917 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c4fe64-c4f8-4e77-9029-946580816bf7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.310027 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54c4fe64-c4f8-4e77-9029-946580816bf7-config\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.310253 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.310303 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c4fe64-c4f8-4e77-9029-946580816bf7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.310355 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c4fe64-c4f8-4e77-9029-946580816bf7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.310582 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxxkz\" (UniqueName: 
\"kubernetes.io/projected/54c4fe64-c4f8-4e77-9029-946580816bf7-kube-api-access-vxxkz\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.412032 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c4fe64-c4f8-4e77-9029-946580816bf7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.412125 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxxkz\" (UniqueName: \"kubernetes.io/projected/54c4fe64-c4f8-4e77-9029-946580816bf7-kube-api-access-vxxkz\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.412199 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/54c4fe64-c4f8-4e77-9029-946580816bf7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.412220 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c4fe64-c4f8-4e77-9029-946580816bf7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.412236 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c4fe64-c4f8-4e77-9029-946580816bf7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.412277 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54c4fe64-c4f8-4e77-9029-946580816bf7-config\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.412305 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.412321 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c4fe64-c4f8-4e77-9029-946580816bf7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.412905 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.412980 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/54c4fe64-c4f8-4e77-9029-946580816bf7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.413492 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/54c4fe64-c4f8-4e77-9029-946580816bf7-config\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.413524 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c4fe64-c4f8-4e77-9029-946580816bf7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.422303 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c4fe64-c4f8-4e77-9029-946580816bf7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.422647 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c4fe64-c4f8-4e77-9029-946580816bf7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.424486 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c4fe64-c4f8-4e77-9029-946580816bf7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.435070 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxxkz\" (UniqueName: \"kubernetes.io/projected/54c4fe64-c4f8-4e77-9029-946580816bf7-kube-api-access-vxxkz\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " 
pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.437641 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"54c4fe64-c4f8-4e77-9029-946580816bf7\") " pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.486336 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.871020 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="4a1229170ff8c8c816ffd37af35946e3078d6ce31d139ae04a790479f60fedd5" exitCode=0 Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.871063 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"4a1229170ff8c8c816ffd37af35946e3078d6ce31d139ae04a790479f60fedd5"} Mar 17 00:42:59 crc kubenswrapper[4755]: I0317 00:42:59.871093 4755 scope.go:117] "RemoveContainer" containerID="68018dade804aadf96db21752f85fdf1b74e75774cca2b6cfb117db003750ae0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.719456 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.721246 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.724601 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.724675 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.725305 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.725814 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-fp664" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.736208 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.874672 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxqfh\" (UniqueName: \"kubernetes.io/projected/d6e8532d-a845-4882-a690-09c072e39311-kube-api-access-pxqfh\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.875052 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d6e8532d-a845-4882-a690-09c072e39311-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.875077 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e8532d-a845-4882-a690-09c072e39311-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.875283 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6e8532d-a845-4882-a690-09c072e39311-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.875392 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.875471 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6e8532d-a845-4882-a690-09c072e39311-config\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.875539 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6e8532d-a845-4882-a690-09c072e39311-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.875668 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6e8532d-a845-4882-a690-09c072e39311-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc 
kubenswrapper[4755]: I0317 00:43:02.977720 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6e8532d-a845-4882-a690-09c072e39311-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.977810 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.977847 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6e8532d-a845-4882-a690-09c072e39311-config\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.977883 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6e8532d-a845-4882-a690-09c072e39311-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.977923 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6e8532d-a845-4882-a690-09c072e39311-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.977950 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxqfh\" (UniqueName: 
\"kubernetes.io/projected/d6e8532d-a845-4882-a690-09c072e39311-kube-api-access-pxqfh\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.977991 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d6e8532d-a845-4882-a690-09c072e39311-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.978010 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e8532d-a845-4882-a690-09c072e39311-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.978208 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.979242 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6e8532d-a845-4882-a690-09c072e39311-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.979616 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d6e8532d-a845-4882-a690-09c072e39311-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " 
pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.979676 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6e8532d-a845-4882-a690-09c072e39311-config\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.983814 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e8532d-a845-4882-a690-09c072e39311-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.985197 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6e8532d-a845-4882-a690-09c072e39311-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:02 crc kubenswrapper[4755]: I0317 00:43:02.991661 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6e8532d-a845-4882-a690-09c072e39311-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:03 crc kubenswrapper[4755]: I0317 00:43:02.998941 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxqfh\" (UniqueName: \"kubernetes.io/projected/d6e8532d-a845-4882-a690-09c072e39311-kube-api-access-pxqfh\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:03 crc kubenswrapper[4755]: I0317 00:43:03.001258 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d6e8532d-a845-4882-a690-09c072e39311\") " pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:03 crc kubenswrapper[4755]: I0317 00:43:03.087875 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:04 crc kubenswrapper[4755]: I0317 00:43:04.355802 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 00:43:05 crc kubenswrapper[4755]: E0317 00:43:05.700356 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 17 00:43:05 crc kubenswrapper[4755]: E0317 00:43:05.701043 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmw7v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-lggjd_openstack(d79db2fa-5f14-408a-8823-db5237d68bd0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 00:43:05 crc kubenswrapper[4755]: E0317 00:43:05.702583 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-lggjd" podUID="d79db2fa-5f14-408a-8823-db5237d68bd0" Mar 17 00:43:05 crc kubenswrapper[4755]: W0317 00:43:05.727341 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod890b1d99_1a82_424e_981b_5c8ea1ae26ee.slice/crio-7efff3d296e109a619ecefc647a2b5bc977f4f5dba6b780965f75378e5de7816 WatchSource:0}: Error finding container 7efff3d296e109a619ecefc647a2b5bc977f4f5dba6b780965f75378e5de7816: Status 404 returned error can't find the container with id 7efff3d296e109a619ecefc647a2b5bc977f4f5dba6b780965f75378e5de7816 Mar 17 00:43:05 crc kubenswrapper[4755]: E0317 00:43:05.766574 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 17 00:43:05 crc kubenswrapper[4755]: E0317 00:43:05.766850 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h646z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-sbrv2_openstack(7ddc3f28-11c2-43c1-8b0d-1db18fadcde6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 00:43:05 crc kubenswrapper[4755]: E0317 00:43:05.768496 4755 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-sbrv2" podUID="7ddc3f28-11c2-43c1-8b0d-1db18fadcde6" Mar 17 00:43:05 crc kubenswrapper[4755]: I0317 00:43:05.941648 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"890b1d99-1a82-424e-981b-5c8ea1ae26ee","Type":"ContainerStarted","Data":"7efff3d296e109a619ecefc647a2b5bc977f4f5dba6b780965f75378e5de7816"} Mar 17 00:43:06 crc kubenswrapper[4755]: I0317 00:43:06.505948 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 17 00:43:06 crc kubenswrapper[4755]: W0317 00:43:06.564350 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf824bbf_6a94_4505_a9cb_67e9e394f2e1.slice/crio-3fd8d366dceea589605758ba9e6c5e17c56dad5931cb00fedc7e4458e8f7f231 WatchSource:0}: Error finding container 3fd8d366dceea589605758ba9e6c5e17c56dad5931cb00fedc7e4458e8f7f231: Status 404 returned error can't find the container with id 3fd8d366dceea589605758ba9e6c5e17c56dad5931cb00fedc7e4458e8f7f231 Mar 17 00:43:06 crc kubenswrapper[4755]: I0317 00:43:06.860635 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 17 00:43:06 crc kubenswrapper[4755]: I0317 00:43:06.871574 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 00:43:06 crc kubenswrapper[4755]: I0317 00:43:06.882073 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 17 00:43:06 crc kubenswrapper[4755]: I0317 00:43:06.899468 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 17 00:43:06 crc kubenswrapper[4755]: W0317 00:43:06.957687 4755 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd3ff4b6_bb9e_446c_9221_f80d9ec5a820.slice/crio-3afa8b0e1e0de7f9825062f6929f3d4ddccef531465d134f26ca1e57a7a80516 WatchSource:0}: Error finding container 3afa8b0e1e0de7f9825062f6929f3d4ddccef531465d134f26ca1e57a7a80516: Status 404 returned error can't find the container with id 3afa8b0e1e0de7f9825062f6929f3d4ddccef531465d134f26ca1e57a7a80516 Mar 17 00:43:06 crc kubenswrapper[4755]: W0317 00:43:06.961609 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfea0511_e194_48c8_8795_58d07ada5d4c.slice/crio-d39460049af4802a7067bdca38bc23422dfc1b3afb0cc08fe1888a27917b136e WatchSource:0}: Error finding container d39460049af4802a7067bdca38bc23422dfc1b3afb0cc08fe1888a27917b136e: Status 404 returned error can't find the container with id d39460049af4802a7067bdca38bc23422dfc1b3afb0cc08fe1888a27917b136e Mar 17 00:43:06 crc kubenswrapper[4755]: W0317 00:43:06.968644 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode48be2ab_6e3e_4a75_b47e_e700bd4126f1.slice/crio-010c7824aacd06100aa4ae83269ccafb9066fc395ce3cb53aae8536d6fb6c003 WatchSource:0}: Error finding container 010c7824aacd06100aa4ae83269ccafb9066fc395ce3cb53aae8536d6fb6c003: Status 404 returned error can't find the container with id 010c7824aacd06100aa4ae83269ccafb9066fc395ce3cb53aae8536d6fb6c003 Mar 17 00:43:06 crc kubenswrapper[4755]: I0317 00:43:06.998150 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sbrv2" event={"ID":"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6","Type":"ContainerDied","Data":"b2fb95a4b0029dcc831af1a7b778b69e6918c455e37cd179848e2fefcb966357"} Mar 17 00:43:06 crc kubenswrapper[4755]: I0317 00:43:06.998191 4755 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="b2fb95a4b0029dcc831af1a7b778b69e6918c455e37cd179848e2fefcb966357" Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.001077 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cd3ff4b6-bb9e-446c-9221-f80d9ec5a820","Type":"ContainerStarted","Data":"3afa8b0e1e0de7f9825062f6929f3d4ddccef531465d134f26ca1e57a7a80516"} Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.003228 4755 generic.go:334] "Generic (PLEG): container finished" podID="6bd6657d-d56c-45e7-92cb-ece9a3890c85" containerID="612f7455ff7578d4d95399b400bf7c96259201c7bdbadb678988797a29a278bf" exitCode=0 Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.003277 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" event={"ID":"6bd6657d-d56c-45e7-92cb-ece9a3890c85","Type":"ContainerDied","Data":"612f7455ff7578d4d95399b400bf7c96259201c7bdbadb678988797a29a278bf"} Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.005682 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"dfea0511-e194-48c8-8795-58d07ada5d4c","Type":"ContainerStarted","Data":"d39460049af4802a7067bdca38bc23422dfc1b3afb0cc08fe1888a27917b136e"} Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.008351 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"38fb594cd84460a45d3465d21f6d2658b58fe7d697877c14788a3d78ce3aa72f"} Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.012405 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e48be2ab-6e3e-4a75-b47e-e700bd4126f1","Type":"ContainerStarted","Data":"010c7824aacd06100aa4ae83269ccafb9066fc395ce3cb53aae8536d6fb6c003"} Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.014199 4755 generic.go:334] 
"Generic (PLEG): container finished" podID="6d55848a-c15d-4ed3-899b-bfcbb45f13ff" containerID="c2ddae9046f6a980f7216acbb3c9c0eaac0b1d3202ca84e39004761fce6b9145" exitCode=0 Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.014267 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" event={"ID":"6d55848a-c15d-4ed3-899b-bfcbb45f13ff","Type":"ContainerDied","Data":"c2ddae9046f6a980f7216acbb3c9c0eaac0b1d3202ca84e39004761fce6b9145"} Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.016047 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6625cad-73d0-4753-8a77-4d47344b7fad","Type":"ContainerStarted","Data":"8b372c42c9e3c0d14bd0e7394e6abea5ec5c989461adbfcfb98cdde4104db347"} Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.020879 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cf824bbf-6a94-4505-a9cb-67e9e394f2e1","Type":"ContainerStarted","Data":"3fd8d366dceea589605758ba9e6c5e17c56dad5931cb00fedc7e4458e8f7f231"} Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.030149 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-lggjd" event={"ID":"d79db2fa-5f14-408a-8823-db5237d68bd0","Type":"ContainerDied","Data":"30f2d6dcea56d8d79aa49f85e78d3d83f34fb9efc66b59cf41fac46c545bd090"} Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.030208 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30f2d6dcea56d8d79aa49f85e78d3d83f34fb9efc66b59cf41fac46c545bd090" Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.120895 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5674c8b58d-5n224"] Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.127799 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-2vdc8"] Mar 17 00:43:07 
crc kubenswrapper[4755]: I0317 00:43:07.177686 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dvvpc"] Mar 17 00:43:07 crc kubenswrapper[4755]: W0317 00:43:07.193329 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c977460_63da_4cbb_a903_233e414f6bde.slice/crio-701c32c39a58df9df5bf36c1967b39eda355bb8151f0388defc245b017609395 WatchSource:0}: Error finding container 701c32c39a58df9df5bf36c1967b39eda355bb8151f0388defc245b017609395: Status 404 returned error can't find the container with id 701c32c39a58df9df5bf36c1967b39eda355bb8151f0388defc245b017609395 Mar 17 00:43:07 crc kubenswrapper[4755]: W0317 00:43:07.194069 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6eae7bd_5007_4389_b4ab_7f296d0fa9ce.slice/crio-8cde2e552dc6c72a59c17851537592cecda86251c674b304e4a50010f46b5bd0 WatchSource:0}: Error finding container 8cde2e552dc6c72a59c17851537592cecda86251c674b304e4a50010f46b5bd0: Status 404 returned error can't find the container with id 8cde2e552dc6c72a59c17851537592cecda86251c674b304e4a50010f46b5bd0 Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.384230 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 17 00:43:07 crc kubenswrapper[4755]: W0317 00:43:07.393403 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54c4fe64_c4f8_4e77_9029_946580816bf7.slice/crio-773fb2d8fefc97997bc5d04c56f77cace30ef6200b9735ed3e287ba8662d041a WatchSource:0}: Error finding container 773fb2d8fefc97997bc5d04c56f77cace30ef6200b9735ed3e287ba8662d041a: Status 404 returned error can't find the container with id 773fb2d8fefc97997bc5d04c56f77cace30ef6200b9735ed3e287ba8662d041a Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.876146 4755 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lggjd" Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.903445 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sbrv2" Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.989730 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmw7v\" (UniqueName: \"kubernetes.io/projected/d79db2fa-5f14-408a-8823-db5237d68bd0-kube-api-access-tmw7v\") pod \"d79db2fa-5f14-408a-8823-db5237d68bd0\" (UID: \"d79db2fa-5f14-408a-8823-db5237d68bd0\") " Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.989929 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d79db2fa-5f14-408a-8823-db5237d68bd0-config\") pod \"d79db2fa-5f14-408a-8823-db5237d68bd0\" (UID: \"d79db2fa-5f14-408a-8823-db5237d68bd0\") " Mar 17 00:43:07 crc kubenswrapper[4755]: I0317 00:43:07.992109 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d79db2fa-5f14-408a-8823-db5237d68bd0-config" (OuterVolumeSpecName: "config") pod "d79db2fa-5f14-408a-8823-db5237d68bd0" (UID: "d79db2fa-5f14-408a-8823-db5237d68bd0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.017792 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d79db2fa-5f14-408a-8823-db5237d68bd0-kube-api-access-tmw7v" (OuterVolumeSpecName: "kube-api-access-tmw7v") pod "d79db2fa-5f14-408a-8823-db5237d68bd0" (UID: "d79db2fa-5f14-408a-8823-db5237d68bd0"). InnerVolumeSpecName "kube-api-access-tmw7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.039365 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dvvpc" event={"ID":"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce","Type":"ContainerStarted","Data":"8cde2e552dc6c72a59c17851537592cecda86251c674b304e4a50010f46b5bd0"} Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.047615 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" event={"ID":"6d55848a-c15d-4ed3-899b-bfcbb45f13ff","Type":"ContainerStarted","Data":"bba9337a0a86997f5b8df36258f89b93c8084d68107a2458fb0b8a3c78015f50"} Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.047743 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.050663 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-2vdc8" event={"ID":"96eaea54-65d7-475c-8d91-45ba95bd547a","Type":"ContainerStarted","Data":"923b824bfa6be6478e8b10a6daf77ee58cfe441bbb259664146b0b633b6d71d5"} Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.053538 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a","Type":"ContainerStarted","Data":"e8f25e9b1decd539f90990899b33568e0a1f74100a38ffc2207ef6a48f57530f"} Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.056969 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" event={"ID":"6bd6657d-d56c-45e7-92cb-ece9a3890c85","Type":"ContainerStarted","Data":"970b3add88dfdbda14f56acc6a91e5fbe722341317eefb706a1f9dc9547c5387"} Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.058265 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" 
Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.060184 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"54c4fe64-c4f8-4e77-9029-946580816bf7","Type":"ContainerStarted","Data":"773fb2d8fefc97997bc5d04c56f77cace30ef6200b9735ed3e287ba8662d041a"} Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.065021 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lggjd" Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.066331 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5674c8b58d-5n224" event={"ID":"8c977460-63da-4cbb-a903-233e414f6bde","Type":"ContainerStarted","Data":"28cf9944113462d992c8f7d2f58902b3223b2cf546e29f299b4be9e525b79cd1"} Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.066405 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5674c8b58d-5n224" event={"ID":"8c977460-63da-4cbb-a903-233e414f6bde","Type":"ContainerStarted","Data":"701c32c39a58df9df5bf36c1967b39eda355bb8151f0388defc245b017609395"} Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.066594 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sbrv2" Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.083974 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" podStartSLOduration=3.208580204 podStartE2EDuration="19.083950144s" podCreationTimestamp="2026-03-17 00:42:49 +0000 UTC" firstStartedPulling="2026-03-17 00:42:50.053576316 +0000 UTC m=+1244.813028599" lastFinishedPulling="2026-03-17 00:43:05.928946246 +0000 UTC m=+1260.688398539" observedRunningTime="2026-03-17 00:43:08.066765383 +0000 UTC m=+1262.826217666" watchObservedRunningTime="2026-03-17 00:43:08.083950144 +0000 UTC m=+1262.843402427" Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.091867 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-config\") pod \"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6\" (UID: \"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6\") " Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.091914 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h646z\" (UniqueName: \"kubernetes.io/projected/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-kube-api-access-h646z\") pod \"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6\" (UID: \"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6\") " Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.091977 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-dns-svc\") pod \"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6\" (UID: \"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6\") " Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.096889 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-config" (OuterVolumeSpecName: "config") pod 
"7ddc3f28-11c2-43c1-8b0d-1db18fadcde6" (UID: "7ddc3f28-11c2-43c1-8b0d-1db18fadcde6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.096947 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmw7v\" (UniqueName: \"kubernetes.io/projected/d79db2fa-5f14-408a-8823-db5237d68bd0-kube-api-access-tmw7v\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.096961 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d79db2fa-5f14-408a-8823-db5237d68bd0-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.113272 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-kube-api-access-h646z" (OuterVolumeSpecName: "kube-api-access-h646z") pod "7ddc3f28-11c2-43c1-8b0d-1db18fadcde6" (UID: "7ddc3f28-11c2-43c1-8b0d-1db18fadcde6"). InnerVolumeSpecName "kube-api-access-h646z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.113805 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7ddc3f28-11c2-43c1-8b0d-1db18fadcde6" (UID: "7ddc3f28-11c2-43c1-8b0d-1db18fadcde6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.131272 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bdvbb"] Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.135949 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5674c8b58d-5n224" podStartSLOduration=12.135931079 podStartE2EDuration="12.135931079s" podCreationTimestamp="2026-03-17 00:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:43:08.098043174 +0000 UTC m=+1262.857495477" watchObservedRunningTime="2026-03-17 00:43:08.135931079 +0000 UTC m=+1262.895383362" Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.156960 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" podStartSLOduration=3.139566142 podStartE2EDuration="19.156941801s" podCreationTimestamp="2026-03-17 00:42:49 +0000 UTC" firstStartedPulling="2026-03-17 00:42:49.91018007 +0000 UTC m=+1244.669632353" lastFinishedPulling="2026-03-17 00:43:05.927555719 +0000 UTC m=+1260.687008012" observedRunningTime="2026-03-17 00:43:08.121303055 +0000 UTC m=+1262.880755338" watchObservedRunningTime="2026-03-17 00:43:08.156941801 +0000 UTC m=+1262.916394084" Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.199760 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.199787 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h646z\" (UniqueName: \"kubernetes.io/projected/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-kube-api-access-h646z\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 
00:43:08.199838 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.219896 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lggjd"] Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.231913 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lggjd"] Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.258297 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d79db2fa-5f14-408a-8823-db5237d68bd0" path="/var/lib/kubelet/pods/d79db2fa-5f14-408a-8823-db5237d68bd0/volumes" Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.414272 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sbrv2"] Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.421254 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sbrv2"] Mar 17 00:43:08 crc kubenswrapper[4755]: I0317 00:43:08.467831 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 17 00:43:09 crc kubenswrapper[4755]: W0317 00:43:09.041848 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6e8532d_a845_4882_a690_09c072e39311.slice/crio-99aa6fcc3cbb065ae250ae14403a7f0bfe12caf4f866d3bf81a4fc35178ddfd5 WatchSource:0}: Error finding container 99aa6fcc3cbb065ae250ae14403a7f0bfe12caf4f866d3bf81a4fc35178ddfd5: Status 404 returned error can't find the container with id 99aa6fcc3cbb065ae250ae14403a7f0bfe12caf4f866d3bf81a4fc35178ddfd5 Mar 17 00:43:09 crc kubenswrapper[4755]: I0317 00:43:09.090381 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bdvbb" 
event={"ID":"6baf9f03-ea25-4498-9999-2ae741ba0b3a","Type":"ContainerStarted","Data":"458d412fec69ed6d3f62dae939def9ba03c764f791dac93513435d7012cc4901"} Mar 17 00:43:09 crc kubenswrapper[4755]: I0317 00:43:09.091991 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d6e8532d-a845-4882-a690-09c072e39311","Type":"ContainerStarted","Data":"99aa6fcc3cbb065ae250ae14403a7f0bfe12caf4f866d3bf81a4fc35178ddfd5"} Mar 17 00:43:09 crc kubenswrapper[4755]: I0317 00:43:09.094676 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"890b1d99-1a82-424e-981b-5c8ea1ae26ee","Type":"ContainerStarted","Data":"fb6d6e35774e45425e28ec26640d32c6a009930eb0767171cf6f323aec4e4750"} Mar 17 00:43:10 crc kubenswrapper[4755]: I0317 00:43:10.258677 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ddc3f28-11c2-43c1-8b0d-1db18fadcde6" path="/var/lib/kubelet/pods/7ddc3f28-11c2-43c1-8b0d-1db18fadcde6/volumes" Mar 17 00:43:14 crc kubenswrapper[4755]: I0317 00:43:14.381585 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" Mar 17 00:43:14 crc kubenswrapper[4755]: I0317 00:43:14.626602 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" Mar 17 00:43:14 crc kubenswrapper[4755]: I0317 00:43:14.680547 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h5l8n"] Mar 17 00:43:15 crc kubenswrapper[4755]: I0317 00:43:15.170294 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" podUID="6bd6657d-d56c-45e7-92cb-ece9a3890c85" containerName="dnsmasq-dns" containerID="cri-o://970b3add88dfdbda14f56acc6a91e5fbe722341317eefb706a1f9dc9547c5387" gracePeriod=10 Mar 17 00:43:16 crc kubenswrapper[4755]: I0317 00:43:16.184042 4755 generic.go:334] "Generic (PLEG): 
container finished" podID="6bd6657d-d56c-45e7-92cb-ece9a3890c85" containerID="970b3add88dfdbda14f56acc6a91e5fbe722341317eefb706a1f9dc9547c5387" exitCode=0 Mar 17 00:43:16 crc kubenswrapper[4755]: I0317 00:43:16.184168 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" event={"ID":"6bd6657d-d56c-45e7-92cb-ece9a3890c85","Type":"ContainerDied","Data":"970b3add88dfdbda14f56acc6a91e5fbe722341317eefb706a1f9dc9547c5387"} Mar 17 00:43:17 crc kubenswrapper[4755]: I0317 00:43:17.027634 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:43:17 crc kubenswrapper[4755]: I0317 00:43:17.028241 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:43:17 crc kubenswrapper[4755]: I0317 00:43:17.049504 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:43:17 crc kubenswrapper[4755]: I0317 00:43:17.199941 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5674c8b58d-5n224" Mar 17 00:43:17 crc kubenswrapper[4755]: I0317 00:43:17.261758 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cf589d4bf-9httx"] Mar 17 00:43:18 crc kubenswrapper[4755]: I0317 00:43:18.019141 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" Mar 17 00:43:18 crc kubenswrapper[4755]: I0317 00:43:18.109324 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbn8c\" (UniqueName: \"kubernetes.io/projected/6bd6657d-d56c-45e7-92cb-ece9a3890c85-kube-api-access-lbn8c\") pod \"6bd6657d-d56c-45e7-92cb-ece9a3890c85\" (UID: \"6bd6657d-d56c-45e7-92cb-ece9a3890c85\") " Mar 17 00:43:18 crc kubenswrapper[4755]: I0317 00:43:18.109487 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bd6657d-d56c-45e7-92cb-ece9a3890c85-dns-svc\") pod \"6bd6657d-d56c-45e7-92cb-ece9a3890c85\" (UID: \"6bd6657d-d56c-45e7-92cb-ece9a3890c85\") " Mar 17 00:43:18 crc kubenswrapper[4755]: I0317 00:43:18.109570 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd6657d-d56c-45e7-92cb-ece9a3890c85-config\") pod \"6bd6657d-d56c-45e7-92cb-ece9a3890c85\" (UID: \"6bd6657d-d56c-45e7-92cb-ece9a3890c85\") " Mar 17 00:43:18 crc kubenswrapper[4755]: I0317 00:43:18.112603 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bd6657d-d56c-45e7-92cb-ece9a3890c85-kube-api-access-lbn8c" (OuterVolumeSpecName: "kube-api-access-lbn8c") pod "6bd6657d-d56c-45e7-92cb-ece9a3890c85" (UID: "6bd6657d-d56c-45e7-92cb-ece9a3890c85"). InnerVolumeSpecName "kube-api-access-lbn8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:43:18 crc kubenswrapper[4755]: I0317 00:43:18.144959 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bd6657d-d56c-45e7-92cb-ece9a3890c85-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6bd6657d-d56c-45e7-92cb-ece9a3890c85" (UID: "6bd6657d-d56c-45e7-92cb-ece9a3890c85"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:18 crc kubenswrapper[4755]: I0317 00:43:18.156353 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bd6657d-d56c-45e7-92cb-ece9a3890c85-config" (OuterVolumeSpecName: "config") pod "6bd6657d-d56c-45e7-92cb-ece9a3890c85" (UID: "6bd6657d-d56c-45e7-92cb-ece9a3890c85"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:18 crc kubenswrapper[4755]: I0317 00:43:18.211256 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbn8c\" (UniqueName: \"kubernetes.io/projected/6bd6657d-d56c-45e7-92cb-ece9a3890c85-kube-api-access-lbn8c\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:18 crc kubenswrapper[4755]: I0317 00:43:18.211291 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bd6657d-d56c-45e7-92cb-ece9a3890c85-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:18 crc kubenswrapper[4755]: I0317 00:43:18.211322 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd6657d-d56c-45e7-92cb-ece9a3890c85-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:18 crc kubenswrapper[4755]: I0317 00:43:18.213582 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" Mar 17 00:43:18 crc kubenswrapper[4755]: I0317 00:43:18.213595 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-h5l8n" event={"ID":"6bd6657d-d56c-45e7-92cb-ece9a3890c85","Type":"ContainerDied","Data":"1f44cbbbcd7631fa47615a3031abceed8d764dc492ff85acd51b023a111b1234"} Mar 17 00:43:18 crc kubenswrapper[4755]: I0317 00:43:18.213684 4755 scope.go:117] "RemoveContainer" containerID="970b3add88dfdbda14f56acc6a91e5fbe722341317eefb706a1f9dc9547c5387" Mar 17 00:43:18 crc kubenswrapper[4755]: I0317 00:43:18.245681 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h5l8n"] Mar 17 00:43:18 crc kubenswrapper[4755]: I0317 00:43:18.265538 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h5l8n"] Mar 17 00:43:18 crc kubenswrapper[4755]: I0317 00:43:18.660007 4755 scope.go:117] "RemoveContainer" containerID="612f7455ff7578d4d95399b400bf7c96259201c7bdbadb678988797a29a278bf" Mar 17 00:43:19 crc kubenswrapper[4755]: I0317 00:43:19.222041 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bdvbb" event={"ID":"6baf9f03-ea25-4498-9999-2ae741ba0b3a","Type":"ContainerStarted","Data":"73ef5489a2fa1b6f9b3598a0606c45ff53ba4eb9e97ae4fe13b39f198dc380d5"} Mar 17 00:43:19 crc kubenswrapper[4755]: I0317 00:43:19.223802 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cd3ff4b6-bb9e-446c-9221-f80d9ec5a820","Type":"ContainerStarted","Data":"350c95c21bdae52d5cd2e115c88bacec66c9dc68f94bad75c046d44a0d4c92bc"} Mar 17 00:43:19 crc kubenswrapper[4755]: I0317 00:43:19.224231 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 17 00:43:19 crc kubenswrapper[4755]: I0317 00:43:19.225670 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-sb-0" event={"ID":"d6e8532d-a845-4882-a690-09c072e39311","Type":"ContainerStarted","Data":"b72ebaea0b160427be5a6e2293e2804da92bfe33c25937f1192957549bed7c54"} Mar 17 00:43:19 crc kubenswrapper[4755]: I0317 00:43:19.226756 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cf824bbf-6a94-4505-a9cb-67e9e394f2e1","Type":"ContainerStarted","Data":"0dbf8bad7a684d837983b89dd5896276fe0ef1c91072832bc30d287d34f8490e"} Mar 17 00:43:19 crc kubenswrapper[4755]: I0317 00:43:19.226822 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 17 00:43:19 crc kubenswrapper[4755]: I0317 00:43:19.227791 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e48be2ab-6e3e-4a75-b47e-e700bd4126f1","Type":"ContainerStarted","Data":"569c481101010ad0798e9bd6deb937b987a9cabdd45285e784be7fbe49daada5"} Mar 17 00:43:19 crc kubenswrapper[4755]: I0317 00:43:19.228745 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-2vdc8" event={"ID":"96eaea54-65d7-475c-8d91-45ba95bd547a","Type":"ContainerStarted","Data":"a453d41fcfcabfa1ec085df8e76fde3168c76121ef936871e2b2abfc47c53ec0"} Mar 17 00:43:19 crc kubenswrapper[4755]: I0317 00:43:19.232565 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"54c4fe64-c4f8-4e77-9029-946580816bf7","Type":"ContainerStarted","Data":"0cfe58963daea8cb6bd4c1c7abe760ca3b5414e5e0fb1d3c047009ae4d1f5579"} Mar 17 00:43:19 crc kubenswrapper[4755]: I0317 00:43:19.233711 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"dfea0511-e194-48c8-8795-58d07ada5d4c","Type":"ContainerStarted","Data":"7c151d708a9bba8c87a9099369994bedd5c424ac7969ecfc32eb4126f42b590a"} Mar 17 00:43:19 crc kubenswrapper[4755]: I0317 00:43:19.312733 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=22.644583849 podStartE2EDuration="26.312716184s" podCreationTimestamp="2026-03-17 00:42:53 +0000 UTC" firstStartedPulling="2026-03-17 00:43:06.568682191 +0000 UTC m=+1261.328134484" lastFinishedPulling="2026-03-17 00:43:10.236814536 +0000 UTC m=+1264.996266819" observedRunningTime="2026-03-17 00:43:19.311206825 +0000 UTC m=+1274.070659108" watchObservedRunningTime="2026-03-17 00:43:19.312716184 +0000 UTC m=+1274.072168467" Mar 17 00:43:19 crc kubenswrapper[4755]: I0317 00:43:19.314548 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-2vdc8" podStartSLOduration=12.599877437 podStartE2EDuration="23.314540872s" podCreationTimestamp="2026-03-17 00:42:56 +0000 UTC" firstStartedPulling="2026-03-17 00:43:07.202987413 +0000 UTC m=+1261.962439686" lastFinishedPulling="2026-03-17 00:43:17.917650838 +0000 UTC m=+1272.677103121" observedRunningTime="2026-03-17 00:43:19.293410657 +0000 UTC m=+1274.052862930" watchObservedRunningTime="2026-03-17 00:43:19.314540872 +0000 UTC m=+1274.073993155" Mar 17 00:43:19 crc kubenswrapper[4755]: I0317 00:43:19.371166 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=12.499190632 podStartE2EDuration="24.371140728s" podCreationTimestamp="2026-03-17 00:42:55 +0000 UTC" firstStartedPulling="2026-03-17 00:43:06.959879966 +0000 UTC m=+1261.719332269" lastFinishedPulling="2026-03-17 00:43:18.831830082 +0000 UTC m=+1273.591282365" observedRunningTime="2026-03-17 00:43:19.347815366 +0000 UTC m=+1274.107267649" watchObservedRunningTime="2026-03-17 00:43:19.371140728 +0000 UTC m=+1274.130593011" Mar 17 00:43:20 crc kubenswrapper[4755]: I0317 00:43:20.244961 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dvvpc" 
event={"ID":"a6eae7bd-5007-4389-b4ab-7f296d0fa9ce","Type":"ContainerStarted","Data":"64408dd325f4f985393030defe58fd4036da6a2f34a93befb17537436e024497"} Mar 17 00:43:20 crc kubenswrapper[4755]: I0317 00:43:20.245343 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-dvvpc" Mar 17 00:43:20 crc kubenswrapper[4755]: I0317 00:43:20.248871 4755 generic.go:334] "Generic (PLEG): container finished" podID="6baf9f03-ea25-4498-9999-2ae741ba0b3a" containerID="73ef5489a2fa1b6f9b3598a0606c45ff53ba4eb9e97ae4fe13b39f198dc380d5" exitCode=0 Mar 17 00:43:20 crc kubenswrapper[4755]: I0317 00:43:20.262756 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dvvpc" podStartSLOduration=11.182693346 podStartE2EDuration="22.262735009s" podCreationTimestamp="2026-03-17 00:42:58 +0000 UTC" firstStartedPulling="2026-03-17 00:43:07.201627447 +0000 UTC m=+1261.961079730" lastFinishedPulling="2026-03-17 00:43:18.28166907 +0000 UTC m=+1273.041121393" observedRunningTime="2026-03-17 00:43:20.261527867 +0000 UTC m=+1275.020980160" watchObservedRunningTime="2026-03-17 00:43:20.262735009 +0000 UTC m=+1275.022187292" Mar 17 00:43:20 crc kubenswrapper[4755]: I0317 00:43:20.268632 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bd6657d-d56c-45e7-92cb-ece9a3890c85" path="/var/lib/kubelet/pods/6bd6657d-d56c-45e7-92cb-ece9a3890c85/volumes" Mar 17 00:43:20 crc kubenswrapper[4755]: I0317 00:43:20.269929 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bdvbb" event={"ID":"6baf9f03-ea25-4498-9999-2ae741ba0b3a","Type":"ContainerDied","Data":"73ef5489a2fa1b6f9b3598a0606c45ff53ba4eb9e97ae4fe13b39f198dc380d5"} Mar 17 00:43:21 crc kubenswrapper[4755]: I0317 00:43:21.260548 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bdvbb" 
event={"ID":"6baf9f03-ea25-4498-9999-2ae741ba0b3a","Type":"ContainerStarted","Data":"76d0e2b1db6f73ebd68c27dec260845ed7d7d5628ef4f8872bdaaf298268a0a1"} Mar 17 00:43:22 crc kubenswrapper[4755]: I0317 00:43:22.271395 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6625cad-73d0-4753-8a77-4d47344b7fad","Type":"ContainerStarted","Data":"4807d127d9a739befc515b172da1f6bb129a30b65aac8982d795fdc0e475d2f2"} Mar 17 00:43:23 crc kubenswrapper[4755]: I0317 00:43:23.283538 4755 generic.go:334] "Generic (PLEG): container finished" podID="e48be2ab-6e3e-4a75-b47e-e700bd4126f1" containerID="569c481101010ad0798e9bd6deb937b987a9cabdd45285e784be7fbe49daada5" exitCode=0 Mar 17 00:43:23 crc kubenswrapper[4755]: I0317 00:43:23.283820 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e48be2ab-6e3e-4a75-b47e-e700bd4126f1","Type":"ContainerDied","Data":"569c481101010ad0798e9bd6deb937b987a9cabdd45285e784be7fbe49daada5"} Mar 17 00:43:23 crc kubenswrapper[4755]: I0317 00:43:23.288188 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bdvbb" event={"ID":"6baf9f03-ea25-4498-9999-2ae741ba0b3a","Type":"ContainerStarted","Data":"4452118c750220ab86562bc09a7ace37166bc41ab1383f814118e0a5881006e4"} Mar 17 00:43:23 crc kubenswrapper[4755]: I0317 00:43:23.288356 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:43:23 crc kubenswrapper[4755]: I0317 00:43:23.293137 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d6e8532d-a845-4882-a690-09c072e39311","Type":"ContainerStarted","Data":"742d83640454b06a894e3e3b40ccb80e43b13bd86da88d4b98cdaa7dcc09ef80"} Mar 17 00:43:23 crc kubenswrapper[4755]: I0317 00:43:23.296464 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"54c4fe64-c4f8-4e77-9029-946580816bf7","Type":"ContainerStarted","Data":"a87de971fe4a2c5c665553f1e8d155c0c3541ff9fd1e273f9a34a7b6a809be93"} Mar 17 00:43:23 crc kubenswrapper[4755]: I0317 00:43:23.300294 4755 generic.go:334] "Generic (PLEG): container finished" podID="dfea0511-e194-48c8-8795-58d07ada5d4c" containerID="7c151d708a9bba8c87a9099369994bedd5c424ac7969ecfc32eb4126f42b590a" exitCode=0 Mar 17 00:43:23 crc kubenswrapper[4755]: I0317 00:43:23.300377 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"dfea0511-e194-48c8-8795-58d07ada5d4c","Type":"ContainerDied","Data":"7c151d708a9bba8c87a9099369994bedd5c424ac7969ecfc32eb4126f42b590a"} Mar 17 00:43:23 crc kubenswrapper[4755]: I0317 00:43:23.342125 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=10.153739367 podStartE2EDuration="25.34210145s" podCreationTimestamp="2026-03-17 00:42:58 +0000 UTC" firstStartedPulling="2026-03-17 00:43:07.397700697 +0000 UTC m=+1262.157152970" lastFinishedPulling="2026-03-17 00:43:22.58606277 +0000 UTC m=+1277.345515053" observedRunningTime="2026-03-17 00:43:23.339929292 +0000 UTC m=+1278.099381605" watchObservedRunningTime="2026-03-17 00:43:23.34210145 +0000 UTC m=+1278.101553743" Mar 17 00:43:23 crc kubenswrapper[4755]: I0317 00:43:23.388627 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-bdvbb" podStartSLOduration=16.154322422 podStartE2EDuration="25.388597521s" podCreationTimestamp="2026-03-17 00:42:58 +0000 UTC" firstStartedPulling="2026-03-17 00:43:09.057664311 +0000 UTC m=+1263.817116594" lastFinishedPulling="2026-03-17 00:43:18.29193938 +0000 UTC m=+1273.051391693" observedRunningTime="2026-03-17 00:43:23.386575627 +0000 UTC m=+1278.146027910" watchObservedRunningTime="2026-03-17 00:43:23.388597521 +0000 UTC m=+1278.148049834" Mar 17 00:43:23 crc kubenswrapper[4755]: I0317 
00:43:23.426746 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.911727417 podStartE2EDuration="22.426725522s" podCreationTimestamp="2026-03-17 00:43:01 +0000 UTC" firstStartedPulling="2026-03-17 00:43:09.055340461 +0000 UTC m=+1263.814792744" lastFinishedPulling="2026-03-17 00:43:22.570338566 +0000 UTC m=+1277.329790849" observedRunningTime="2026-03-17 00:43:23.423055075 +0000 UTC m=+1278.182507358" watchObservedRunningTime="2026-03-17 00:43:23.426725522 +0000 UTC m=+1278.186177805" Mar 17 00:43:23 crc kubenswrapper[4755]: I0317 00:43:23.487321 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 17 00:43:23 crc kubenswrapper[4755]: I0317 00:43:23.523000 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 17 00:43:23 crc kubenswrapper[4755]: I0317 00:43:23.553245 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 17 00:43:23 crc kubenswrapper[4755]: I0317 00:43:23.882578 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.088308 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.131197 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.312336 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.312399 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.372852 4755 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.385883 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.664775 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-d8xms"] Mar 17 00:43:24 crc kubenswrapper[4755]: E0317 00:43:24.665111 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd6657d-d56c-45e7-92cb-ece9a3890c85" containerName="init" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.665126 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd6657d-d56c-45e7-92cb-ece9a3890c85" containerName="init" Mar 17 00:43:24 crc kubenswrapper[4755]: E0317 00:43:24.665138 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd6657d-d56c-45e7-92cb-ece9a3890c85" containerName="dnsmasq-dns" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.665144 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd6657d-d56c-45e7-92cb-ece9a3890c85" containerName="dnsmasq-dns" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.665310 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bd6657d-d56c-45e7-92cb-ece9a3890c85" containerName="dnsmasq-dns" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.666180 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-d8xms" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.669352 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.688008 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-d8xms"] Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.734135 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-lccfn"] Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.735209 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.741830 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.770146 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-lccfn"] Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.776305 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/71d7e3dc-df60-416b-add1-b7f55fd74d2d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lccfn\" (UID: \"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.776371 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-d8xms\" (UID: \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\") " pod="openstack/dnsmasq-dns-7f896c8c65-d8xms" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.778956 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9w66\" (UniqueName: \"kubernetes.io/projected/d5e28978-31e3-4d48-a3f8-78a481e83dd7-kube-api-access-k9w66\") pod \"dnsmasq-dns-7f896c8c65-d8xms\" (UID: \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\") " pod="openstack/dnsmasq-dns-7f896c8c65-d8xms" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.779094 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/71d7e3dc-df60-416b-add1-b7f55fd74d2d-ovn-rundir\") pod \"ovn-controller-metrics-lccfn\" (UID: \"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.779189 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71d7e3dc-df60-416b-add1-b7f55fd74d2d-combined-ca-bundle\") pod \"ovn-controller-metrics-lccfn\" (UID: \"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.779230 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86wmr\" (UniqueName: \"kubernetes.io/projected/71d7e3dc-df60-416b-add1-b7f55fd74d2d-kube-api-access-86wmr\") pod \"ovn-controller-metrics-lccfn\" (UID: \"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.779271 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-config\") pod \"dnsmasq-dns-7f896c8c65-d8xms\" (UID: \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\") " pod="openstack/dnsmasq-dns-7f896c8c65-d8xms" Mar 17 00:43:24 crc kubenswrapper[4755]: 
I0317 00:43:24.779323 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-d8xms\" (UID: \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\") " pod="openstack/dnsmasq-dns-7f896c8c65-d8xms" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.779370 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/71d7e3dc-df60-416b-add1-b7f55fd74d2d-ovs-rundir\") pod \"ovn-controller-metrics-lccfn\" (UID: \"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.779411 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71d7e3dc-df60-416b-add1-b7f55fd74d2d-config\") pod \"ovn-controller-metrics-lccfn\" (UID: \"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.811292 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.812846 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.815574 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-hnpzm" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.815801 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.815947 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.816073 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.817905 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.837286 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-d8xms"] Mar 17 00:43:24 crc kubenswrapper[4755]: E0317 00:43:24.838119 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-k9w66 ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7f896c8c65-d8xms" podUID="d5e28978-31e3-4d48-a3f8-78a481e83dd7" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.846339 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xwlzk"] Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.847866 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.851620 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.873959 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xwlzk"] Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.884504 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71d7e3dc-df60-416b-add1-b7f55fd74d2d-combined-ca-bundle\") pod \"ovn-controller-metrics-lccfn\" (UID: \"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.884557 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86wmr\" (UniqueName: \"kubernetes.io/projected/71d7e3dc-df60-416b-add1-b7f55fd74d2d-kube-api-access-86wmr\") pod \"ovn-controller-metrics-lccfn\" (UID: \"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.884597 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-config\") pod \"dnsmasq-dns-7f896c8c65-d8xms\" (UID: \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\") " pod="openstack/dnsmasq-dns-7f896c8c65-d8xms" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.884684 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-d8xms\" (UID: \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\") " pod="openstack/dnsmasq-dns-7f896c8c65-d8xms" Mar 17 00:43:24 crc kubenswrapper[4755]: 
I0317 00:43:24.885535 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b9c6a29f-013e-40dc-958a-05f36cb4e626-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.885576 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-d8xms\" (UID: \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\") " pod="openstack/dnsmasq-dns-7f896c8c65-d8xms" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.885588 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/71d7e3dc-df60-416b-add1-b7f55fd74d2d-ovs-rundir\") pod \"ovn-controller-metrics-lccfn\" (UID: \"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.885493 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-config\") pod \"dnsmasq-dns-7f896c8c65-d8xms\" (UID: \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\") " pod="openstack/dnsmasq-dns-7f896c8c65-d8xms" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.885615 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9c6a29f-013e-40dc-958a-05f36cb4e626-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.885661 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c6a29f-013e-40dc-958a-05f36cb4e626-config\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.885686 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9c6a29f-013e-40dc-958a-05f36cb4e626-scripts\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.885713 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71d7e3dc-df60-416b-add1-b7f55fd74d2d-config\") pod \"ovn-controller-metrics-lccfn\" (UID: \"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.885760 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-xwlzk\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.885786 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-xwlzk\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.885803 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/71d7e3dc-df60-416b-add1-b7f55fd74d2d-ovs-rundir\") pod 
\"ovn-controller-metrics-lccfn\" (UID: \"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.885805 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-xwlzk\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.885908 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9c6a29f-013e-40dc-958a-05f36cb4e626-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.885942 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/71d7e3dc-df60-416b-add1-b7f55fd74d2d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lccfn\" (UID: \"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.885965 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-config\") pod \"dnsmasq-dns-86db49b7ff-xwlzk\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.885990 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-d8xms\" (UID: 
\"d5e28978-31e3-4d48-a3f8-78a481e83dd7\") " pod="openstack/dnsmasq-dns-7f896c8c65-d8xms" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.886011 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9c6a29f-013e-40dc-958a-05f36cb4e626-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.886031 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-582wf\" (UniqueName: \"kubernetes.io/projected/9adf62fd-8052-401d-aacd-a7f974664439-kube-api-access-582wf\") pod \"dnsmasq-dns-86db49b7ff-xwlzk\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.886079 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgq7j\" (UniqueName: \"kubernetes.io/projected/b9c6a29f-013e-40dc-958a-05f36cb4e626-kube-api-access-zgq7j\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.886113 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9w66\" (UniqueName: \"kubernetes.io/projected/d5e28978-31e3-4d48-a3f8-78a481e83dd7-kube-api-access-k9w66\") pod \"dnsmasq-dns-7f896c8c65-d8xms\" (UID: \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\") " pod="openstack/dnsmasq-dns-7f896c8c65-d8xms" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.886243 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/71d7e3dc-df60-416b-add1-b7f55fd74d2d-ovn-rundir\") pod \"ovn-controller-metrics-lccfn\" (UID: 
\"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.886304 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71d7e3dc-df60-416b-add1-b7f55fd74d2d-config\") pod \"ovn-controller-metrics-lccfn\" (UID: \"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.886663 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/71d7e3dc-df60-416b-add1-b7f55fd74d2d-ovn-rundir\") pod \"ovn-controller-metrics-lccfn\" (UID: \"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.887308 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-d8xms\" (UID: \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\") " pod="openstack/dnsmasq-dns-7f896c8c65-d8xms" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.895536 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71d7e3dc-df60-416b-add1-b7f55fd74d2d-combined-ca-bundle\") pod \"ovn-controller-metrics-lccfn\" (UID: \"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.912653 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/71d7e3dc-df60-416b-add1-b7f55fd74d2d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lccfn\" (UID: \"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc 
kubenswrapper[4755]: I0317 00:43:24.917555 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86wmr\" (UniqueName: \"kubernetes.io/projected/71d7e3dc-df60-416b-add1-b7f55fd74d2d-kube-api-access-86wmr\") pod \"ovn-controller-metrics-lccfn\" (UID: \"71d7e3dc-df60-416b-add1-b7f55fd74d2d\") " pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.922234 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9w66\" (UniqueName: \"kubernetes.io/projected/d5e28978-31e3-4d48-a3f8-78a481e83dd7-kube-api-access-k9w66\") pod \"dnsmasq-dns-7f896c8c65-d8xms\" (UID: \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\") " pod="openstack/dnsmasq-dns-7f896c8c65-d8xms" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.988687 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b9c6a29f-013e-40dc-958a-05f36cb4e626-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.988758 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9c6a29f-013e-40dc-958a-05f36cb4e626-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.988785 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c6a29f-013e-40dc-958a-05f36cb4e626-config\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.988805 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/b9c6a29f-013e-40dc-958a-05f36cb4e626-scripts\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.988846 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-xwlzk\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.988867 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-xwlzk\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.988884 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-xwlzk\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.988911 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9c6a29f-013e-40dc-958a-05f36cb4e626-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.988938 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-config\") pod \"dnsmasq-dns-86db49b7ff-xwlzk\" (UID: 
\"9adf62fd-8052-401d-aacd-a7f974664439\") " pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.988954 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9c6a29f-013e-40dc-958a-05f36cb4e626-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.988970 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-582wf\" (UniqueName: \"kubernetes.io/projected/9adf62fd-8052-401d-aacd-a7f974664439-kube-api-access-582wf\") pod \"dnsmasq-dns-86db49b7ff-xwlzk\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.989010 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgq7j\" (UniqueName: \"kubernetes.io/projected/b9c6a29f-013e-40dc-958a-05f36cb4e626-kube-api-access-zgq7j\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.989263 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b9c6a29f-013e-40dc-958a-05f36cb4e626-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.990394 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-xwlzk\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 
00:43:24.990487 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9c6a29f-013e-40dc-958a-05f36cb4e626-scripts\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.991029 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-xwlzk\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.991167 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c6a29f-013e-40dc-958a-05f36cb4e626-config\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.991762 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-xwlzk\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.992704 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-config\") pod \"dnsmasq-dns-86db49b7ff-xwlzk\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.992977 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9c6a29f-013e-40dc-958a-05f36cb4e626-metrics-certs-tls-certs\") 
pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.994329 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9c6a29f-013e-40dc-958a-05f36cb4e626-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:24 crc kubenswrapper[4755]: I0317 00:43:24.994960 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9c6a29f-013e-40dc-958a-05f36cb4e626-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.015512 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-582wf\" (UniqueName: \"kubernetes.io/projected/9adf62fd-8052-401d-aacd-a7f974664439-kube-api-access-582wf\") pod \"dnsmasq-dns-86db49b7ff-xwlzk\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.019358 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgq7j\" (UniqueName: \"kubernetes.io/projected/b9c6a29f-013e-40dc-958a-05f36cb4e626-kube-api-access-zgq7j\") pod \"ovn-northd-0\" (UID: \"b9c6a29f-013e-40dc-958a-05f36cb4e626\") " pod="openstack/ovn-northd-0" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.073242 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-lccfn" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.130180 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.173140 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.332012 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"dfea0511-e194-48c8-8795-58d07ada5d4c","Type":"ContainerStarted","Data":"db4f2f21c25b17ddf59860c56cec4a70b6b5d6e0c2ab157df1810d45e863f10a"} Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.349606 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-d8xms" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.362027 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-d8xms" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.396343 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-dns-svc\") pod \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\" (UID: \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\") " Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.396392 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-config\") pod \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\" (UID: \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\") " Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.396534 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-ovsdbserver-sb\") pod \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\" (UID: \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\") " Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 
00:43:25.396608 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9w66\" (UniqueName: \"kubernetes.io/projected/d5e28978-31e3-4d48-a3f8-78a481e83dd7-kube-api-access-k9w66\") pod \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\" (UID: \"d5e28978-31e3-4d48-a3f8-78a481e83dd7\") " Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.397089 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-config" (OuterVolumeSpecName: "config") pod "d5e28978-31e3-4d48-a3f8-78a481e83dd7" (UID: "d5e28978-31e3-4d48-a3f8-78a481e83dd7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.397174 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d5e28978-31e3-4d48-a3f8-78a481e83dd7" (UID: "d5e28978-31e3-4d48-a3f8-78a481e83dd7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.397281 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d5e28978-31e3-4d48-a3f8-78a481e83dd7" (UID: "d5e28978-31e3-4d48-a3f8-78a481e83dd7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.397354 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.397367 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.397375 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5e28978-31e3-4d48-a3f8-78a481e83dd7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.402498 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e28978-31e3-4d48-a3f8-78a481e83dd7-kube-api-access-k9w66" (OuterVolumeSpecName: "kube-api-access-k9w66") pod "d5e28978-31e3-4d48-a3f8-78a481e83dd7" (UID: "d5e28978-31e3-4d48-a3f8-78a481e83dd7"). InnerVolumeSpecName "kube-api-access-k9w66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.498638 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9w66\" (UniqueName: \"kubernetes.io/projected/d5e28978-31e3-4d48-a3f8-78a481e83dd7-kube-api-access-k9w66\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.628034 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-lccfn"] Mar 17 00:43:25 crc kubenswrapper[4755]: W0317 00:43:25.630418 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71d7e3dc_df60_416b_add1_b7f55fd74d2d.slice/crio-7568416cc3ec5e4ba24ac4a26d0fdf1bcb43890ae8517da9130b14f13da0a22d WatchSource:0}: Error finding container 7568416cc3ec5e4ba24ac4a26d0fdf1bcb43890ae8517da9130b14f13da0a22d: Status 404 returned error can't find the container with id 7568416cc3ec5e4ba24ac4a26d0fdf1bcb43890ae8517da9130b14f13da0a22d Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.635475 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 17 00:43:25 crc kubenswrapper[4755]: W0317 00:43:25.636601 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9c6a29f_013e_40dc_958a_05f36cb4e626.slice/crio-44dade3c1ecb20a124561a29537b054ce68cebb6dc8780373412dc4a28827273 WatchSource:0}: Error finding container 44dade3c1ecb20a124561a29537b054ce68cebb6dc8780373412dc4a28827273: Status 404 returned error can't find the container with id 44dade3c1ecb20a124561a29537b054ce68cebb6dc8780373412dc4a28827273 Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.797549 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xwlzk"] Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.799833 4755 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.908947 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xwlzk"] Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.942519 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-m5788"] Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.943874 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:25 crc kubenswrapper[4755]: I0317 00:43:25.974479 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-m5788"] Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.007344 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-m5788\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.007962 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-dns-svc\") pod \"dnsmasq-dns-698758b865-m5788\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.014323 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdsdf\" (UniqueName: \"kubernetes.io/projected/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-kube-api-access-zdsdf\") pod \"dnsmasq-dns-698758b865-m5788\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:26 crc 
kubenswrapper[4755]: I0317 00:43:26.014405 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-config\") pod \"dnsmasq-dns-698758b865-m5788\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.014554 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-m5788\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.115818 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-config\") pod \"dnsmasq-dns-698758b865-m5788\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.115888 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-m5788\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.115984 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-m5788\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.116021 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-dns-svc\") pod \"dnsmasq-dns-698758b865-m5788\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.116038 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdsdf\" (UniqueName: \"kubernetes.io/projected/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-kube-api-access-zdsdf\") pod \"dnsmasq-dns-698758b865-m5788\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.117158 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-config\") pod \"dnsmasq-dns-698758b865-m5788\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.117158 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-m5788\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.117170 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-m5788\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.119158 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-dns-svc\") pod \"dnsmasq-dns-698758b865-m5788\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.133528 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdsdf\" (UniqueName: \"kubernetes.io/projected/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-kube-api-access-zdsdf\") pod \"dnsmasq-dns-698758b865-m5788\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.358040 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e48be2ab-6e3e-4a75-b47e-e700bd4126f1","Type":"ContainerStarted","Data":"c19322756517d3ea4b24062ac067088bd158663197b8480a597423e456b37f5a"} Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.362642 4755 generic.go:334] "Generic (PLEG): container finished" podID="9adf62fd-8052-401d-aacd-a7f974664439" containerID="1d7fe1e7e5d9c5fa2951c523eeeaf04001964d8070f01a165ddc33dfcdc51983" exitCode=0 Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.362739 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" event={"ID":"9adf62fd-8052-401d-aacd-a7f974664439","Type":"ContainerDied","Data":"1d7fe1e7e5d9c5fa2951c523eeeaf04001964d8070f01a165ddc33dfcdc51983"} Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.362771 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" event={"ID":"9adf62fd-8052-401d-aacd-a7f974664439","Type":"ContainerStarted","Data":"7abf52afd39e7c579591225e9334c8a29057f82643d7ce2f1f1a85a93c08a859"} Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.366240 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"b9c6a29f-013e-40dc-958a-05f36cb4e626","Type":"ContainerStarted","Data":"44dade3c1ecb20a124561a29537b054ce68cebb6dc8780373412dc4a28827273"} Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.369665 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-lccfn" event={"ID":"71d7e3dc-df60-416b-add1-b7f55fd74d2d","Type":"ContainerStarted","Data":"5c3e338aef3bb873c3c77b064e5f5b1734b43d2425dbe06f59e1c2d1ddc924a0"} Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.369833 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-lccfn" event={"ID":"71d7e3dc-df60-416b-add1-b7f55fd74d2d","Type":"ContainerStarted","Data":"7568416cc3ec5e4ba24ac4a26d0fdf1bcb43890ae8517da9130b14f13da0a22d"} Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.371085 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-d8xms" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.387634 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=25.311090058 podStartE2EDuration="36.38761658s" podCreationTimestamp="2026-03-17 00:42:50 +0000 UTC" firstStartedPulling="2026-03-17 00:43:06.970565547 +0000 UTC m=+1261.730017830" lastFinishedPulling="2026-03-17 00:43:18.047092069 +0000 UTC m=+1272.806544352" observedRunningTime="2026-03-17 00:43:26.382803933 +0000 UTC m=+1281.142256216" watchObservedRunningTime="2026-03-17 00:43:26.38761658 +0000 UTC m=+1281.147068873" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.420134 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.425918 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-lccfn" podStartSLOduration=2.425889345 podStartE2EDuration="2.425889345s" podCreationTimestamp="2026-03-17 00:43:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:43:26.41390937 +0000 UTC m=+1281.173361663" watchObservedRunningTime="2026-03-17 00:43:26.425889345 +0000 UTC m=+1281.185341638" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.462212 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-d8xms"] Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.472515 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.516841342 podStartE2EDuration="35.472493889s" podCreationTimestamp="2026-03-17 00:42:51 +0000 UTC" firstStartedPulling="2026-03-17 00:43:06.962972277 +0000 UTC m=+1261.722424570" lastFinishedPulling="2026-03-17 00:43:17.918624834 +0000 UTC m=+1272.678077117" observedRunningTime="2026-03-17 00:43:26.464027857 +0000 UTC m=+1281.223480160" watchObservedRunningTime="2026-03-17 00:43:26.472493889 +0000 UTC m=+1281.231946172" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.475539 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-d8xms"] Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.731955 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.827800 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-ovsdbserver-sb\") pod \"9adf62fd-8052-401d-aacd-a7f974664439\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.827975 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-582wf\" (UniqueName: \"kubernetes.io/projected/9adf62fd-8052-401d-aacd-a7f974664439-kube-api-access-582wf\") pod \"9adf62fd-8052-401d-aacd-a7f974664439\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.828108 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-config\") pod \"9adf62fd-8052-401d-aacd-a7f974664439\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.828166 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-ovsdbserver-nb\") pod \"9adf62fd-8052-401d-aacd-a7f974664439\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.828199 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-dns-svc\") pod \"9adf62fd-8052-401d-aacd-a7f974664439\" (UID: \"9adf62fd-8052-401d-aacd-a7f974664439\") " Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.841301 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9adf62fd-8052-401d-aacd-a7f974664439-kube-api-access-582wf" (OuterVolumeSpecName: "kube-api-access-582wf") pod "9adf62fd-8052-401d-aacd-a7f974664439" (UID: "9adf62fd-8052-401d-aacd-a7f974664439"). InnerVolumeSpecName "kube-api-access-582wf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.850421 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9adf62fd-8052-401d-aacd-a7f974664439" (UID: "9adf62fd-8052-401d-aacd-a7f974664439"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.858071 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9adf62fd-8052-401d-aacd-a7f974664439" (UID: "9adf62fd-8052-401d-aacd-a7f974664439"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.861060 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9adf62fd-8052-401d-aacd-a7f974664439" (UID: "9adf62fd-8052-401d-aacd-a7f974664439"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.872126 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-config" (OuterVolumeSpecName: "config") pod "9adf62fd-8052-401d-aacd-a7f974664439" (UID: "9adf62fd-8052-401d-aacd-a7f974664439"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.930588 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-582wf\" (UniqueName: \"kubernetes.io/projected/9adf62fd-8052-401d-aacd-a7f974664439-kube-api-access-582wf\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.931074 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.931085 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.931094 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:26 crc kubenswrapper[4755]: I0317 00:43:26.931102 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9adf62fd-8052-401d-aacd-a7f974664439-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.030229 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 17 00:43:27 crc kubenswrapper[4755]: E0317 00:43:27.030591 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9adf62fd-8052-401d-aacd-a7f974664439" containerName="init" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.030608 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9adf62fd-8052-401d-aacd-a7f974664439" containerName="init" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.030789 4755 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9adf62fd-8052-401d-aacd-a7f974664439" containerName="init" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.037721 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.043016 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-m5788"] Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.044927 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.045376 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-mhtwf" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.045682 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.046081 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.061007 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.138958 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/80ee6df5-abef-4094-aabc-45b15e1ebfcf-cache\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.139006 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 
00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.139052 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/80ee6df5-abef-4094-aabc-45b15e1ebfcf-lock\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.139072 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpk7d\" (UniqueName: \"kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-kube-api-access-jpk7d\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.139629 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.139685 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ee6df5-abef-4094-aabc-45b15e1ebfcf-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.244523 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/80ee6df5-abef-4094-aabc-45b15e1ebfcf-lock\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.244569 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jpk7d\" (UniqueName: \"kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-kube-api-access-jpk7d\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.244674 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.244724 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ee6df5-abef-4094-aabc-45b15e1ebfcf-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.244755 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/80ee6df5-abef-4094-aabc-45b15e1ebfcf-cache\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.244792 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: E0317 00:43:27.244953 4755 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 17 00:43:27 crc kubenswrapper[4755]: E0317 00:43:27.244975 4755 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap 
"swift-ring-files" not found Mar 17 00:43:27 crc kubenswrapper[4755]: E0317 00:43:27.245024 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift podName:80ee6df5-abef-4094-aabc-45b15e1ebfcf nodeName:}" failed. No retries permitted until 2026-03-17 00:43:27.745004492 +0000 UTC m=+1282.504456775 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift") pod "swift-storage-0" (UID: "80ee6df5-abef-4094-aabc-45b15e1ebfcf") : configmap "swift-ring-files" not found Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.245275 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.246051 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/80ee6df5-abef-4094-aabc-45b15e1ebfcf-lock\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.246309 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/80ee6df5-abef-4094-aabc-45b15e1ebfcf-cache\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.266865 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ee6df5-abef-4094-aabc-45b15e1ebfcf-combined-ca-bundle\") pod \"swift-storage-0\" (UID: 
\"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.292358 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpk7d\" (UniqueName: \"kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-kube-api-access-jpk7d\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.314397 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.386694 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-m5788" event={"ID":"2cd96a96-5f56-4b8a-a198-8d7ad6b81018","Type":"ContainerStarted","Data":"aef963831347f5acb50e6bcc64371f90ea725f7dd4bb90b68d19a60fc2849f29"} Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.391762 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" event={"ID":"9adf62fd-8052-401d-aacd-a7f974664439","Type":"ContainerDied","Data":"7abf52afd39e7c579591225e9334c8a29057f82643d7ce2f1f1a85a93c08a859"} Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.392084 4755 scope.go:117] "RemoveContainer" containerID="1d7fe1e7e5d9c5fa2951c523eeeaf04001964d8070f01a165ddc33dfcdc51983" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.391834 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xwlzk" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.463387 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xwlzk"] Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.470871 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xwlzk"] Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.502055 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mh59s"] Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.503142 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.513942 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.514210 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.514765 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.523720 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mh59s"] Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.658287 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9b280073-a793-4c35-a29b-d56ccf6037a7-etc-swift\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.658351 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/9b280073-a793-4c35-a29b-d56ccf6037a7-ring-data-devices\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.658545 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vh58\" (UniqueName: \"kubernetes.io/projected/9b280073-a793-4c35-a29b-d56ccf6037a7-kube-api-access-9vh58\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.658644 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-dispersionconf\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.658676 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b280073-a793-4c35-a29b-d56ccf6037a7-scripts\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.658816 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-swiftconf\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.658841 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-combined-ca-bundle\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.760618 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9b280073-a793-4c35-a29b-d56ccf6037a7-ring-data-devices\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.760701 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vh58\" (UniqueName: \"kubernetes.io/projected/9b280073-a793-4c35-a29b-d56ccf6037a7-kube-api-access-9vh58\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.760754 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-dispersionconf\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.760782 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b280073-a793-4c35-a29b-d56ccf6037a7-scripts\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.760856 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-swiftconf\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.760880 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-combined-ca-bundle\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.760920 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.760962 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9b280073-a793-4c35-a29b-d56ccf6037a7-etc-swift\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.761428 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9b280073-a793-4c35-a29b-d56ccf6037a7-etc-swift\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.761516 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9b280073-a793-4c35-a29b-d56ccf6037a7-ring-data-devices\") pod \"swift-ring-rebalance-mh59s\" (UID: 
\"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: E0317 00:43:27.761733 4755 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 17 00:43:27 crc kubenswrapper[4755]: E0317 00:43:27.761786 4755 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 17 00:43:27 crc kubenswrapper[4755]: E0317 00:43:27.761882 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift podName:80ee6df5-abef-4094-aabc-45b15e1ebfcf nodeName:}" failed. No retries permitted until 2026-03-17 00:43:28.761851198 +0000 UTC m=+1283.521303551 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift") pod "swift-storage-0" (UID: "80ee6df5-abef-4094-aabc-45b15e1ebfcf") : configmap "swift-ring-files" not found Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.763180 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b280073-a793-4c35-a29b-d56ccf6037a7-scripts\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.764672 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-swiftconf\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.768062 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-dispersionconf\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.775998 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-combined-ca-bundle\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.786372 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vh58\" (UniqueName: \"kubernetes.io/projected/9b280073-a793-4c35-a29b-d56ccf6037a7-kube-api-access-9vh58\") pod \"swift-ring-rebalance-mh59s\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:27 crc kubenswrapper[4755]: I0317 00:43:27.948698 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:28 crc kubenswrapper[4755]: I0317 00:43:28.258167 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9adf62fd-8052-401d-aacd-a7f974664439" path="/var/lib/kubelet/pods/9adf62fd-8052-401d-aacd-a7f974664439/volumes" Mar 17 00:43:28 crc kubenswrapper[4755]: I0317 00:43:28.259354 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e28978-31e3-4d48-a3f8-78a481e83dd7" path="/var/lib/kubelet/pods/d5e28978-31e3-4d48-a3f8-78a481e83dd7/volumes" Mar 17 00:43:28 crc kubenswrapper[4755]: I0317 00:43:28.400482 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-m5788" event={"ID":"2cd96a96-5f56-4b8a-a198-8d7ad6b81018","Type":"ContainerDied","Data":"854a13dc0243c1a36144f4cd7a164e8362b87da4201abf7e704c7364339a85a6"} Mar 17 00:43:28 crc kubenswrapper[4755]: I0317 00:43:28.400478 4755 generic.go:334] "Generic (PLEG): container finished" podID="2cd96a96-5f56-4b8a-a198-8d7ad6b81018" containerID="854a13dc0243c1a36144f4cd7a164e8362b87da4201abf7e704c7364339a85a6" exitCode=0 Mar 17 00:43:28 crc kubenswrapper[4755]: I0317 00:43:28.403760 4755 generic.go:334] "Generic (PLEG): container finished" podID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerID="4807d127d9a739befc515b172da1f6bb129a30b65aac8982d795fdc0e475d2f2" exitCode=0 Mar 17 00:43:28 crc kubenswrapper[4755]: I0317 00:43:28.403867 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6625cad-73d0-4753-8a77-4d47344b7fad","Type":"ContainerDied","Data":"4807d127d9a739befc515b172da1f6bb129a30b65aac8982d795fdc0e475d2f2"} Mar 17 00:43:28 crc kubenswrapper[4755]: I0317 00:43:28.409134 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"b9c6a29f-013e-40dc-958a-05f36cb4e626","Type":"ContainerStarted","Data":"02d7457558185c25a3c90772ce3ce94191eeed43820f71940e7f3a1258bf6bc9"} Mar 17 00:43:28 crc kubenswrapper[4755]: I0317 00:43:28.409183 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b9c6a29f-013e-40dc-958a-05f36cb4e626","Type":"ContainerStarted","Data":"99d7951140220797a5a254e963f0abe2ed5fb465d909ac4f4e664e3020b3c49c"} Mar 17 00:43:28 crc kubenswrapper[4755]: I0317 00:43:28.410183 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 17 00:43:28 crc kubenswrapper[4755]: I0317 00:43:28.458784 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.848974419 podStartE2EDuration="4.458763245s" podCreationTimestamp="2026-03-17 00:43:24 +0000 UTC" firstStartedPulling="2026-03-17 00:43:25.63806963 +0000 UTC m=+1280.397521913" lastFinishedPulling="2026-03-17 00:43:27.247858456 +0000 UTC m=+1282.007310739" observedRunningTime="2026-03-17 00:43:28.453519177 +0000 UTC m=+1283.212971460" watchObservedRunningTime="2026-03-17 00:43:28.458763245 +0000 UTC m=+1283.218215518" Mar 17 00:43:28 crc kubenswrapper[4755]: I0317 00:43:28.550154 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mh59s"] Mar 17 00:43:28 crc kubenswrapper[4755]: W0317 00:43:28.558055 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b280073_a793_4c35_a29b_d56ccf6037a7.slice/crio-8b5b46c15ad953e51741da2eff02eb7964b20cbc7a645de2357424b241bd7495 WatchSource:0}: Error finding container 8b5b46c15ad953e51741da2eff02eb7964b20cbc7a645de2357424b241bd7495: Status 404 returned error can't find the container with id 8b5b46c15ad953e51741da2eff02eb7964b20cbc7a645de2357424b241bd7495 Mar 17 00:43:28 crc kubenswrapper[4755]: I0317 00:43:28.781201 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:28 crc kubenswrapper[4755]: E0317 00:43:28.781386 4755 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 17 00:43:28 crc kubenswrapper[4755]: E0317 00:43:28.781617 4755 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 17 00:43:28 crc kubenswrapper[4755]: E0317 00:43:28.781671 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift podName:80ee6df5-abef-4094-aabc-45b15e1ebfcf nodeName:}" failed. No retries permitted until 2026-03-17 00:43:30.781652047 +0000 UTC m=+1285.541104330 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift") pod "swift-storage-0" (UID: "80ee6df5-abef-4094-aabc-45b15e1ebfcf") : configmap "swift-ring-files" not found Mar 17 00:43:29 crc kubenswrapper[4755]: I0317 00:43:29.422080 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mh59s" event={"ID":"9b280073-a793-4c35-a29b-d56ccf6037a7","Type":"ContainerStarted","Data":"8b5b46c15ad953e51741da2eff02eb7964b20cbc7a645de2357424b241bd7495"} Mar 17 00:43:29 crc kubenswrapper[4755]: I0317 00:43:29.429735 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-m5788" event={"ID":"2cd96a96-5f56-4b8a-a198-8d7ad6b81018","Type":"ContainerStarted","Data":"0397474b9cd3a6c94041118cb8807a1a81d17e0cab0ac50a0acfa7ea81b1c471"} Mar 17 00:43:29 crc kubenswrapper[4755]: I0317 00:43:29.430026 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:29 crc kubenswrapper[4755]: I0317 00:43:29.449777 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-m5788" podStartSLOduration=4.449757537 podStartE2EDuration="4.449757537s" podCreationTimestamp="2026-03-17 00:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:43:29.448294149 +0000 UTC m=+1284.207746452" watchObservedRunningTime="2026-03-17 00:43:29.449757537 +0000 UTC m=+1284.209209820" Mar 17 00:43:30 crc kubenswrapper[4755]: I0317 00:43:30.856110 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:30 crc 
kubenswrapper[4755]: E0317 00:43:30.856319 4755 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 17 00:43:30 crc kubenswrapper[4755]: E0317 00:43:30.856466 4755 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 17 00:43:30 crc kubenswrapper[4755]: E0317 00:43:30.856531 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift podName:80ee6df5-abef-4094-aabc-45b15e1ebfcf nodeName:}" failed. No retries permitted until 2026-03-17 00:43:34.856508639 +0000 UTC m=+1289.615960922 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift") pod "swift-storage-0" (UID: "80ee6df5-abef-4094-aabc-45b15e1ebfcf") : configmap "swift-ring-files" not found Mar 17 00:43:32 crc kubenswrapper[4755]: I0317 00:43:32.204025 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 17 00:43:32 crc kubenswrapper[4755]: I0317 00:43:32.204244 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 17 00:43:32 crc kubenswrapper[4755]: I0317 00:43:32.272325 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 17 00:43:32 crc kubenswrapper[4755]: I0317 00:43:32.472735 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mh59s" event={"ID":"9b280073-a793-4c35-a29b-d56ccf6037a7","Type":"ContainerStarted","Data":"354132efbae9dd8e2ca6a3cb935de6c0ec4453c2e3bc490a380dd53628b41540"} Mar 17 00:43:32 crc kubenswrapper[4755]: I0317 00:43:32.568502 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 17 
00:43:32 crc kubenswrapper[4755]: I0317 00:43:32.590152 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mh59s" podStartSLOduration=2.25267327 podStartE2EDuration="5.590135189s" podCreationTimestamp="2026-03-17 00:43:27 +0000 UTC" firstStartedPulling="2026-03-17 00:43:28.560829496 +0000 UTC m=+1283.320281779" lastFinishedPulling="2026-03-17 00:43:31.898291415 +0000 UTC m=+1286.657743698" observedRunningTime="2026-03-17 00:43:32.495060002 +0000 UTC m=+1287.254512285" watchObservedRunningTime="2026-03-17 00:43:32.590135189 +0000 UTC m=+1287.349587472" Mar 17 00:43:33 crc kubenswrapper[4755]: I0317 00:43:33.355582 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 17 00:43:33 crc kubenswrapper[4755]: I0317 00:43:33.356387 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 17 00:43:33 crc kubenswrapper[4755]: I0317 00:43:33.458108 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 17 00:43:33 crc kubenswrapper[4755]: I0317 00:43:33.560290 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 17 00:43:33 crc kubenswrapper[4755]: I0317 00:43:33.856171 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-jmh2r"] Mar 17 00:43:33 crc kubenswrapper[4755]: I0317 00:43:33.857352 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-jmh2r" Mar 17 00:43:33 crc kubenswrapper[4755]: I0317 00:43:33.876048 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jmh2r"] Mar 17 00:43:33 crc kubenswrapper[4755]: I0317 00:43:33.943008 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e769064f-9469-4183-bd71-52ed11230e0e-operator-scripts\") pod \"glance-db-create-jmh2r\" (UID: \"e769064f-9469-4183-bd71-52ed11230e0e\") " pod="openstack/glance-db-create-jmh2r" Mar 17 00:43:33 crc kubenswrapper[4755]: I0317 00:43:33.943096 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djgff\" (UniqueName: \"kubernetes.io/projected/e769064f-9469-4183-bd71-52ed11230e0e-kube-api-access-djgff\") pod \"glance-db-create-jmh2r\" (UID: \"e769064f-9469-4183-bd71-52ed11230e0e\") " pod="openstack/glance-db-create-jmh2r" Mar 17 00:43:33 crc kubenswrapper[4755]: I0317 00:43:33.945141 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-374e-account-create-update-kv6bv"] Mar 17 00:43:33 crc kubenswrapper[4755]: I0317 00:43:33.946416 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-374e-account-create-update-kv6bv" Mar 17 00:43:33 crc kubenswrapper[4755]: I0317 00:43:33.952666 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-374e-account-create-update-kv6bv"] Mar 17 00:43:33 crc kubenswrapper[4755]: I0317 00:43:33.989482 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.044205 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e769064f-9469-4183-bd71-52ed11230e0e-operator-scripts\") pod \"glance-db-create-jmh2r\" (UID: \"e769064f-9469-4183-bd71-52ed11230e0e\") " pod="openstack/glance-db-create-jmh2r" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.044274 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djgff\" (UniqueName: \"kubernetes.io/projected/e769064f-9469-4183-bd71-52ed11230e0e-kube-api-access-djgff\") pod \"glance-db-create-jmh2r\" (UID: \"e769064f-9469-4183-bd71-52ed11230e0e\") " pod="openstack/glance-db-create-jmh2r" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.044304 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9rdz\" (UniqueName: \"kubernetes.io/projected/72d32524-c98c-4d9e-abbf-3231c1b18e44-kube-api-access-x9rdz\") pod \"glance-374e-account-create-update-kv6bv\" (UID: \"72d32524-c98c-4d9e-abbf-3231c1b18e44\") " pod="openstack/glance-374e-account-create-update-kv6bv" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.044343 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d32524-c98c-4d9e-abbf-3231c1b18e44-operator-scripts\") pod \"glance-374e-account-create-update-kv6bv\" (UID: 
\"72d32524-c98c-4d9e-abbf-3231c1b18e44\") " pod="openstack/glance-374e-account-create-update-kv6bv" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.044982 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e769064f-9469-4183-bd71-52ed11230e0e-operator-scripts\") pod \"glance-db-create-jmh2r\" (UID: \"e769064f-9469-4183-bd71-52ed11230e0e\") " pod="openstack/glance-db-create-jmh2r" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.061867 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djgff\" (UniqueName: \"kubernetes.io/projected/e769064f-9469-4183-bd71-52ed11230e0e-kube-api-access-djgff\") pod \"glance-db-create-jmh2r\" (UID: \"e769064f-9469-4183-bd71-52ed11230e0e\") " pod="openstack/glance-db-create-jmh2r" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.146753 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9rdz\" (UniqueName: \"kubernetes.io/projected/72d32524-c98c-4d9e-abbf-3231c1b18e44-kube-api-access-x9rdz\") pod \"glance-374e-account-create-update-kv6bv\" (UID: \"72d32524-c98c-4d9e-abbf-3231c1b18e44\") " pod="openstack/glance-374e-account-create-update-kv6bv" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.146853 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d32524-c98c-4d9e-abbf-3231c1b18e44-operator-scripts\") pod \"glance-374e-account-create-update-kv6bv\" (UID: \"72d32524-c98c-4d9e-abbf-3231c1b18e44\") " pod="openstack/glance-374e-account-create-update-kv6bv" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.147792 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d32524-c98c-4d9e-abbf-3231c1b18e44-operator-scripts\") pod \"glance-374e-account-create-update-kv6bv\" 
(UID: \"72d32524-c98c-4d9e-abbf-3231c1b18e44\") " pod="openstack/glance-374e-account-create-update-kv6bv" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.163112 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9rdz\" (UniqueName: \"kubernetes.io/projected/72d32524-c98c-4d9e-abbf-3231c1b18e44-kube-api-access-x9rdz\") pod \"glance-374e-account-create-update-kv6bv\" (UID: \"72d32524-c98c-4d9e-abbf-3231c1b18e44\") " pod="openstack/glance-374e-account-create-update-kv6bv" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.183782 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jmh2r" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.307093 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-374e-account-create-update-kv6bv" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.671903 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-9vkmp"] Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.673615 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9vkmp" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.689604 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9vkmp"] Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.763796 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22aa45d9-d0a3-4dad-98a9-293f6c396229-operator-scripts\") pod \"keystone-db-create-9vkmp\" (UID: \"22aa45d9-d0a3-4dad-98a9-293f6c396229\") " pod="openstack/keystone-db-create-9vkmp" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.763976 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr6vx\" (UniqueName: \"kubernetes.io/projected/22aa45d9-d0a3-4dad-98a9-293f6c396229-kube-api-access-rr6vx\") pod \"keystone-db-create-9vkmp\" (UID: \"22aa45d9-d0a3-4dad-98a9-293f6c396229\") " pod="openstack/keystone-db-create-9vkmp" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.784198 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5f0e-account-create-update-pfs54"] Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.785888 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5f0e-account-create-update-pfs54" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.791712 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.793864 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5f0e-account-create-update-pfs54"] Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.865730 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec87d4bf-d241-4b94-b3f7-0f006e4ceb87-operator-scripts\") pod \"keystone-5f0e-account-create-update-pfs54\" (UID: \"ec87d4bf-d241-4b94-b3f7-0f006e4ceb87\") " pod="openstack/keystone-5f0e-account-create-update-pfs54" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.865862 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22aa45d9-d0a3-4dad-98a9-293f6c396229-operator-scripts\") pod \"keystone-db-create-9vkmp\" (UID: \"22aa45d9-d0a3-4dad-98a9-293f6c396229\") " pod="openstack/keystone-db-create-9vkmp" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.865953 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr6vx\" (UniqueName: \"kubernetes.io/projected/22aa45d9-d0a3-4dad-98a9-293f6c396229-kube-api-access-rr6vx\") pod \"keystone-db-create-9vkmp\" (UID: \"22aa45d9-d0a3-4dad-98a9-293f6c396229\") " pod="openstack/keystone-db-create-9vkmp" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.865973 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp4h2\" (UniqueName: \"kubernetes.io/projected/ec87d4bf-d241-4b94-b3f7-0f006e4ceb87-kube-api-access-fp4h2\") pod \"keystone-5f0e-account-create-update-pfs54\" (UID: 
\"ec87d4bf-d241-4b94-b3f7-0f006e4ceb87\") " pod="openstack/keystone-5f0e-account-create-update-pfs54" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.866005 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:34 crc kubenswrapper[4755]: E0317 00:43:34.866142 4755 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 17 00:43:34 crc kubenswrapper[4755]: E0317 00:43:34.866155 4755 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 17 00:43:34 crc kubenswrapper[4755]: E0317 00:43:34.866192 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift podName:80ee6df5-abef-4094-aabc-45b15e1ebfcf nodeName:}" failed. No retries permitted until 2026-03-17 00:43:42.866178686 +0000 UTC m=+1297.625630969 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift") pod "swift-storage-0" (UID: "80ee6df5-abef-4094-aabc-45b15e1ebfcf") : configmap "swift-ring-files" not found Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.867188 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22aa45d9-d0a3-4dad-98a9-293f6c396229-operator-scripts\") pod \"keystone-db-create-9vkmp\" (UID: \"22aa45d9-d0a3-4dad-98a9-293f6c396229\") " pod="openstack/keystone-db-create-9vkmp" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.874916 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-ckpfn"] Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.876505 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ckpfn" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.893583 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ckpfn"] Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.896034 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr6vx\" (UniqueName: \"kubernetes.io/projected/22aa45d9-d0a3-4dad-98a9-293f6c396229-kube-api-access-rr6vx\") pod \"keystone-db-create-9vkmp\" (UID: \"22aa45d9-d0a3-4dad-98a9-293f6c396229\") " pod="openstack/keystone-db-create-9vkmp" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.902832 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-552a-account-create-update-ljkfk"] Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.904910 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-552a-account-create-update-ljkfk" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.907916 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.944482 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-552a-account-create-update-ljkfk"] Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.970740 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4jns\" (UniqueName: \"kubernetes.io/projected/554c2862-dfb9-4910-9d14-3fed242964ed-kube-api-access-b4jns\") pod \"placement-db-create-ckpfn\" (UID: \"554c2862-dfb9-4910-9d14-3fed242964ed\") " pod="openstack/placement-db-create-ckpfn" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.970825 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp4h2\" (UniqueName: \"kubernetes.io/projected/ec87d4bf-d241-4b94-b3f7-0f006e4ceb87-kube-api-access-fp4h2\") pod \"keystone-5f0e-account-create-update-pfs54\" (UID: \"ec87d4bf-d241-4b94-b3f7-0f006e4ceb87\") " pod="openstack/keystone-5f0e-account-create-update-pfs54" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.970883 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/554c2862-dfb9-4910-9d14-3fed242964ed-operator-scripts\") pod \"placement-db-create-ckpfn\" (UID: \"554c2862-dfb9-4910-9d14-3fed242964ed\") " pod="openstack/placement-db-create-ckpfn" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.970916 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9964cd2d-4d04-4954-8ba1-0379d75a932f-operator-scripts\") pod 
\"placement-552a-account-create-update-ljkfk\" (UID: \"9964cd2d-4d04-4954-8ba1-0379d75a932f\") " pod="openstack/placement-552a-account-create-update-ljkfk" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.970948 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec87d4bf-d241-4b94-b3f7-0f006e4ceb87-operator-scripts\") pod \"keystone-5f0e-account-create-update-pfs54\" (UID: \"ec87d4bf-d241-4b94-b3f7-0f006e4ceb87\") " pod="openstack/keystone-5f0e-account-create-update-pfs54" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.971082 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxx27\" (UniqueName: \"kubernetes.io/projected/9964cd2d-4d04-4954-8ba1-0379d75a932f-kube-api-access-jxx27\") pod \"placement-552a-account-create-update-ljkfk\" (UID: \"9964cd2d-4d04-4954-8ba1-0379d75a932f\") " pod="openstack/placement-552a-account-create-update-ljkfk" Mar 17 00:43:34 crc kubenswrapper[4755]: I0317 00:43:34.972720 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec87d4bf-d241-4b94-b3f7-0f006e4ceb87-operator-scripts\") pod \"keystone-5f0e-account-create-update-pfs54\" (UID: \"ec87d4bf-d241-4b94-b3f7-0f006e4ceb87\") " pod="openstack/keystone-5f0e-account-create-update-pfs54" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:34.989759 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp4h2\" (UniqueName: \"kubernetes.io/projected/ec87d4bf-d241-4b94-b3f7-0f006e4ceb87-kube-api-access-fp4h2\") pod \"keystone-5f0e-account-create-update-pfs54\" (UID: \"ec87d4bf-d241-4b94-b3f7-0f006e4ceb87\") " pod="openstack/keystone-5f0e-account-create-update-pfs54" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:34.990396 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9vkmp" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.074414 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxx27\" (UniqueName: \"kubernetes.io/projected/9964cd2d-4d04-4954-8ba1-0379d75a932f-kube-api-access-jxx27\") pod \"placement-552a-account-create-update-ljkfk\" (UID: \"9964cd2d-4d04-4954-8ba1-0379d75a932f\") " pod="openstack/placement-552a-account-create-update-ljkfk" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.074493 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4jns\" (UniqueName: \"kubernetes.io/projected/554c2862-dfb9-4910-9d14-3fed242964ed-kube-api-access-b4jns\") pod \"placement-db-create-ckpfn\" (UID: \"554c2862-dfb9-4910-9d14-3fed242964ed\") " pod="openstack/placement-db-create-ckpfn" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.074549 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/554c2862-dfb9-4910-9d14-3fed242964ed-operator-scripts\") pod \"placement-db-create-ckpfn\" (UID: \"554c2862-dfb9-4910-9d14-3fed242964ed\") " pod="openstack/placement-db-create-ckpfn" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.074572 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9964cd2d-4d04-4954-8ba1-0379d75a932f-operator-scripts\") pod \"placement-552a-account-create-update-ljkfk\" (UID: \"9964cd2d-4d04-4954-8ba1-0379d75a932f\") " pod="openstack/placement-552a-account-create-update-ljkfk" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.075203 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9964cd2d-4d04-4954-8ba1-0379d75a932f-operator-scripts\") pod 
\"placement-552a-account-create-update-ljkfk\" (UID: \"9964cd2d-4d04-4954-8ba1-0379d75a932f\") " pod="openstack/placement-552a-account-create-update-ljkfk" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.075526 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/554c2862-dfb9-4910-9d14-3fed242964ed-operator-scripts\") pod \"placement-db-create-ckpfn\" (UID: \"554c2862-dfb9-4910-9d14-3fed242964ed\") " pod="openstack/placement-db-create-ckpfn" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.095118 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxx27\" (UniqueName: \"kubernetes.io/projected/9964cd2d-4d04-4954-8ba1-0379d75a932f-kube-api-access-jxx27\") pod \"placement-552a-account-create-update-ljkfk\" (UID: \"9964cd2d-4d04-4954-8ba1-0379d75a932f\") " pod="openstack/placement-552a-account-create-update-ljkfk" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.101151 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4jns\" (UniqueName: \"kubernetes.io/projected/554c2862-dfb9-4910-9d14-3fed242964ed-kube-api-access-b4jns\") pod \"placement-db-create-ckpfn\" (UID: \"554c2862-dfb9-4910-9d14-3fed242964ed\") " pod="openstack/placement-db-create-ckpfn" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.185056 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f0e-account-create-update-pfs54" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.240577 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ckpfn" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.251561 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-552a-account-create-update-ljkfk" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.497953 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6625cad-73d0-4753-8a77-4d47344b7fad","Type":"ContainerStarted","Data":"e04dbd8528cf206deadfac630c56d005dfbc5693cdfdbf060355d0b4fff00374"} Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.588102 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-nc5wg"] Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.589489 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-nc5wg" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.615630 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-nc5wg"] Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.691887 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98b1d827-2b15-4213-b59a-39e3ac08b962-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-nc5wg\" (UID: \"98b1d827-2b15-4213-b59a-39e3ac08b962\") " pod="openstack/mysqld-exporter-openstack-db-create-nc5wg" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.691965 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qccr\" (UniqueName: \"kubernetes.io/projected/98b1d827-2b15-4213-b59a-39e3ac08b962-kube-api-access-9qccr\") pod \"mysqld-exporter-openstack-db-create-nc5wg\" (UID: \"98b1d827-2b15-4213-b59a-39e3ac08b962\") " pod="openstack/mysqld-exporter-openstack-db-create-nc5wg" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.765576 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/mysqld-exporter-e09f-account-create-update-ccmws"] Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.766780 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-e09f-account-create-update-ccmws" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.770912 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.775294 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-e09f-account-create-update-ccmws"] Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.794263 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98b1d827-2b15-4213-b59a-39e3ac08b962-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-nc5wg\" (UID: \"98b1d827-2b15-4213-b59a-39e3ac08b962\") " pod="openstack/mysqld-exporter-openstack-db-create-nc5wg" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.794354 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qccr\" (UniqueName: \"kubernetes.io/projected/98b1d827-2b15-4213-b59a-39e3ac08b962-kube-api-access-9qccr\") pod \"mysqld-exporter-openstack-db-create-nc5wg\" (UID: \"98b1d827-2b15-4213-b59a-39e3ac08b962\") " pod="openstack/mysqld-exporter-openstack-db-create-nc5wg" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.795364 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98b1d827-2b15-4213-b59a-39e3ac08b962-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-nc5wg\" (UID: \"98b1d827-2b15-4213-b59a-39e3ac08b962\") " pod="openstack/mysqld-exporter-openstack-db-create-nc5wg" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.814383 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9qccr\" (UniqueName: \"kubernetes.io/projected/98b1d827-2b15-4213-b59a-39e3ac08b962-kube-api-access-9qccr\") pod \"mysqld-exporter-openstack-db-create-nc5wg\" (UID: \"98b1d827-2b15-4213-b59a-39e3ac08b962\") " pod="openstack/mysqld-exporter-openstack-db-create-nc5wg" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.887817 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-374e-account-create-update-kv6bv"] Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.896464 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5efb20b9-cf6f-4c8e-9afc-92d6713630f2-operator-scripts\") pod \"mysqld-exporter-e09f-account-create-update-ccmws\" (UID: \"5efb20b9-cf6f-4c8e-9afc-92d6713630f2\") " pod="openstack/mysqld-exporter-e09f-account-create-update-ccmws" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.896614 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pt2l\" (UniqueName: \"kubernetes.io/projected/5efb20b9-cf6f-4c8e-9afc-92d6713630f2-kube-api-access-2pt2l\") pod \"mysqld-exporter-e09f-account-create-update-ccmws\" (UID: \"5efb20b9-cf6f-4c8e-9afc-92d6713630f2\") " pod="openstack/mysqld-exporter-e09f-account-create-update-ccmws" Mar 17 00:43:35 crc kubenswrapper[4755]: W0317 00:43:35.902263 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d32524_c98c_4d9e_abbf_3231c1b18e44.slice/crio-05c5be39ff1b84ffcf8f60f4edd5d0c63505217813f261a273b332502b1a5b94 WatchSource:0}: Error finding container 05c5be39ff1b84ffcf8f60f4edd5d0c63505217813f261a273b332502b1a5b94: Status 404 returned error can't find the container with id 05c5be39ff1b84ffcf8f60f4edd5d0c63505217813f261a273b332502b1a5b94 Mar 17 00:43:35 crc kubenswrapper[4755]: 
I0317 00:43:35.916116 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-nc5wg" Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.990570 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5f0e-account-create-update-pfs54"] Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.996906 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jmh2r"] Mar 17 00:43:35 crc kubenswrapper[4755]: I0317 00:43:35.999885 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5efb20b9-cf6f-4c8e-9afc-92d6713630f2-operator-scripts\") pod \"mysqld-exporter-e09f-account-create-update-ccmws\" (UID: \"5efb20b9-cf6f-4c8e-9afc-92d6713630f2\") " pod="openstack/mysqld-exporter-e09f-account-create-update-ccmws" Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:35.999996 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pt2l\" (UniqueName: \"kubernetes.io/projected/5efb20b9-cf6f-4c8e-9afc-92d6713630f2-kube-api-access-2pt2l\") pod \"mysqld-exporter-e09f-account-create-update-ccmws\" (UID: \"5efb20b9-cf6f-4c8e-9afc-92d6713630f2\") " pod="openstack/mysqld-exporter-e09f-account-create-update-ccmws" Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.000879 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5efb20b9-cf6f-4c8e-9afc-92d6713630f2-operator-scripts\") pod \"mysqld-exporter-e09f-account-create-update-ccmws\" (UID: \"5efb20b9-cf6f-4c8e-9afc-92d6713630f2\") " pod="openstack/mysqld-exporter-e09f-account-create-update-ccmws" Mar 17 00:43:36 crc kubenswrapper[4755]: W0317 00:43:36.006242 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode769064f_9469_4183_bd71_52ed11230e0e.slice/crio-0195cb1b857609663e30a31054e41706cc42d4aa7e6fe10d4eee96e5df908c14 WatchSource:0}: Error finding container 0195cb1b857609663e30a31054e41706cc42d4aa7e6fe10d4eee96e5df908c14: Status 404 returned error can't find the container with id 0195cb1b857609663e30a31054e41706cc42d4aa7e6fe10d4eee96e5df908c14 Mar 17 00:43:36 crc kubenswrapper[4755]: W0317 00:43:36.010325 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec87d4bf_d241_4b94_b3f7_0f006e4ceb87.slice/crio-ea5b6b4243c3fe9c0452d0cf724c12751cb816ebc5fa18033f4ba3f09f667554 WatchSource:0}: Error finding container ea5b6b4243c3fe9c0452d0cf724c12751cb816ebc5fa18033f4ba3f09f667554: Status 404 returned error can't find the container with id ea5b6b4243c3fe9c0452d0cf724c12751cb816ebc5fa18033f4ba3f09f667554 Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.015857 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pt2l\" (UniqueName: \"kubernetes.io/projected/5efb20b9-cf6f-4c8e-9afc-92d6713630f2-kube-api-access-2pt2l\") pod \"mysqld-exporter-e09f-account-create-update-ccmws\" (UID: \"5efb20b9-cf6f-4c8e-9afc-92d6713630f2\") " pod="openstack/mysqld-exporter-e09f-account-create-update-ccmws" Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.091008 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-e09f-account-create-update-ccmws" Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.203544 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ckpfn"] Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.216627 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9vkmp"] Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.239159 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-552a-account-create-update-ljkfk"] Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.422678 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.489811 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-26pzj"] Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.490152 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" podUID="6d55848a-c15d-4ed3-899b-bfcbb45f13ff" containerName="dnsmasq-dns" containerID="cri-o://bba9337a0a86997f5b8df36258f89b93c8084d68107a2458fb0b8a3c78015f50" gracePeriod=10 Mar 17 00:43:36 crc kubenswrapper[4755]: E0317 00:43:36.532069 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d32524_c98c_4d9e_abbf_3231c1b18e44.slice/crio-210c2f9b17f445ac8e69afc50d3b3edcc23f2f340b5905a0693cc2402895429d.scope\": RecentStats: unable to find data in memory cache]" Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.535137 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jmh2r" 
event={"ID":"e769064f-9469-4183-bd71-52ed11230e0e","Type":"ContainerStarted","Data":"0195cb1b857609663e30a31054e41706cc42d4aa7e6fe10d4eee96e5df908c14"} Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.536018 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-nc5wg"] Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.539611 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-552a-account-create-update-ljkfk" event={"ID":"9964cd2d-4d04-4954-8ba1-0379d75a932f","Type":"ContainerStarted","Data":"a0844063a17f686aa2739906ff3dc1c17c0d87db29bebb2b9e9da938d25c13bc"} Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.544932 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9vkmp" event={"ID":"22aa45d9-d0a3-4dad-98a9-293f6c396229","Type":"ContainerStarted","Data":"e5848d4457e24f9804dadad33c6478660c26effe2422d325b2b34bb4b0526cd2"} Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.546224 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f0e-account-create-update-pfs54" event={"ID":"ec87d4bf-d241-4b94-b3f7-0f006e4ceb87","Type":"ContainerStarted","Data":"ea5b6b4243c3fe9c0452d0cf724c12751cb816ebc5fa18033f4ba3f09f667554"} Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.547313 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ckpfn" event={"ID":"554c2862-dfb9-4910-9d14-3fed242964ed","Type":"ContainerStarted","Data":"1cf788ca00c8f12db6fd65d9eccab257ff2f604e8ac4679fc62fa37f019f3830"} Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.548855 4755 generic.go:334] "Generic (PLEG): container finished" podID="72d32524-c98c-4d9e-abbf-3231c1b18e44" containerID="210c2f9b17f445ac8e69afc50d3b3edcc23f2f340b5905a0693cc2402895429d" exitCode=0 Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.548892 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-374e-account-create-update-kv6bv" event={"ID":"72d32524-c98c-4d9e-abbf-3231c1b18e44","Type":"ContainerDied","Data":"210c2f9b17f445ac8e69afc50d3b3edcc23f2f340b5905a0693cc2402895429d"} Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.548913 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-374e-account-create-update-kv6bv" event={"ID":"72d32524-c98c-4d9e-abbf-3231c1b18e44","Type":"ContainerStarted","Data":"05c5be39ff1b84ffcf8f60f4edd5d0c63505217813f261a273b332502b1a5b94"} Mar 17 00:43:36 crc kubenswrapper[4755]: I0317 00:43:36.631179 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-e09f-account-create-update-ccmws"] Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.559400 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9vkmp" event={"ID":"22aa45d9-d0a3-4dad-98a9-293f6c396229","Type":"ContainerStarted","Data":"744e7e2542f4159e1f0372413e155f63f4c426dd6d8f012bf4e573d0d8b972a9"} Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.562541 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e09f-account-create-update-ccmws" event={"ID":"5efb20b9-cf6f-4c8e-9afc-92d6713630f2","Type":"ContainerStarted","Data":"6d3217a02a5d2a4ead9e08421b1b49e9feb13d9225ebbfd5ac688eb67192e2b3"} Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.564175 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f0e-account-create-update-pfs54" event={"ID":"ec87d4bf-d241-4b94-b3f7-0f006e4ceb87","Type":"ContainerStarted","Data":"4c519ab6fc149df41954fce5d2a54b0b893ca88bb38c491148dc4eb0f6d8b0e6"} Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.567831 4755 generic.go:334] "Generic (PLEG): container finished" podID="98b1d827-2b15-4213-b59a-39e3ac08b962" containerID="3a30bad658ae849d03ce5cd0f2be34ef0661e0b1e2c81fff2437ef2eaa585e4b" exitCode=0 Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 
00:43:37.567901 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-nc5wg" event={"ID":"98b1d827-2b15-4213-b59a-39e3ac08b962","Type":"ContainerDied","Data":"3a30bad658ae849d03ce5cd0f2be34ef0661e0b1e2c81fff2437ef2eaa585e4b"} Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.567925 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-nc5wg" event={"ID":"98b1d827-2b15-4213-b59a-39e3ac08b962","Type":"ContainerStarted","Data":"801ca11c1b5d6e76abe9f47496719a5f691d6d0d169a9aab9fe9a14bd048a4ce"} Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.570075 4755 generic.go:334] "Generic (PLEG): container finished" podID="554c2862-dfb9-4910-9d14-3fed242964ed" containerID="c6565263fbbcb465c616c5d6ae268234c0d18461a554fa7f0c3360d656b9b080" exitCode=0 Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.570199 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ckpfn" event={"ID":"554c2862-dfb9-4910-9d14-3fed242964ed","Type":"ContainerDied","Data":"c6565263fbbcb465c616c5d6ae268234c0d18461a554fa7f0c3360d656b9b080"} Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.574265 4755 generic.go:334] "Generic (PLEG): container finished" podID="6d55848a-c15d-4ed3-899b-bfcbb45f13ff" containerID="bba9337a0a86997f5b8df36258f89b93c8084d68107a2458fb0b8a3c78015f50" exitCode=0 Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.574320 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" event={"ID":"6d55848a-c15d-4ed3-899b-bfcbb45f13ff","Type":"ContainerDied","Data":"bba9337a0a86997f5b8df36258f89b93c8084d68107a2458fb0b8a3c78015f50"} Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.574343 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-26pzj" 
event={"ID":"6d55848a-c15d-4ed3-899b-bfcbb45f13ff","Type":"ContainerDied","Data":"3cf638d904151450d51699fcc8934b3283b85a366e4f3c38efbe283204a9e098"} Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.574356 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cf638d904151450d51699fcc8934b3283b85a366e4f3c38efbe283204a9e098" Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.577521 4755 generic.go:334] "Generic (PLEG): container finished" podID="e769064f-9469-4183-bd71-52ed11230e0e" containerID="dbc2372936211dd27bc5d87c1ad5c8d83bd63bca9fb57ba2f17291d9b27b04f5" exitCode=0 Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.577584 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jmh2r" event={"ID":"e769064f-9469-4183-bd71-52ed11230e0e","Type":"ContainerDied","Data":"dbc2372936211dd27bc5d87c1ad5c8d83bd63bca9fb57ba2f17291d9b27b04f5"} Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.579526 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-26pzj"
Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.587609 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-9vkmp" podStartSLOduration=3.587587584 podStartE2EDuration="3.587587584s" podCreationTimestamp="2026-03-17 00:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:43:37.572965949 +0000 UTC m=+1292.332418242" watchObservedRunningTime="2026-03-17 00:43:37.587587584 +0000 UTC m=+1292.347039867"
Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.631552 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-config\") pod \"6d55848a-c15d-4ed3-899b-bfcbb45f13ff\" (UID: \"6d55848a-c15d-4ed3-899b-bfcbb45f13ff\") "
Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.631648 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxnz7\" (UniqueName: \"kubernetes.io/projected/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-kube-api-access-hxnz7\") pod \"6d55848a-c15d-4ed3-899b-bfcbb45f13ff\" (UID: \"6d55848a-c15d-4ed3-899b-bfcbb45f13ff\") "
Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.631698 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-dns-svc\") pod \"6d55848a-c15d-4ed3-899b-bfcbb45f13ff\" (UID: \"6d55848a-c15d-4ed3-899b-bfcbb45f13ff\") "
Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.641422 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-kube-api-access-hxnz7" (OuterVolumeSpecName: "kube-api-access-hxnz7") pod "6d55848a-c15d-4ed3-899b-bfcbb45f13ff" (UID: "6d55848a-c15d-4ed3-899b-bfcbb45f13ff"). InnerVolumeSpecName "kube-api-access-hxnz7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.653461 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5f0e-account-create-update-pfs54" podStartSLOduration=3.6525410799999998 podStartE2EDuration="3.65254108s" podCreationTimestamp="2026-03-17 00:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:43:37.636394916 +0000 UTC m=+1292.395847219" watchObservedRunningTime="2026-03-17 00:43:37.65254108 +0000 UTC m=+1292.411993373"
Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.697705 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6d55848a-c15d-4ed3-899b-bfcbb45f13ff" (UID: "6d55848a-c15d-4ed3-899b-bfcbb45f13ff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.734684 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxnz7\" (UniqueName: \"kubernetes.io/projected/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-kube-api-access-hxnz7\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.734713 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.737893 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-config" (OuterVolumeSpecName: "config") pod "6d55848a-c15d-4ed3-899b-bfcbb45f13ff" (UID: "6d55848a-c15d-4ed3-899b-bfcbb45f13ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.836341 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d55848a-c15d-4ed3-899b-bfcbb45f13ff-config\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:37 crc kubenswrapper[4755]: I0317 00:43:37.964911 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-374e-account-create-update-kv6bv"
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.039781 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9rdz\" (UniqueName: \"kubernetes.io/projected/72d32524-c98c-4d9e-abbf-3231c1b18e44-kube-api-access-x9rdz\") pod \"72d32524-c98c-4d9e-abbf-3231c1b18e44\" (UID: \"72d32524-c98c-4d9e-abbf-3231c1b18e44\") "
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.039940 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d32524-c98c-4d9e-abbf-3231c1b18e44-operator-scripts\") pod \"72d32524-c98c-4d9e-abbf-3231c1b18e44\" (UID: \"72d32524-c98c-4d9e-abbf-3231c1b18e44\") "
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.040727 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72d32524-c98c-4d9e-abbf-3231c1b18e44-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72d32524-c98c-4d9e-abbf-3231c1b18e44" (UID: "72d32524-c98c-4d9e-abbf-3231c1b18e44"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.044132 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72d32524-c98c-4d9e-abbf-3231c1b18e44-kube-api-access-x9rdz" (OuterVolumeSpecName: "kube-api-access-x9rdz") pod "72d32524-c98c-4d9e-abbf-3231c1b18e44" (UID: "72d32524-c98c-4d9e-abbf-3231c1b18e44"). InnerVolumeSpecName "kube-api-access-x9rdz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.142257 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d32524-c98c-4d9e-abbf-3231c1b18e44-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.142287 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9rdz\" (UniqueName: \"kubernetes.io/projected/72d32524-c98c-4d9e-abbf-3231c1b18e44-kube-api-access-x9rdz\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.595955 4755 generic.go:334] "Generic (PLEG): container finished" podID="22aa45d9-d0a3-4dad-98a9-293f6c396229" containerID="744e7e2542f4159e1f0372413e155f63f4c426dd6d8f012bf4e573d0d8b972a9" exitCode=0
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.596057 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9vkmp" event={"ID":"22aa45d9-d0a3-4dad-98a9-293f6c396229","Type":"ContainerDied","Data":"744e7e2542f4159e1f0372413e155f63f4c426dd6d8f012bf4e573d0d8b972a9"}
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.598477 4755 generic.go:334] "Generic (PLEG): container finished" podID="5efb20b9-cf6f-4c8e-9afc-92d6713630f2" containerID="ca6211fd25c7f6916638a97f101388596e4398c0641395f83d31cede23df832b" exitCode=0
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.598608 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e09f-account-create-update-ccmws" event={"ID":"5efb20b9-cf6f-4c8e-9afc-92d6713630f2","Type":"ContainerDied","Data":"ca6211fd25c7f6916638a97f101388596e4398c0641395f83d31cede23df832b"}
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.600743 4755 generic.go:334] "Generic (PLEG): container finished" podID="ec87d4bf-d241-4b94-b3f7-0f006e4ceb87" containerID="4c519ab6fc149df41954fce5d2a54b0b893ca88bb38c491148dc4eb0f6d8b0e6" exitCode=0
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.600822 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f0e-account-create-update-pfs54" event={"ID":"ec87d4bf-d241-4b94-b3f7-0f006e4ceb87","Type":"ContainerDied","Data":"4c519ab6fc149df41954fce5d2a54b0b893ca88bb38c491148dc4eb0f6d8b0e6"}
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.608080 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6625cad-73d0-4753-8a77-4d47344b7fad","Type":"ContainerStarted","Data":"3a78a7c9f57b03d419c130d5896347893570c516e901c8ad01defb99516ef10d"}
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.610147 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-374e-account-create-update-kv6bv" event={"ID":"72d32524-c98c-4d9e-abbf-3231c1b18e44","Type":"ContainerDied","Data":"05c5be39ff1b84ffcf8f60f4edd5d0c63505217813f261a273b332502b1a5b94"}
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.610220 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05c5be39ff1b84ffcf8f60f4edd5d0c63505217813f261a273b332502b1a5b94"
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.610346 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-374e-account-create-update-kv6bv"
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.614276 4755 generic.go:334] "Generic (PLEG): container finished" podID="9964cd2d-4d04-4954-8ba1-0379d75a932f" containerID="447451244cf67b8df101bd402f87b44e3b32be6df9e8fb35f9cf4361b50e7aec" exitCode=0
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.614357 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-552a-account-create-update-ljkfk" event={"ID":"9964cd2d-4d04-4954-8ba1-0379d75a932f","Type":"ContainerDied","Data":"447451244cf67b8df101bd402f87b44e3b32be6df9e8fb35f9cf4361b50e7aec"}
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.614567 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-26pzj"
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.721508 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-26pzj"]
Mar 17 00:43:38 crc kubenswrapper[4755]: I0317 00:43:38.728628 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-26pzj"]
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.112132 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jmh2r"
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.163425 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djgff\" (UniqueName: \"kubernetes.io/projected/e769064f-9469-4183-bd71-52ed11230e0e-kube-api-access-djgff\") pod \"e769064f-9469-4183-bd71-52ed11230e0e\" (UID: \"e769064f-9469-4183-bd71-52ed11230e0e\") "
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.163543 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e769064f-9469-4183-bd71-52ed11230e0e-operator-scripts\") pod \"e769064f-9469-4183-bd71-52ed11230e0e\" (UID: \"e769064f-9469-4183-bd71-52ed11230e0e\") "
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.164541 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e769064f-9469-4183-bd71-52ed11230e0e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e769064f-9469-4183-bd71-52ed11230e0e" (UID: "e769064f-9469-4183-bd71-52ed11230e0e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.167274 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e769064f-9469-4183-bd71-52ed11230e0e-kube-api-access-djgff" (OuterVolumeSpecName: "kube-api-access-djgff") pod "e769064f-9469-4183-bd71-52ed11230e0e" (UID: "e769064f-9469-4183-bd71-52ed11230e0e"). InnerVolumeSpecName "kube-api-access-djgff". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.230954 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-nc5wg"
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.234553 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ckpfn"
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.300987 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4jns\" (UniqueName: \"kubernetes.io/projected/554c2862-dfb9-4910-9d14-3fed242964ed-kube-api-access-b4jns\") pod \"554c2862-dfb9-4910-9d14-3fed242964ed\" (UID: \"554c2862-dfb9-4910-9d14-3fed242964ed\") "
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.301033 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qccr\" (UniqueName: \"kubernetes.io/projected/98b1d827-2b15-4213-b59a-39e3ac08b962-kube-api-access-9qccr\") pod \"98b1d827-2b15-4213-b59a-39e3ac08b962\" (UID: \"98b1d827-2b15-4213-b59a-39e3ac08b962\") "
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.301063 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98b1d827-2b15-4213-b59a-39e3ac08b962-operator-scripts\") pod \"98b1d827-2b15-4213-b59a-39e3ac08b962\" (UID: \"98b1d827-2b15-4213-b59a-39e3ac08b962\") "
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.301232 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/554c2862-dfb9-4910-9d14-3fed242964ed-operator-scripts\") pod \"554c2862-dfb9-4910-9d14-3fed242964ed\" (UID: \"554c2862-dfb9-4910-9d14-3fed242964ed\") "
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.301830 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98b1d827-2b15-4213-b59a-39e3ac08b962-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "98b1d827-2b15-4213-b59a-39e3ac08b962" (UID: "98b1d827-2b15-4213-b59a-39e3ac08b962"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.302148 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/554c2862-dfb9-4910-9d14-3fed242964ed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "554c2862-dfb9-4910-9d14-3fed242964ed" (UID: "554c2862-dfb9-4910-9d14-3fed242964ed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.302654 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98b1d827-2b15-4213-b59a-39e3ac08b962-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.302672 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djgff\" (UniqueName: \"kubernetes.io/projected/e769064f-9469-4183-bd71-52ed11230e0e-kube-api-access-djgff\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.302683 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e769064f-9469-4183-bd71-52ed11230e0e-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.302691 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/554c2862-dfb9-4910-9d14-3fed242964ed-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.306602 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98b1d827-2b15-4213-b59a-39e3ac08b962-kube-api-access-9qccr" (OuterVolumeSpecName: "kube-api-access-9qccr") pod "98b1d827-2b15-4213-b59a-39e3ac08b962" (UID: "98b1d827-2b15-4213-b59a-39e3ac08b962"). InnerVolumeSpecName "kube-api-access-9qccr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.312945 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/554c2862-dfb9-4910-9d14-3fed242964ed-kube-api-access-b4jns" (OuterVolumeSpecName: "kube-api-access-b4jns") pod "554c2862-dfb9-4910-9d14-3fed242964ed" (UID: "554c2862-dfb9-4910-9d14-3fed242964ed"). InnerVolumeSpecName "kube-api-access-b4jns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.404279 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4jns\" (UniqueName: \"kubernetes.io/projected/554c2862-dfb9-4910-9d14-3fed242964ed-kube-api-access-b4jns\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.404314 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qccr\" (UniqueName: \"kubernetes.io/projected/98b1d827-2b15-4213-b59a-39e3ac08b962-kube-api-access-9qccr\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.624939 4755 generic.go:334] "Generic (PLEG): container finished" podID="9b280073-a793-4c35-a29b-d56ccf6037a7" containerID="354132efbae9dd8e2ca6a3cb935de6c0ec4453c2e3bc490a380dd53628b41540" exitCode=0
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.624999 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mh59s" event={"ID":"9b280073-a793-4c35-a29b-d56ccf6037a7","Type":"ContainerDied","Data":"354132efbae9dd8e2ca6a3cb935de6c0ec4453c2e3bc490a380dd53628b41540"}
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.626775 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-nc5wg"
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.626789 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-nc5wg" event={"ID":"98b1d827-2b15-4213-b59a-39e3ac08b962","Type":"ContainerDied","Data":"801ca11c1b5d6e76abe9f47496719a5f691d6d0d169a9aab9fe9a14bd048a4ce"}
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.626864 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="801ca11c1b5d6e76abe9f47496719a5f691d6d0d169a9aab9fe9a14bd048a4ce"
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.630189 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ckpfn"
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.630243 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ckpfn" event={"ID":"554c2862-dfb9-4910-9d14-3fed242964ed","Type":"ContainerDied","Data":"1cf788ca00c8f12db6fd65d9eccab257ff2f604e8ac4679fc62fa37f019f3830"}
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.630291 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cf788ca00c8f12db6fd65d9eccab257ff2f604e8ac4679fc62fa37f019f3830"
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.631637 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jmh2r"
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.631704 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jmh2r" event={"ID":"e769064f-9469-4183-bd71-52ed11230e0e","Type":"ContainerDied","Data":"0195cb1b857609663e30a31054e41706cc42d4aa7e6fe10d4eee96e5df908c14"}
Mar 17 00:43:39 crc kubenswrapper[4755]: I0317 00:43:39.631749 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0195cb1b857609663e30a31054e41706cc42d4aa7e6fe10d4eee96e5df908c14"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.088052 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f0e-account-create-update-pfs54"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.220553 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp4h2\" (UniqueName: \"kubernetes.io/projected/ec87d4bf-d241-4b94-b3f7-0f006e4ceb87-kube-api-access-fp4h2\") pod \"ec87d4bf-d241-4b94-b3f7-0f006e4ceb87\" (UID: \"ec87d4bf-d241-4b94-b3f7-0f006e4ceb87\") "
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.220757 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec87d4bf-d241-4b94-b3f7-0f006e4ceb87-operator-scripts\") pod \"ec87d4bf-d241-4b94-b3f7-0f006e4ceb87\" (UID: \"ec87d4bf-d241-4b94-b3f7-0f006e4ceb87\") "
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.221761 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec87d4bf-d241-4b94-b3f7-0f006e4ceb87-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec87d4bf-d241-4b94-b3f7-0f006e4ceb87" (UID: "ec87d4bf-d241-4b94-b3f7-0f006e4ceb87"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.226048 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec87d4bf-d241-4b94-b3f7-0f006e4ceb87-kube-api-access-fp4h2" (OuterVolumeSpecName: "kube-api-access-fp4h2") pod "ec87d4bf-d241-4b94-b3f7-0f006e4ceb87" (UID: "ec87d4bf-d241-4b94-b3f7-0f006e4ceb87"). InnerVolumeSpecName "kube-api-access-fp4h2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.272340 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d55848a-c15d-4ed3-899b-bfcbb45f13ff" path="/var/lib/kubelet/pods/6d55848a-c15d-4ed3-899b-bfcbb45f13ff/volumes"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.301637 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9vkmp"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.310161 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-552a-account-create-update-ljkfk"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.322809 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec87d4bf-d241-4b94-b3f7-0f006e4ceb87-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.322839 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp4h2\" (UniqueName: \"kubernetes.io/projected/ec87d4bf-d241-4b94-b3f7-0f006e4ceb87-kube-api-access-fp4h2\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.383456 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-e09f-account-create-update-ccmws"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.423475 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22aa45d9-d0a3-4dad-98a9-293f6c396229-operator-scripts\") pod \"22aa45d9-d0a3-4dad-98a9-293f6c396229\" (UID: \"22aa45d9-d0a3-4dad-98a9-293f6c396229\") "
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.423522 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr6vx\" (UniqueName: \"kubernetes.io/projected/22aa45d9-d0a3-4dad-98a9-293f6c396229-kube-api-access-rr6vx\") pod \"22aa45d9-d0a3-4dad-98a9-293f6c396229\" (UID: \"22aa45d9-d0a3-4dad-98a9-293f6c396229\") "
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.423677 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9964cd2d-4d04-4954-8ba1-0379d75a932f-operator-scripts\") pod \"9964cd2d-4d04-4954-8ba1-0379d75a932f\" (UID: \"9964cd2d-4d04-4954-8ba1-0379d75a932f\") "
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.423701 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxx27\" (UniqueName: \"kubernetes.io/projected/9964cd2d-4d04-4954-8ba1-0379d75a932f-kube-api-access-jxx27\") pod \"9964cd2d-4d04-4954-8ba1-0379d75a932f\" (UID: \"9964cd2d-4d04-4954-8ba1-0379d75a932f\") "
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.424051 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22aa45d9-d0a3-4dad-98a9-293f6c396229-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "22aa45d9-d0a3-4dad-98a9-293f6c396229" (UID: "22aa45d9-d0a3-4dad-98a9-293f6c396229"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.424654 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9964cd2d-4d04-4954-8ba1-0379d75a932f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9964cd2d-4d04-4954-8ba1-0379d75a932f" (UID: "9964cd2d-4d04-4954-8ba1-0379d75a932f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.427156 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9964cd2d-4d04-4954-8ba1-0379d75a932f-kube-api-access-jxx27" (OuterVolumeSpecName: "kube-api-access-jxx27") pod "9964cd2d-4d04-4954-8ba1-0379d75a932f" (UID: "9964cd2d-4d04-4954-8ba1-0379d75a932f"). InnerVolumeSpecName "kube-api-access-jxx27". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.427795 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22aa45d9-d0a3-4dad-98a9-293f6c396229-kube-api-access-rr6vx" (OuterVolumeSpecName: "kube-api-access-rr6vx") pod "22aa45d9-d0a3-4dad-98a9-293f6c396229" (UID: "22aa45d9-d0a3-4dad-98a9-293f6c396229"). InnerVolumeSpecName "kube-api-access-rr6vx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.488292 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-wn74p"]
Mar 17 00:43:40 crc kubenswrapper[4755]: E0317 00:43:40.494097 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d55848a-c15d-4ed3-899b-bfcbb45f13ff" containerName="init"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494141 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d55848a-c15d-4ed3-899b-bfcbb45f13ff" containerName="init"
Mar 17 00:43:40 crc kubenswrapper[4755]: E0317 00:43:40.494158 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec87d4bf-d241-4b94-b3f7-0f006e4ceb87" containerName="mariadb-account-create-update"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494168 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec87d4bf-d241-4b94-b3f7-0f006e4ceb87" containerName="mariadb-account-create-update"
Mar 17 00:43:40 crc kubenswrapper[4755]: E0317 00:43:40.494186 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="554c2862-dfb9-4910-9d14-3fed242964ed" containerName="mariadb-database-create"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494194 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="554c2862-dfb9-4910-9d14-3fed242964ed" containerName="mariadb-database-create"
Mar 17 00:43:40 crc kubenswrapper[4755]: E0317 00:43:40.494211 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9964cd2d-4d04-4954-8ba1-0379d75a932f" containerName="mariadb-account-create-update"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494220 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9964cd2d-4d04-4954-8ba1-0379d75a932f" containerName="mariadb-account-create-update"
Mar 17 00:43:40 crc kubenswrapper[4755]: E0317 00:43:40.494245 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72d32524-c98c-4d9e-abbf-3231c1b18e44" containerName="mariadb-account-create-update"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494252 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="72d32524-c98c-4d9e-abbf-3231c1b18e44" containerName="mariadb-account-create-update"
Mar 17 00:43:40 crc kubenswrapper[4755]: E0317 00:43:40.494266 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98b1d827-2b15-4213-b59a-39e3ac08b962" containerName="mariadb-database-create"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494274 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="98b1d827-2b15-4213-b59a-39e3ac08b962" containerName="mariadb-database-create"
Mar 17 00:43:40 crc kubenswrapper[4755]: E0317 00:43:40.494285 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5efb20b9-cf6f-4c8e-9afc-92d6713630f2" containerName="mariadb-account-create-update"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494293 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5efb20b9-cf6f-4c8e-9afc-92d6713630f2" containerName="mariadb-account-create-update"
Mar 17 00:43:40 crc kubenswrapper[4755]: E0317 00:43:40.494305 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e769064f-9469-4183-bd71-52ed11230e0e" containerName="mariadb-database-create"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494312 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e769064f-9469-4183-bd71-52ed11230e0e" containerName="mariadb-database-create"
Mar 17 00:43:40 crc kubenswrapper[4755]: E0317 00:43:40.494328 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d55848a-c15d-4ed3-899b-bfcbb45f13ff" containerName="dnsmasq-dns"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494336 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d55848a-c15d-4ed3-899b-bfcbb45f13ff" containerName="dnsmasq-dns"
Mar 17 00:43:40 crc kubenswrapper[4755]: E0317 00:43:40.494347 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22aa45d9-d0a3-4dad-98a9-293f6c396229" containerName="mariadb-database-create"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494355 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="22aa45d9-d0a3-4dad-98a9-293f6c396229" containerName="mariadb-database-create"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494588 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d55848a-c15d-4ed3-899b-bfcbb45f13ff" containerName="dnsmasq-dns"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494627 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="72d32524-c98c-4d9e-abbf-3231c1b18e44" containerName="mariadb-account-create-update"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494652 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9964cd2d-4d04-4954-8ba1-0379d75a932f" containerName="mariadb-account-create-update"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494673 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="22aa45d9-d0a3-4dad-98a9-293f6c396229" containerName="mariadb-database-create"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494685 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="98b1d827-2b15-4213-b59a-39e3ac08b962" containerName="mariadb-database-create"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494699 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5efb20b9-cf6f-4c8e-9afc-92d6713630f2" containerName="mariadb-account-create-update"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494710 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec87d4bf-d241-4b94-b3f7-0f006e4ceb87" containerName="mariadb-account-create-update"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494724 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e769064f-9469-4183-bd71-52ed11230e0e" containerName="mariadb-database-create"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.494755 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="554c2862-dfb9-4910-9d14-3fed242964ed" containerName="mariadb-database-create"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.495417 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wn74p"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.498739 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.500658 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wn74p"]
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.525459 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5efb20b9-cf6f-4c8e-9afc-92d6713630f2-operator-scripts\") pod \"5efb20b9-cf6f-4c8e-9afc-92d6713630f2\" (UID: \"5efb20b9-cf6f-4c8e-9afc-92d6713630f2\") "
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.525661 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pt2l\" (UniqueName: \"kubernetes.io/projected/5efb20b9-cf6f-4c8e-9afc-92d6713630f2-kube-api-access-2pt2l\") pod \"5efb20b9-cf6f-4c8e-9afc-92d6713630f2\" (UID: \"5efb20b9-cf6f-4c8e-9afc-92d6713630f2\") "
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.526061 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22aa45d9-d0a3-4dad-98a9-293f6c396229-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.526076 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr6vx\" (UniqueName: \"kubernetes.io/projected/22aa45d9-d0a3-4dad-98a9-293f6c396229-kube-api-access-rr6vx\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.526087 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9964cd2d-4d04-4954-8ba1-0379d75a932f-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.526095 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxx27\" (UniqueName: \"kubernetes.io/projected/9964cd2d-4d04-4954-8ba1-0379d75a932f-kube-api-access-jxx27\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.526149 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5efb20b9-cf6f-4c8e-9afc-92d6713630f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5efb20b9-cf6f-4c8e-9afc-92d6713630f2" (UID: "5efb20b9-cf6f-4c8e-9afc-92d6713630f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.528451 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5efb20b9-cf6f-4c8e-9afc-92d6713630f2-kube-api-access-2pt2l" (OuterVolumeSpecName: "kube-api-access-2pt2l") pod "5efb20b9-cf6f-4c8e-9afc-92d6713630f2" (UID: "5efb20b9-cf6f-4c8e-9afc-92d6713630f2"). InnerVolumeSpecName "kube-api-access-2pt2l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.628390 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48c7j\" (UniqueName: \"kubernetes.io/projected/b5a10493-e311-41ce-9e88-67d04f8d0b22-kube-api-access-48c7j\") pod \"root-account-create-update-wn74p\" (UID: \"b5a10493-e311-41ce-9e88-67d04f8d0b22\") " pod="openstack/root-account-create-update-wn74p"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.628517 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5a10493-e311-41ce-9e88-67d04f8d0b22-operator-scripts\") pod \"root-account-create-update-wn74p\" (UID: \"b5a10493-e311-41ce-9e88-67d04f8d0b22\") " pod="openstack/root-account-create-update-wn74p"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.628563 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pt2l\" (UniqueName: \"kubernetes.io/projected/5efb20b9-cf6f-4c8e-9afc-92d6713630f2-kube-api-access-2pt2l\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.628577 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5efb20b9-cf6f-4c8e-9afc-92d6713630f2-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.646464 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f0e-account-create-update-pfs54"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.646518 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f0e-account-create-update-pfs54" event={"ID":"ec87d4bf-d241-4b94-b3f7-0f006e4ceb87","Type":"ContainerDied","Data":"ea5b6b4243c3fe9c0452d0cf724c12751cb816ebc5fa18033f4ba3f09f667554"}
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.646594 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea5b6b4243c3fe9c0452d0cf724c12751cb816ebc5fa18033f4ba3f09f667554"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.654504 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-552a-account-create-update-ljkfk" event={"ID":"9964cd2d-4d04-4954-8ba1-0379d75a932f","Type":"ContainerDied","Data":"a0844063a17f686aa2739906ff3dc1c17c0d87db29bebb2b9e9da938d25c13bc"}
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.654554 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0844063a17f686aa2739906ff3dc1c17c0d87db29bebb2b9e9da938d25c13bc"
Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.654630 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-552a-account-create-update-ljkfk" Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.663918 4755 generic.go:334] "Generic (PLEG): container finished" podID="1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" containerID="e8f25e9b1decd539f90990899b33568e0a1f74100a38ffc2207ef6a48f57530f" exitCode=0 Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.664015 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a","Type":"ContainerDied","Data":"e8f25e9b1decd539f90990899b33568e0a1f74100a38ffc2207ef6a48f57530f"} Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.673156 4755 generic.go:334] "Generic (PLEG): container finished" podID="890b1d99-1a82-424e-981b-5c8ea1ae26ee" containerID="fb6d6e35774e45425e28ec26640d32c6a009930eb0767171cf6f323aec4e4750" exitCode=0 Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.673226 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"890b1d99-1a82-424e-981b-5c8ea1ae26ee","Type":"ContainerDied","Data":"fb6d6e35774e45425e28ec26640d32c6a009930eb0767171cf6f323aec4e4750"} Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.676296 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9vkmp" event={"ID":"22aa45d9-d0a3-4dad-98a9-293f6c396229","Type":"ContainerDied","Data":"e5848d4457e24f9804dadad33c6478660c26effe2422d325b2b34bb4b0526cd2"} Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.676333 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5848d4457e24f9804dadad33c6478660c26effe2422d325b2b34bb4b0526cd2" Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.676405 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9vkmp" Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.682566 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-e09f-account-create-update-ccmws" Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.682676 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e09f-account-create-update-ccmws" event={"ID":"5efb20b9-cf6f-4c8e-9afc-92d6713630f2","Type":"ContainerDied","Data":"6d3217a02a5d2a4ead9e08421b1b49e9feb13d9225ebbfd5ac688eb67192e2b3"} Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.683880 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d3217a02a5d2a4ead9e08421b1b49e9feb13d9225ebbfd5ac688eb67192e2b3" Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.731107 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48c7j\" (UniqueName: \"kubernetes.io/projected/b5a10493-e311-41ce-9e88-67d04f8d0b22-kube-api-access-48c7j\") pod \"root-account-create-update-wn74p\" (UID: \"b5a10493-e311-41ce-9e88-67d04f8d0b22\") " pod="openstack/root-account-create-update-wn74p" Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.731294 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5a10493-e311-41ce-9e88-67d04f8d0b22-operator-scripts\") pod \"root-account-create-update-wn74p\" (UID: \"b5a10493-e311-41ce-9e88-67d04f8d0b22\") " pod="openstack/root-account-create-update-wn74p" Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.732392 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5a10493-e311-41ce-9e88-67d04f8d0b22-operator-scripts\") pod \"root-account-create-update-wn74p\" (UID: \"b5a10493-e311-41ce-9e88-67d04f8d0b22\") " 
pod="openstack/root-account-create-update-wn74p" Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.746936 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48c7j\" (UniqueName: \"kubernetes.io/projected/b5a10493-e311-41ce-9e88-67d04f8d0b22-kube-api-access-48c7j\") pod \"root-account-create-update-wn74p\" (UID: \"b5a10493-e311-41ce-9e88-67d04f8d0b22\") " pod="openstack/root-account-create-update-wn74p" Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.822430 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wn74p" Mar 17 00:43:40 crc kubenswrapper[4755]: I0317 00:43:40.970325 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.035605 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vh58\" (UniqueName: \"kubernetes.io/projected/9b280073-a793-4c35-a29b-d56ccf6037a7-kube-api-access-9vh58\") pod \"9b280073-a793-4c35-a29b-d56ccf6037a7\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.035746 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-dispersionconf\") pod \"9b280073-a793-4c35-a29b-d56ccf6037a7\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.035792 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-swiftconf\") pod \"9b280073-a793-4c35-a29b-d56ccf6037a7\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.035855 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b280073-a793-4c35-a29b-d56ccf6037a7-scripts\") pod \"9b280073-a793-4c35-a29b-d56ccf6037a7\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.035935 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9b280073-a793-4c35-a29b-d56ccf6037a7-etc-swift\") pod \"9b280073-a793-4c35-a29b-d56ccf6037a7\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.035962 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9b280073-a793-4c35-a29b-d56ccf6037a7-ring-data-devices\") pod \"9b280073-a793-4c35-a29b-d56ccf6037a7\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.035989 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-combined-ca-bundle\") pod \"9b280073-a793-4c35-a29b-d56ccf6037a7\" (UID: \"9b280073-a793-4c35-a29b-d56ccf6037a7\") " Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.036679 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b280073-a793-4c35-a29b-d56ccf6037a7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9b280073-a793-4c35-a29b-d56ccf6037a7" (UID: "9b280073-a793-4c35-a29b-d56ccf6037a7"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.036916 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b280073-a793-4c35-a29b-d56ccf6037a7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9b280073-a793-4c35-a29b-d56ccf6037a7" (UID: "9b280073-a793-4c35-a29b-d56ccf6037a7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.042912 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b280073-a793-4c35-a29b-d56ccf6037a7-kube-api-access-9vh58" (OuterVolumeSpecName: "kube-api-access-9vh58") pod "9b280073-a793-4c35-a29b-d56ccf6037a7" (UID: "9b280073-a793-4c35-a29b-d56ccf6037a7"). InnerVolumeSpecName "kube-api-access-9vh58". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.044711 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9b280073-a793-4c35-a29b-d56ccf6037a7" (UID: "9b280073-a793-4c35-a29b-d56ccf6037a7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.065991 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9b280073-a793-4c35-a29b-d56ccf6037a7" (UID: "9b280073-a793-4c35-a29b-d56ccf6037a7"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.069845 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b280073-a793-4c35-a29b-d56ccf6037a7" (UID: "9b280073-a793-4c35-a29b-d56ccf6037a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.072548 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b280073-a793-4c35-a29b-d56ccf6037a7-scripts" (OuterVolumeSpecName: "scripts") pod "9b280073-a793-4c35-a29b-d56ccf6037a7" (UID: "9b280073-a793-4c35-a29b-d56ccf6037a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.138173 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.138258 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vh58\" (UniqueName: \"kubernetes.io/projected/9b280073-a793-4c35-a29b-d56ccf6037a7-kube-api-access-9vh58\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.138275 4755 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.138288 4755 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9b280073-a793-4c35-a29b-d56ccf6037a7-swiftconf\") on node \"crc\" DevicePath \"\"" 
Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.138298 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b280073-a793-4c35-a29b-d56ccf6037a7-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.138309 4755 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9b280073-a793-4c35-a29b-d56ccf6037a7-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.138320 4755 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9b280073-a793-4c35-a29b-d56ccf6037a7-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.296004 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wn74p"] Mar 17 00:43:41 crc kubenswrapper[4755]: W0317 00:43:41.298944 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5a10493_e311_41ce_9e88_67d04f8d0b22.slice/crio-c2907bf1284e2be2da155c714cb52078bf0567ab34f7f587b068edcf0a322ee6 WatchSource:0}: Error finding container c2907bf1284e2be2da155c714cb52078bf0567ab34f7f587b068edcf0a322ee6: Status 404 returned error can't find the container with id c2907bf1284e2be2da155c714cb52078bf0567ab34f7f587b068edcf0a322ee6 Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.696940 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mh59s" event={"ID":"9b280073-a793-4c35-a29b-d56ccf6037a7","Type":"ContainerDied","Data":"8b5b46c15ad953e51741da2eff02eb7964b20cbc7a645de2357424b241bd7495"} Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.697000 4755 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="8b5b46c15ad953e51741da2eff02eb7964b20cbc7a645de2357424b241bd7495" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.697075 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mh59s" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.701425 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a","Type":"ContainerStarted","Data":"5815c11e0dc959d875dfceae2da8d15dffa64d7f7fc6ddae7f9664c770154efb"} Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.702652 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.710748 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"890b1d99-1a82-424e-981b-5c8ea1ae26ee","Type":"ContainerStarted","Data":"eee5d0940b0de59b5ac4cea46c48aeb58e2fa5b0d22149beec7b06d3f7042f53"} Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.711020 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.712373 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wn74p" event={"ID":"b5a10493-e311-41ce-9e88-67d04f8d0b22","Type":"ContainerStarted","Data":"c2907bf1284e2be2da155c714cb52078bf0567ab34f7f587b068edcf0a322ee6"} Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.733334 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.441146102 podStartE2EDuration="52.733320086s" podCreationTimestamp="2026-03-17 00:42:49 +0000 UTC" firstStartedPulling="2026-03-17 00:42:53.581503319 +0000 UTC m=+1248.340955602" lastFinishedPulling="2026-03-17 00:43:05.873677303 +0000 UTC m=+1260.633129586" 
observedRunningTime="2026-03-17 00:43:41.726466706 +0000 UTC m=+1296.485918999" watchObservedRunningTime="2026-03-17 00:43:41.733320086 +0000 UTC m=+1296.492772369" Mar 17 00:43:41 crc kubenswrapper[4755]: I0317 00:43:41.764249 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=52.337131659 podStartE2EDuration="52.764232718s" podCreationTimestamp="2026-03-17 00:42:49 +0000 UTC" firstStartedPulling="2026-03-17 00:43:05.76313929 +0000 UTC m=+1260.522591573" lastFinishedPulling="2026-03-17 00:43:06.190240349 +0000 UTC m=+1260.949692632" observedRunningTime="2026-03-17 00:43:41.750094516 +0000 UTC m=+1296.509546789" watchObservedRunningTime="2026-03-17 00:43:41.764232718 +0000 UTC m=+1296.523685001" Mar 17 00:43:42 crc kubenswrapper[4755]: I0317 00:43:42.349740 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-cf589d4bf-9httx" podUID="22b37f74-59b5-4148-9e19-92e3bab357c7" containerName="console" containerID="cri-o://0992d224f7ba4a8fa7acc79822891f61afebd3f1cd4b73f8742c69a10570fc8f" gracePeriod=15 Mar 17 00:43:42 crc kubenswrapper[4755]: I0317 00:43:42.528052 4755 patch_prober.go:28] interesting pod/console-cf589d4bf-9httx container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.82:8443/health\": dial tcp 10.217.0.82:8443: connect: connection refused" start-of-body= Mar 17 00:43:42 crc kubenswrapper[4755]: I0317 00:43:42.528334 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-cf589d4bf-9httx" podUID="22b37f74-59b5-4148-9e19-92e3bab357c7" containerName="console" probeResult="failure" output="Get \"https://10.217.0.82:8443/health\": dial tcp 10.217.0.82:8443: connect: connection refused" Mar 17 00:43:42 crc kubenswrapper[4755]: I0317 00:43:42.725953 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-cf589d4bf-9httx_22b37f74-59b5-4148-9e19-92e3bab357c7/console/0.log" Mar 17 00:43:42 crc kubenswrapper[4755]: I0317 00:43:42.726000 4755 generic.go:334] "Generic (PLEG): container finished" podID="22b37f74-59b5-4148-9e19-92e3bab357c7" containerID="0992d224f7ba4a8fa7acc79822891f61afebd3f1cd4b73f8742c69a10570fc8f" exitCode=2 Mar 17 00:43:42 crc kubenswrapper[4755]: I0317 00:43:42.726958 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cf589d4bf-9httx" event={"ID":"22b37f74-59b5-4148-9e19-92e3bab357c7","Type":"ContainerDied","Data":"0992d224f7ba4a8fa7acc79822891f61afebd3f1cd4b73f8742c69a10570fc8f"} Mar 17 00:43:42 crc kubenswrapper[4755]: I0317 00:43:42.876369 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:42 crc kubenswrapper[4755]: I0317 00:43:42.900993 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80ee6df5-abef-4094-aabc-45b15e1ebfcf-etc-swift\") pod \"swift-storage-0\" (UID: \"80ee6df5-abef-4094-aabc-45b15e1ebfcf\") " pod="openstack/swift-storage-0" Mar 17 00:43:42 crc kubenswrapper[4755]: I0317 00:43:42.962544 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.025495 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cf589d4bf-9httx_22b37f74-59b5-4148-9e19-92e3bab357c7/console/0.log" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.025548 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.181326 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l5h6\" (UniqueName: \"kubernetes.io/projected/22b37f74-59b5-4148-9e19-92e3bab357c7-kube-api-access-7l5h6\") pod \"22b37f74-59b5-4148-9e19-92e3bab357c7\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.181624 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-console-config\") pod \"22b37f74-59b5-4148-9e19-92e3bab357c7\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.181666 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22b37f74-59b5-4148-9e19-92e3bab357c7-console-oauth-config\") pod \"22b37f74-59b5-4148-9e19-92e3bab357c7\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.181729 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-trusted-ca-bundle\") pod \"22b37f74-59b5-4148-9e19-92e3bab357c7\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.181746 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-oauth-serving-cert\") pod \"22b37f74-59b5-4148-9e19-92e3bab357c7\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.182222 4755 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-console-config" (OuterVolumeSpecName: "console-config") pod "22b37f74-59b5-4148-9e19-92e3bab357c7" (UID: "22b37f74-59b5-4148-9e19-92e3bab357c7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.182240 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "22b37f74-59b5-4148-9e19-92e3bab357c7" (UID: "22b37f74-59b5-4148-9e19-92e3bab357c7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.182291 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "22b37f74-59b5-4148-9e19-92e3bab357c7" (UID: "22b37f74-59b5-4148-9e19-92e3bab357c7"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.182415 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22b37f74-59b5-4148-9e19-92e3bab357c7-console-serving-cert\") pod \"22b37f74-59b5-4148-9e19-92e3bab357c7\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.182499 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-service-ca\") pod \"22b37f74-59b5-4148-9e19-92e3bab357c7\" (UID: \"22b37f74-59b5-4148-9e19-92e3bab357c7\") " Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.183137 4755 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-console-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.183147 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.183156 4755 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.183162 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-service-ca" (OuterVolumeSpecName: "service-ca") pod "22b37f74-59b5-4148-9e19-92e3bab357c7" (UID: "22b37f74-59b5-4148-9e19-92e3bab357c7"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.186227 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22b37f74-59b5-4148-9e19-92e3bab357c7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "22b37f74-59b5-4148-9e19-92e3bab357c7" (UID: "22b37f74-59b5-4148-9e19-92e3bab357c7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.187567 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22b37f74-59b5-4148-9e19-92e3bab357c7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "22b37f74-59b5-4148-9e19-92e3bab357c7" (UID: "22b37f74-59b5-4148-9e19-92e3bab357c7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.187672 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22b37f74-59b5-4148-9e19-92e3bab357c7-kube-api-access-7l5h6" (OuterVolumeSpecName: "kube-api-access-7l5h6") pod "22b37f74-59b5-4148-9e19-92e3bab357c7" (UID: "22b37f74-59b5-4148-9e19-92e3bab357c7"). InnerVolumeSpecName "kube-api-access-7l5h6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.285530 4755 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22b37f74-59b5-4148-9e19-92e3bab357c7-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.285592 4755 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22b37f74-59b5-4148-9e19-92e3bab357c7-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.285614 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22b37f74-59b5-4148-9e19-92e3bab357c7-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.285636 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7l5h6\" (UniqueName: \"kubernetes.io/projected/22b37f74-59b5-4148-9e19-92e3bab357c7-kube-api-access-7l5h6\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.641382 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 17 00:43:43 crc kubenswrapper[4755]: W0317 00:43:43.666350 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80ee6df5_abef_4094_aabc_45b15e1ebfcf.slice/crio-76ec430f8cb791c15a0eebe903fc4ae6425958b547693fbbdd514829aa8fbcbb WatchSource:0}: Error finding container 76ec430f8cb791c15a0eebe903fc4ae6425958b547693fbbdd514829aa8fbcbb: Status 404 returned error can't find the container with id 76ec430f8cb791c15a0eebe903fc4ae6425958b547693fbbdd514829aa8fbcbb Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.735344 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="b5a10493-e311-41ce-9e88-67d04f8d0b22" containerID="45a52e23186cf42097c72b076428529bf89edc62cc39e66e3c5e059d86402650" exitCode=0 Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.735427 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wn74p" event={"ID":"b5a10493-e311-41ce-9e88-67d04f8d0b22","Type":"ContainerDied","Data":"45a52e23186cf42097c72b076428529bf89edc62cc39e66e3c5e059d86402650"} Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.736737 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80ee6df5-abef-4094-aabc-45b15e1ebfcf","Type":"ContainerStarted","Data":"76ec430f8cb791c15a0eebe903fc4ae6425958b547693fbbdd514829aa8fbcbb"} Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.745972 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cf589d4bf-9httx_22b37f74-59b5-4148-9e19-92e3bab357c7/console/0.log" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.746075 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cf589d4bf-9httx" event={"ID":"22b37f74-59b5-4148-9e19-92e3bab357c7","Type":"ContainerDied","Data":"2fea5be37b5dd1486d8e087f016f1216a57207b9ce45840071ef9cd653ee17c0"} Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.746124 4755 scope.go:117] "RemoveContainer" containerID="0992d224f7ba4a8fa7acc79822891f61afebd3f1cd4b73f8742c69a10570fc8f" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.746156 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cf589d4bf-9httx" Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.753834 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6625cad-73d0-4753-8a77-4d47344b7fad","Type":"ContainerStarted","Data":"b37d8f2b6983093384b8cf6da747bb60219d87d1c97a68a55ae8828a900c69f0"} Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.786460 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cf589d4bf-9httx"] Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.793159 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-cf589d4bf-9httx"] Mar 17 00:43:43 crc kubenswrapper[4755]: I0317 00:43:43.811654 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=12.997674607 podStartE2EDuration="48.811632939s" podCreationTimestamp="2026-03-17 00:42:55 +0000 UTC" firstStartedPulling="2026-03-17 00:43:06.878370515 +0000 UTC m=+1261.637822808" lastFinishedPulling="2026-03-17 00:43:42.692328857 +0000 UTC m=+1297.451781140" observedRunningTime="2026-03-17 00:43:43.80748406 +0000 UTC m=+1298.566936343" watchObservedRunningTime="2026-03-17 00:43:43.811632939 +0000 UTC m=+1298.571085222" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.171486 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-vjbjp"] Mar 17 00:43:44 crc kubenswrapper[4755]: E0317 00:43:44.171957 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b280073-a793-4c35-a29b-d56ccf6037a7" containerName="swift-ring-rebalance" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.171982 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b280073-a793-4c35-a29b-d56ccf6037a7" containerName="swift-ring-rebalance" Mar 17 00:43:44 crc kubenswrapper[4755]: E0317 00:43:44.172007 4755 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="22b37f74-59b5-4148-9e19-92e3bab357c7" containerName="console" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.172017 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b37f74-59b5-4148-9e19-92e3bab357c7" containerName="console" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.172242 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b37f74-59b5-4148-9e19-92e3bab357c7" containerName="console" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.172267 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b280073-a793-4c35-a29b-d56ccf6037a7" containerName="swift-ring-rebalance" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.173111 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vjbjp" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.176106 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.176238 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rjgsw" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.185008 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vjbjp"] Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.258795 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22b37f74-59b5-4148-9e19-92e3bab357c7" path="/var/lib/kubelet/pods/22b37f74-59b5-4148-9e19-92e3bab357c7/volumes" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.307788 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-db-sync-config-data\") pod \"glance-db-sync-vjbjp\" (UID: \"0991527c-bb4b-498c-86b3-d303cee4eeb1\") " 
pod="openstack/glance-db-sync-vjbjp" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.307867 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp2kv\" (UniqueName: \"kubernetes.io/projected/0991527c-bb4b-498c-86b3-d303cee4eeb1-kube-api-access-wp2kv\") pod \"glance-db-sync-vjbjp\" (UID: \"0991527c-bb4b-498c-86b3-d303cee4eeb1\") " pod="openstack/glance-db-sync-vjbjp" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.308054 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-combined-ca-bundle\") pod \"glance-db-sync-vjbjp\" (UID: \"0991527c-bb4b-498c-86b3-d303cee4eeb1\") " pod="openstack/glance-db-sync-vjbjp" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.308103 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-config-data\") pod \"glance-db-sync-vjbjp\" (UID: \"0991527c-bb4b-498c-86b3-d303cee4eeb1\") " pod="openstack/glance-db-sync-vjbjp" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.409645 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp2kv\" (UniqueName: \"kubernetes.io/projected/0991527c-bb4b-498c-86b3-d303cee4eeb1-kube-api-access-wp2kv\") pod \"glance-db-sync-vjbjp\" (UID: \"0991527c-bb4b-498c-86b3-d303cee4eeb1\") " pod="openstack/glance-db-sync-vjbjp" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.410016 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-combined-ca-bundle\") pod \"glance-db-sync-vjbjp\" (UID: \"0991527c-bb4b-498c-86b3-d303cee4eeb1\") " pod="openstack/glance-db-sync-vjbjp" Mar 17 
00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.410112 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-config-data\") pod \"glance-db-sync-vjbjp\" (UID: \"0991527c-bb4b-498c-86b3-d303cee4eeb1\") " pod="openstack/glance-db-sync-vjbjp" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.410220 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-db-sync-config-data\") pod \"glance-db-sync-vjbjp\" (UID: \"0991527c-bb4b-498c-86b3-d303cee4eeb1\") " pod="openstack/glance-db-sync-vjbjp" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.415881 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-combined-ca-bundle\") pod \"glance-db-sync-vjbjp\" (UID: \"0991527c-bb4b-498c-86b3-d303cee4eeb1\") " pod="openstack/glance-db-sync-vjbjp" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.418168 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-db-sync-config-data\") pod \"glance-db-sync-vjbjp\" (UID: \"0991527c-bb4b-498c-86b3-d303cee4eeb1\") " pod="openstack/glance-db-sync-vjbjp" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.418985 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-config-data\") pod \"glance-db-sync-vjbjp\" (UID: \"0991527c-bb4b-498c-86b3-d303cee4eeb1\") " pod="openstack/glance-db-sync-vjbjp" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.432028 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp2kv\" 
(UniqueName: \"kubernetes.io/projected/0991527c-bb4b-498c-86b3-d303cee4eeb1-kube-api-access-wp2kv\") pod \"glance-db-sync-vjbjp\" (UID: \"0991527c-bb4b-498c-86b3-d303cee4eeb1\") " pod="openstack/glance-db-sync-vjbjp" Mar 17 00:43:44 crc kubenswrapper[4755]: I0317 00:43:44.488663 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vjbjp" Mar 17 00:43:45 crc kubenswrapper[4755]: I0317 00:43:45.122067 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vjbjp"] Mar 17 00:43:45 crc kubenswrapper[4755]: I0317 00:43:45.180169 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wn74p" Mar 17 00:43:45 crc kubenswrapper[4755]: I0317 00:43:45.192207 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 17 00:43:45 crc kubenswrapper[4755]: I0317 00:43:45.351793 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48c7j\" (UniqueName: \"kubernetes.io/projected/b5a10493-e311-41ce-9e88-67d04f8d0b22-kube-api-access-48c7j\") pod \"b5a10493-e311-41ce-9e88-67d04f8d0b22\" (UID: \"b5a10493-e311-41ce-9e88-67d04f8d0b22\") " Mar 17 00:43:45 crc kubenswrapper[4755]: I0317 00:43:45.351837 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5a10493-e311-41ce-9e88-67d04f8d0b22-operator-scripts\") pod \"b5a10493-e311-41ce-9e88-67d04f8d0b22\" (UID: \"b5a10493-e311-41ce-9e88-67d04f8d0b22\") " Mar 17 00:43:45 crc kubenswrapper[4755]: I0317 00:43:45.352683 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5a10493-e311-41ce-9e88-67d04f8d0b22-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5a10493-e311-41ce-9e88-67d04f8d0b22" (UID: "b5a10493-e311-41ce-9e88-67d04f8d0b22"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:45 crc kubenswrapper[4755]: I0317 00:43:45.356401 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5a10493-e311-41ce-9e88-67d04f8d0b22-kube-api-access-48c7j" (OuterVolumeSpecName: "kube-api-access-48c7j") pod "b5a10493-e311-41ce-9e88-67d04f8d0b22" (UID: "b5a10493-e311-41ce-9e88-67d04f8d0b22"). InnerVolumeSpecName "kube-api-access-48c7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:43:45 crc kubenswrapper[4755]: I0317 00:43:45.454249 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48c7j\" (UniqueName: \"kubernetes.io/projected/b5a10493-e311-41ce-9e88-67d04f8d0b22-kube-api-access-48c7j\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:45 crc kubenswrapper[4755]: I0317 00:43:45.454279 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5a10493-e311-41ce-9e88-67d04f8d0b22-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:45 crc kubenswrapper[4755]: I0317 00:43:45.790099 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80ee6df5-abef-4094-aabc-45b15e1ebfcf","Type":"ContainerStarted","Data":"fa5104bfe19600369beed688b62ea9aea339747d84627bb843fe9bcfee483d46"} Mar 17 00:43:45 crc kubenswrapper[4755]: I0317 00:43:45.790489 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80ee6df5-abef-4094-aabc-45b15e1ebfcf","Type":"ContainerStarted","Data":"db5bf8df6210ec8b4d0ed25e061fda2348a2009a2b014318c238d75de60a1597"} Mar 17 00:43:45 crc kubenswrapper[4755]: I0317 00:43:45.790509 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"80ee6df5-abef-4094-aabc-45b15e1ebfcf","Type":"ContainerStarted","Data":"1876109cab96483e5dbb4247726bddefd777c6f5459a54c286f3be091eaf8e2a"} Mar 17 00:43:45 crc kubenswrapper[4755]: I0317 00:43:45.793686 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vjbjp" event={"ID":"0991527c-bb4b-498c-86b3-d303cee4eeb1","Type":"ContainerStarted","Data":"f3e23c5a374c4247f2fbf603d289192f5fe30af07b2adf5eb318790b081b9b11"} Mar 17 00:43:45 crc kubenswrapper[4755]: I0317 00:43:45.795647 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wn74p" event={"ID":"b5a10493-e311-41ce-9e88-67d04f8d0b22","Type":"ContainerDied","Data":"c2907bf1284e2be2da155c714cb52078bf0567ab34f7f587b068edcf0a322ee6"} Mar 17 00:43:45 crc kubenswrapper[4755]: I0317 00:43:45.795894 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2907bf1284e2be2da155c714cb52078bf0567ab34f7f587b068edcf0a322ee6" Mar 17 00:43:45 crc kubenswrapper[4755]: I0317 00:43:45.795874 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wn74p" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.035082 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9"] Mar 17 00:43:46 crc kubenswrapper[4755]: E0317 00:43:46.036060 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5a10493-e311-41ce-9e88-67d04f8d0b22" containerName="mariadb-account-create-update" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.036162 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a10493-e311-41ce-9e88-67d04f8d0b22" containerName="mariadb-account-create-update" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.036500 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5a10493-e311-41ce-9e88-67d04f8d0b22" containerName="mariadb-account-create-update" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.045307 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.063748 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9"] Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.166591 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae42150-0ad5-40f3-8ede-bd064e8284dc-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-hfnb9\" (UID: \"cae42150-0ad5-40f3-8ede-bd064e8284dc\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.166650 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c469m\" (UniqueName: \"kubernetes.io/projected/cae42150-0ad5-40f3-8ede-bd064e8284dc-kube-api-access-c469m\") pod \"mysqld-exporter-openstack-cell1-db-create-hfnb9\" (UID: \"cae42150-0ad5-40f3-8ede-bd064e8284dc\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.270176 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae42150-0ad5-40f3-8ede-bd064e8284dc-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-hfnb9\" (UID: \"cae42150-0ad5-40f3-8ede-bd064e8284dc\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.270245 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c469m\" (UniqueName: \"kubernetes.io/projected/cae42150-0ad5-40f3-8ede-bd064e8284dc-kube-api-access-c469m\") pod \"mysqld-exporter-openstack-cell1-db-create-hfnb9\" (UID: \"cae42150-0ad5-40f3-8ede-bd064e8284dc\") " 
pod="openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.278647 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae42150-0ad5-40f3-8ede-bd064e8284dc-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-hfnb9\" (UID: \"cae42150-0ad5-40f3-8ede-bd064e8284dc\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.286144 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-07d1-account-create-update-bghhg"] Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.287241 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-07d1-account-create-update-bghhg" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.292427 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.302123 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c469m\" (UniqueName: \"kubernetes.io/projected/cae42150-0ad5-40f3-8ede-bd064e8284dc-kube-api-access-c469m\") pod \"mysqld-exporter-openstack-cell1-db-create-hfnb9\" (UID: \"cae42150-0ad5-40f3-8ede-bd064e8284dc\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.333168 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-07d1-account-create-update-bghhg"] Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.372770 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.473589 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1636405-ed65-4deb-81e5-843ae69311f4-operator-scripts\") pod \"mysqld-exporter-07d1-account-create-update-bghhg\" (UID: \"b1636405-ed65-4deb-81e5-843ae69311f4\") " pod="openstack/mysqld-exporter-07d1-account-create-update-bghhg" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.473645 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9nw6\" (UniqueName: \"kubernetes.io/projected/b1636405-ed65-4deb-81e5-843ae69311f4-kube-api-access-v9nw6\") pod \"mysqld-exporter-07d1-account-create-update-bghhg\" (UID: \"b1636405-ed65-4deb-81e5-843ae69311f4\") " pod="openstack/mysqld-exporter-07d1-account-create-update-bghhg" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.574751 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1636405-ed65-4deb-81e5-843ae69311f4-operator-scripts\") pod \"mysqld-exporter-07d1-account-create-update-bghhg\" (UID: \"b1636405-ed65-4deb-81e5-843ae69311f4\") " pod="openstack/mysqld-exporter-07d1-account-create-update-bghhg" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.574807 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9nw6\" (UniqueName: \"kubernetes.io/projected/b1636405-ed65-4deb-81e5-843ae69311f4-kube-api-access-v9nw6\") pod \"mysqld-exporter-07d1-account-create-update-bghhg\" (UID: \"b1636405-ed65-4deb-81e5-843ae69311f4\") " pod="openstack/mysqld-exporter-07d1-account-create-update-bghhg" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.576106 4755 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1636405-ed65-4deb-81e5-843ae69311f4-operator-scripts\") pod \"mysqld-exporter-07d1-account-create-update-bghhg\" (UID: \"b1636405-ed65-4deb-81e5-843ae69311f4\") " pod="openstack/mysqld-exporter-07d1-account-create-update-bghhg" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.600098 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9nw6\" (UniqueName: \"kubernetes.io/projected/b1636405-ed65-4deb-81e5-843ae69311f4-kube-api-access-v9nw6\") pod \"mysqld-exporter-07d1-account-create-update-bghhg\" (UID: \"b1636405-ed65-4deb-81e5-843ae69311f4\") " pod="openstack/mysqld-exporter-07d1-account-create-update-bghhg" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.682333 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-07d1-account-create-update-bghhg" Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.815884 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80ee6df5-abef-4094-aabc-45b15e1ebfcf","Type":"ContainerStarted","Data":"03984ce35b11a828712b9b5dbfafb1a60005fa40449d648d384dc1c72715c25f"} Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.839251 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9"] Mar 17 00:43:46 crc kubenswrapper[4755]: I0317 00:43:46.991898 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-wn74p"] Mar 17 00:43:47 crc kubenswrapper[4755]: I0317 00:43:47.006590 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-wn74p"] Mar 17 00:43:47 crc kubenswrapper[4755]: I0317 00:43:47.146642 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 17 00:43:47 crc kubenswrapper[4755]: I0317 
00:43:47.227663 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-07d1-account-create-update-bghhg"] Mar 17 00:43:47 crc kubenswrapper[4755]: W0317 00:43:47.364931 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1636405_ed65_4deb_81e5_843ae69311f4.slice/crio-fed2e48b2604e7e0bb265aab0115a2a47396bb4988aef122fb26280d54d76ec2 WatchSource:0}: Error finding container fed2e48b2604e7e0bb265aab0115a2a47396bb4988aef122fb26280d54d76ec2: Status 404 returned error can't find the container with id fed2e48b2604e7e0bb265aab0115a2a47396bb4988aef122fb26280d54d76ec2 Mar 17 00:43:47 crc kubenswrapper[4755]: I0317 00:43:47.834637 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80ee6df5-abef-4094-aabc-45b15e1ebfcf","Type":"ContainerStarted","Data":"8314b03e7361a2f6afd90b525b0b0123c854ce5672469bb44d4cd50a7d68e7f2"} Mar 17 00:43:47 crc kubenswrapper[4755]: I0317 00:43:47.835006 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80ee6df5-abef-4094-aabc-45b15e1ebfcf","Type":"ContainerStarted","Data":"385716c4060f36bf84e4e436001645cdeea7a92ebda90b2d409ddb673e0ffec3"} Mar 17 00:43:47 crc kubenswrapper[4755]: I0317 00:43:47.837180 4755 generic.go:334] "Generic (PLEG): container finished" podID="b1636405-ed65-4deb-81e5-843ae69311f4" containerID="89ff4548117393a9f67860b9151f3087f5dff8b28c23b018aa6cdfc267530f05" exitCode=0 Mar 17 00:43:47 crc kubenswrapper[4755]: I0317 00:43:47.837256 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-07d1-account-create-update-bghhg" event={"ID":"b1636405-ed65-4deb-81e5-843ae69311f4","Type":"ContainerDied","Data":"89ff4548117393a9f67860b9151f3087f5dff8b28c23b018aa6cdfc267530f05"} Mar 17 00:43:47 crc kubenswrapper[4755]: I0317 00:43:47.837283 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/mysqld-exporter-07d1-account-create-update-bghhg" event={"ID":"b1636405-ed65-4deb-81e5-843ae69311f4","Type":"ContainerStarted","Data":"fed2e48b2604e7e0bb265aab0115a2a47396bb4988aef122fb26280d54d76ec2"} Mar 17 00:43:47 crc kubenswrapper[4755]: I0317 00:43:47.843692 4755 generic.go:334] "Generic (PLEG): container finished" podID="cae42150-0ad5-40f3-8ede-bd064e8284dc" containerID="e1894619324db4479d7e5db33bb1c8567bf7ef010167b45fd137b5f1b7392cd9" exitCode=0 Mar 17 00:43:47 crc kubenswrapper[4755]: I0317 00:43:47.843746 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9" event={"ID":"cae42150-0ad5-40f3-8ede-bd064e8284dc","Type":"ContainerDied","Data":"e1894619324db4479d7e5db33bb1c8567bf7ef010167b45fd137b5f1b7392cd9"} Mar 17 00:43:47 crc kubenswrapper[4755]: I0317 00:43:47.843776 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9" event={"ID":"cae42150-0ad5-40f3-8ede-bd064e8284dc","Type":"ContainerStarted","Data":"84df1101004d2af43866bbba4688316bc704a1267a55615e075be976c26ff383"} Mar 17 00:43:48 crc kubenswrapper[4755]: I0317 00:43:48.269301 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5a10493-e311-41ce-9e88-67d04f8d0b22" path="/var/lib/kubelet/pods/b5a10493-e311-41ce-9e88-67d04f8d0b22/volumes" Mar 17 00:43:48 crc kubenswrapper[4755]: I0317 00:43:48.860656 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80ee6df5-abef-4094-aabc-45b15e1ebfcf","Type":"ContainerStarted","Data":"12bd9efbecd8da2d60a5b3b38d227a6344dbd362d189c8560807f8bcac12d50b"} Mar 17 00:43:48 crc kubenswrapper[4755]: I0317 00:43:48.860712 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80ee6df5-abef-4094-aabc-45b15e1ebfcf","Type":"ContainerStarted","Data":"1c0c59006a86d99b9c07a7a51a71649ebc90b70c0e3718d819b784002c34f78a"} Mar 17 
00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.298942 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9" Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.305197 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-07d1-account-create-update-bghhg" Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.350310 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c469m\" (UniqueName: \"kubernetes.io/projected/cae42150-0ad5-40f3-8ede-bd064e8284dc-kube-api-access-c469m\") pod \"cae42150-0ad5-40f3-8ede-bd064e8284dc\" (UID: \"cae42150-0ad5-40f3-8ede-bd064e8284dc\") " Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.350383 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1636405-ed65-4deb-81e5-843ae69311f4-operator-scripts\") pod \"b1636405-ed65-4deb-81e5-843ae69311f4\" (UID: \"b1636405-ed65-4deb-81e5-843ae69311f4\") " Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.351838 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1636405-ed65-4deb-81e5-843ae69311f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1636405-ed65-4deb-81e5-843ae69311f4" (UID: "b1636405-ed65-4deb-81e5-843ae69311f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.356764 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae42150-0ad5-40f3-8ede-bd064e8284dc-kube-api-access-c469m" (OuterVolumeSpecName: "kube-api-access-c469m") pod "cae42150-0ad5-40f3-8ede-bd064e8284dc" (UID: "cae42150-0ad5-40f3-8ede-bd064e8284dc"). InnerVolumeSpecName "kube-api-access-c469m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.452576 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9nw6\" (UniqueName: \"kubernetes.io/projected/b1636405-ed65-4deb-81e5-843ae69311f4-kube-api-access-v9nw6\") pod \"b1636405-ed65-4deb-81e5-843ae69311f4\" (UID: \"b1636405-ed65-4deb-81e5-843ae69311f4\") " Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.452690 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae42150-0ad5-40f3-8ede-bd064e8284dc-operator-scripts\") pod \"cae42150-0ad5-40f3-8ede-bd064e8284dc\" (UID: \"cae42150-0ad5-40f3-8ede-bd064e8284dc\") " Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.453333 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cae42150-0ad5-40f3-8ede-bd064e8284dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cae42150-0ad5-40f3-8ede-bd064e8284dc" (UID: "cae42150-0ad5-40f3-8ede-bd064e8284dc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.453414 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c469m\" (UniqueName: \"kubernetes.io/projected/cae42150-0ad5-40f3-8ede-bd064e8284dc-kube-api-access-c469m\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.453452 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1636405-ed65-4deb-81e5-843ae69311f4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.456050 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1636405-ed65-4deb-81e5-843ae69311f4-kube-api-access-v9nw6" (OuterVolumeSpecName: "kube-api-access-v9nw6") pod "b1636405-ed65-4deb-81e5-843ae69311f4" (UID: "b1636405-ed65-4deb-81e5-843ae69311f4"). InnerVolumeSpecName "kube-api-access-v9nw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.554986 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9nw6\" (UniqueName: \"kubernetes.io/projected/b1636405-ed65-4deb-81e5-843ae69311f4-kube-api-access-v9nw6\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.555022 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae42150-0ad5-40f3-8ede-bd064e8284dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.870984 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-07d1-account-create-update-bghhg" event={"ID":"b1636405-ed65-4deb-81e5-843ae69311f4","Type":"ContainerDied","Data":"fed2e48b2604e7e0bb265aab0115a2a47396bb4988aef122fb26280d54d76ec2"} Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.871315 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fed2e48b2604e7e0bb265aab0115a2a47396bb4988aef122fb26280d54d76ec2" Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.871213 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-07d1-account-create-update-bghhg" Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.872844 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9" event={"ID":"cae42150-0ad5-40f3-8ede-bd064e8284dc","Type":"ContainerDied","Data":"84df1101004d2af43866bbba4688316bc704a1267a55615e075be976c26ff383"} Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.872865 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9" Mar 17 00:43:49 crc kubenswrapper[4755]: I0317 00:43:49.872879 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84df1101004d2af43866bbba4688316bc704a1267a55615e075be976c26ff383" Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.539101 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9dn6z"] Mar 17 00:43:50 crc kubenswrapper[4755]: E0317 00:43:50.539459 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1636405-ed65-4deb-81e5-843ae69311f4" containerName="mariadb-account-create-update" Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.539471 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1636405-ed65-4deb-81e5-843ae69311f4" containerName="mariadb-account-create-update" Mar 17 00:43:50 crc kubenswrapper[4755]: E0317 00:43:50.539491 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae42150-0ad5-40f3-8ede-bd064e8284dc" containerName="mariadb-database-create" Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.539504 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae42150-0ad5-40f3-8ede-bd064e8284dc" containerName="mariadb-database-create" Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.539654 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1636405-ed65-4deb-81e5-843ae69311f4" containerName="mariadb-account-create-update" Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.539670 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae42150-0ad5-40f3-8ede-bd064e8284dc" containerName="mariadb-database-create" Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.540255 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9dn6z" Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.543501 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.567402 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9dn6z"] Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.679546 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0759e57-466c-40b5-b756-0b4aad5c3089-operator-scripts\") pod \"root-account-create-update-9dn6z\" (UID: \"b0759e57-466c-40b5-b756-0b4aad5c3089\") " pod="openstack/root-account-create-update-9dn6z" Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.680338 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrzfc\" (UniqueName: \"kubernetes.io/projected/b0759e57-466c-40b5-b756-0b4aad5c3089-kube-api-access-lrzfc\") pod \"root-account-create-update-9dn6z\" (UID: \"b0759e57-466c-40b5-b756-0b4aad5c3089\") " pod="openstack/root-account-create-update-9dn6z" Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.781449 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0759e57-466c-40b5-b756-0b4aad5c3089-operator-scripts\") pod \"root-account-create-update-9dn6z\" (UID: \"b0759e57-466c-40b5-b756-0b4aad5c3089\") " pod="openstack/root-account-create-update-9dn6z" Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.781568 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrzfc\" (UniqueName: \"kubernetes.io/projected/b0759e57-466c-40b5-b756-0b4aad5c3089-kube-api-access-lrzfc\") pod \"root-account-create-update-9dn6z\" (UID: 
\"b0759e57-466c-40b5-b756-0b4aad5c3089\") " pod="openstack/root-account-create-update-9dn6z" Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.782406 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0759e57-466c-40b5-b756-0b4aad5c3089-operator-scripts\") pod \"root-account-create-update-9dn6z\" (UID: \"b0759e57-466c-40b5-b756-0b4aad5c3089\") " pod="openstack/root-account-create-update-9dn6z" Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.809183 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrzfc\" (UniqueName: \"kubernetes.io/projected/b0759e57-466c-40b5-b756-0b4aad5c3089-kube-api-access-lrzfc\") pod \"root-account-create-update-9dn6z\" (UID: \"b0759e57-466c-40b5-b756-0b4aad5c3089\") " pod="openstack/root-account-create-update-9dn6z" Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.858151 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9dn6z" Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.934875 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80ee6df5-abef-4094-aabc-45b15e1ebfcf","Type":"ContainerStarted","Data":"c054913d53fe667cf0194c35ce83785deefc599572e80f765a159431f25cc3b7"} Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.934922 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80ee6df5-abef-4094-aabc-45b15e1ebfcf","Type":"ContainerStarted","Data":"f7de7129f939fef0bf8c3f0b60be18e1da0b0c74b82589f711e88308af191515"} Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.934931 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80ee6df5-abef-4094-aabc-45b15e1ebfcf","Type":"ContainerStarted","Data":"bc83de575c36f077039ec1ca3cf278c7fa2adf88e0782a9465fcdaa77395295a"} Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.934940 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80ee6df5-abef-4094-aabc-45b15e1ebfcf","Type":"ContainerStarted","Data":"2e2b79f4ba1b0c55954198ce694ec263c5bf6cc1982b3ac845b2168e184e74f1"} Mar 17 00:43:50 crc kubenswrapper[4755]: I0317 00:43:50.934949 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80ee6df5-abef-4094-aabc-45b15e1ebfcf","Type":"ContainerStarted","Data":"cbffb7cce766b3129d210f32e2c93e7f92af2bd6095b8e03f87231827bd54801"} Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.322133 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9dn6z"] Mar 17 00:43:51 crc kubenswrapper[4755]: W0317 00:43:51.333279 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0759e57_466c_40b5_b756_0b4aad5c3089.slice/crio-0330dd155ddc9e129af58c37e32b524d66de279e44504f12f38ecf55663d889b WatchSource:0}: Error finding container 0330dd155ddc9e129af58c37e32b524d66de279e44504f12f38ecf55663d889b: Status 404 returned error can't find the container with id 0330dd155ddc9e129af58c37e32b524d66de279e44504f12f38ecf55663d889b Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.447376 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.450108 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.454745 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.475934 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.598608 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42a2a1da-3480-4f1d-bba8-a725657e9fcd-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"42a2a1da-3480-4f1d-bba8-a725657e9fcd\") " pod="openstack/mysqld-exporter-0" Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.598930 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc249\" (UniqueName: \"kubernetes.io/projected/42a2a1da-3480-4f1d-bba8-a725657e9fcd-kube-api-access-wc249\") pod \"mysqld-exporter-0\" (UID: \"42a2a1da-3480-4f1d-bba8-a725657e9fcd\") " pod="openstack/mysqld-exporter-0" Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.600112 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42a2a1da-3480-4f1d-bba8-a725657e9fcd-config-data\") pod \"mysqld-exporter-0\" (UID: \"42a2a1da-3480-4f1d-bba8-a725657e9fcd\") " pod="openstack/mysqld-exporter-0" Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.701958 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc249\" (UniqueName: \"kubernetes.io/projected/42a2a1da-3480-4f1d-bba8-a725657e9fcd-kube-api-access-wc249\") pod \"mysqld-exporter-0\" (UID: \"42a2a1da-3480-4f1d-bba8-a725657e9fcd\") " pod="openstack/mysqld-exporter-0" Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.702021 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42a2a1da-3480-4f1d-bba8-a725657e9fcd-config-data\") pod \"mysqld-exporter-0\" (UID: \"42a2a1da-3480-4f1d-bba8-a725657e9fcd\") " pod="openstack/mysqld-exporter-0" Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.702060 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42a2a1da-3480-4f1d-bba8-a725657e9fcd-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"42a2a1da-3480-4f1d-bba8-a725657e9fcd\") " pod="openstack/mysqld-exporter-0" Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.706645 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42a2a1da-3480-4f1d-bba8-a725657e9fcd-config-data\") pod \"mysqld-exporter-0\" (UID: \"42a2a1da-3480-4f1d-bba8-a725657e9fcd\") " pod="openstack/mysqld-exporter-0" Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.707044 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42a2a1da-3480-4f1d-bba8-a725657e9fcd-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: 
\"42a2a1da-3480-4f1d-bba8-a725657e9fcd\") " pod="openstack/mysqld-exporter-0" Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.722099 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc249\" (UniqueName: \"kubernetes.io/projected/42a2a1da-3480-4f1d-bba8-a725657e9fcd-kube-api-access-wc249\") pod \"mysqld-exporter-0\" (UID: \"42a2a1da-3480-4f1d-bba8-a725657e9fcd\") " pod="openstack/mysqld-exporter-0" Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.798028 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.981370 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80ee6df5-abef-4094-aabc-45b15e1ebfcf","Type":"ContainerStarted","Data":"8f17cd631dfb37dd8c65b644460ba1a3728abb43256dd03ec0e4ea508d8e91c9"} Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.981735 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"80ee6df5-abef-4094-aabc-45b15e1ebfcf","Type":"ContainerStarted","Data":"6f6ecb0fedb0e55b6b9974247e92395fa9aa3e5af65345d4275f9491862f213f"} Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.983801 4755 generic.go:334] "Generic (PLEG): container finished" podID="b0759e57-466c-40b5-b756-0b4aad5c3089" containerID="7227e684059e39a3e3bcdaa7122d87098759c557a3e1d664ec9569d05ed2eb3a" exitCode=0 Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.983872 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9dn6z" event={"ID":"b0759e57-466c-40b5-b756-0b4aad5c3089","Type":"ContainerDied","Data":"7227e684059e39a3e3bcdaa7122d87098759c557a3e1d664ec9569d05ed2eb3a"} Mar 17 00:43:51 crc kubenswrapper[4755]: I0317 00:43:51.983943 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9dn6z" 
event={"ID":"b0759e57-466c-40b5-b756-0b4aad5c3089","Type":"ContainerStarted","Data":"0330dd155ddc9e129af58c37e32b524d66de279e44504f12f38ecf55663d889b"} Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.030065 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.043636441 podStartE2EDuration="27.030049693s" podCreationTimestamp="2026-03-17 00:43:25 +0000 UTC" firstStartedPulling="2026-03-17 00:43:43.669369733 +0000 UTC m=+1298.428822016" lastFinishedPulling="2026-03-17 00:43:49.655782985 +0000 UTC m=+1304.415235268" observedRunningTime="2026-03-17 00:43:52.02614541 +0000 UTC m=+1306.785597693" watchObservedRunningTime="2026-03-17 00:43:52.030049693 +0000 UTC m=+1306.789501976" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.306344 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-kpgvs"] Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.307811 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.309482 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.334391 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-kpgvs"] Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.343638 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 17 00:43:52 crc kubenswrapper[4755]: W0317 00:43:52.358474 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42a2a1da_3480_4f1d_bba8_a725657e9fcd.slice/crio-23e8a1f114e5f2a203cff854ba2b4957c4175c20683c8e37e7d4569a461d1c9a WatchSource:0}: Error finding container 23e8a1f114e5f2a203cff854ba2b4957c4175c20683c8e37e7d4569a461d1c9a: Status 404 returned error can't find the container with id 23e8a1f114e5f2a203cff854ba2b4957c4175c20683c8e37e7d4569a461d1c9a Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.417779 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjrjb\" (UniqueName: \"kubernetes.io/projected/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-kube-api-access-rjrjb\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.417850 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.417895 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.417938 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.417968 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-config\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.417990 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.519882 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 
00:43:52.519975 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-config\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.520014 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.520108 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjrjb\" (UniqueName: \"kubernetes.io/projected/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-kube-api-access-rjrjb\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.520168 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.520250 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.520882 4755 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-config\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.520934 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.520982 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.521071 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.521714 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.540957 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjrjb\" (UniqueName: 
\"kubernetes.io/projected/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-kube-api-access-rjrjb\") pod \"dnsmasq-dns-77585f5f8c-kpgvs\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:52 crc kubenswrapper[4755]: I0317 00:43:52.630131 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:43:53 crc kubenswrapper[4755]: I0317 00:43:53.023016 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"42a2a1da-3480-4f1d-bba8-a725657e9fcd","Type":"ContainerStarted","Data":"23e8a1f114e5f2a203cff854ba2b4957c4175c20683c8e37e7d4569a461d1c9a"} Mar 17 00:43:53 crc kubenswrapper[4755]: I0317 00:43:53.161090 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-kpgvs"] Mar 17 00:43:53 crc kubenswrapper[4755]: I0317 00:43:53.547718 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9dn6z" Mar 17 00:43:53 crc kubenswrapper[4755]: I0317 00:43:53.641091 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0759e57-466c-40b5-b756-0b4aad5c3089-operator-scripts\") pod \"b0759e57-466c-40b5-b756-0b4aad5c3089\" (UID: \"b0759e57-466c-40b5-b756-0b4aad5c3089\") " Mar 17 00:43:53 crc kubenswrapper[4755]: I0317 00:43:53.644077 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0759e57-466c-40b5-b756-0b4aad5c3089-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b0759e57-466c-40b5-b756-0b4aad5c3089" (UID: "b0759e57-466c-40b5-b756-0b4aad5c3089"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:43:53 crc kubenswrapper[4755]: I0317 00:43:53.644618 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrzfc\" (UniqueName: \"kubernetes.io/projected/b0759e57-466c-40b5-b756-0b4aad5c3089-kube-api-access-lrzfc\") pod \"b0759e57-466c-40b5-b756-0b4aad5c3089\" (UID: \"b0759e57-466c-40b5-b756-0b4aad5c3089\") " Mar 17 00:43:53 crc kubenswrapper[4755]: I0317 00:43:53.646027 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0759e57-466c-40b5-b756-0b4aad5c3089-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:53 crc kubenswrapper[4755]: I0317 00:43:53.648381 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0759e57-466c-40b5-b756-0b4aad5c3089-kube-api-access-lrzfc" (OuterVolumeSpecName: "kube-api-access-lrzfc") pod "b0759e57-466c-40b5-b756-0b4aad5c3089" (UID: "b0759e57-466c-40b5-b756-0b4aad5c3089"). InnerVolumeSpecName "kube-api-access-lrzfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:43:53 crc kubenswrapper[4755]: I0317 00:43:53.747373 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrzfc\" (UniqueName: \"kubernetes.io/projected/b0759e57-466c-40b5-b756-0b4aad5c3089-kube-api-access-lrzfc\") on node \"crc\" DevicePath \"\"" Mar 17 00:43:53 crc kubenswrapper[4755]: I0317 00:43:53.928739 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:43:53 crc kubenswrapper[4755]: I0317 00:43:53.931937 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-dvvpc" podUID="a6eae7bd-5007-4389-b4ab-7f296d0fa9ce" containerName="ovn-controller" probeResult="failure" output=< Mar 17 00:43:53 crc kubenswrapper[4755]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 17 00:43:53 crc kubenswrapper[4755]: > Mar 17 00:43:53 crc kubenswrapper[4755]: I0317 00:43:53.939765 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bdvbb" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.046806 4755 generic.go:334] "Generic (PLEG): container finished" podID="aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" containerID="19a612e42a3a1178e6eb712baff292bc8da4dbe9a60703eefa0cc84cc618cd8e" exitCode=0 Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.046881 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" event={"ID":"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd","Type":"ContainerDied","Data":"19a612e42a3a1178e6eb712baff292bc8da4dbe9a60703eefa0cc84cc618cd8e"} Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.046911 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" 
event={"ID":"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd","Type":"ContainerStarted","Data":"3ca547f19f0a36e76febb96d7b8409532b8aa1fa060803762ae5db872868a23a"} Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.052786 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9dn6z" event={"ID":"b0759e57-466c-40b5-b756-0b4aad5c3089","Type":"ContainerDied","Data":"0330dd155ddc9e129af58c37e32b524d66de279e44504f12f38ecf55663d889b"} Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.052830 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9dn6z" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.052838 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0330dd155ddc9e129af58c37e32b524d66de279e44504f12f38ecf55663d889b" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.168587 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dvvpc-config-8p2rk"] Mar 17 00:43:54 crc kubenswrapper[4755]: E0317 00:43:54.169075 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0759e57-466c-40b5-b756-0b4aad5c3089" containerName="mariadb-account-create-update" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.169095 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0759e57-466c-40b5-b756-0b4aad5c3089" containerName="mariadb-account-create-update" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.169313 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0759e57-466c-40b5-b756-0b4aad5c3089" containerName="mariadb-account-create-update" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.170234 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.172322 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.177865 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dvvpc-config-8p2rk"] Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.258164 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-run-ovn\") pod \"ovn-controller-dvvpc-config-8p2rk\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.258208 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-run\") pod \"ovn-controller-dvvpc-config-8p2rk\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.258232 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qwlz\" (UniqueName: \"kubernetes.io/projected/daff8a0c-71c2-4d19-b90d-98663b80ba85-kube-api-access-8qwlz\") pod \"ovn-controller-dvvpc-config-8p2rk\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.258271 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daff8a0c-71c2-4d19-b90d-98663b80ba85-scripts\") pod \"ovn-controller-dvvpc-config-8p2rk\" (UID: 
\"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.258297 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-log-ovn\") pod \"ovn-controller-dvvpc-config-8p2rk\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.258330 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/daff8a0c-71c2-4d19-b90d-98663b80ba85-additional-scripts\") pod \"ovn-controller-dvvpc-config-8p2rk\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.359941 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-run-ovn\") pod \"ovn-controller-dvvpc-config-8p2rk\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.359989 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-run\") pod \"ovn-controller-dvvpc-config-8p2rk\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.360012 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qwlz\" (UniqueName: \"kubernetes.io/projected/daff8a0c-71c2-4d19-b90d-98663b80ba85-kube-api-access-8qwlz\") pod 
\"ovn-controller-dvvpc-config-8p2rk\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.360249 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-run-ovn\") pod \"ovn-controller-dvvpc-config-8p2rk\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.360340 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daff8a0c-71c2-4d19-b90d-98663b80ba85-scripts\") pod \"ovn-controller-dvvpc-config-8p2rk\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.360459 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-log-ovn\") pod \"ovn-controller-dvvpc-config-8p2rk\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.360598 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-log-ovn\") pod \"ovn-controller-dvvpc-config-8p2rk\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.360627 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-run\") pod \"ovn-controller-dvvpc-config-8p2rk\" (UID: 
\"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.360664 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/daff8a0c-71c2-4d19-b90d-98663b80ba85-additional-scripts\") pod \"ovn-controller-dvvpc-config-8p2rk\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.361512 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/daff8a0c-71c2-4d19-b90d-98663b80ba85-additional-scripts\") pod \"ovn-controller-dvvpc-config-8p2rk\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.362980 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daff8a0c-71c2-4d19-b90d-98663b80ba85-scripts\") pod \"ovn-controller-dvvpc-config-8p2rk\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.397119 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qwlz\" (UniqueName: \"kubernetes.io/projected/daff8a0c-71c2-4d19-b90d-98663b80ba85-kube-api-access-8qwlz\") pod \"ovn-controller-dvvpc-config-8p2rk\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:54 crc kubenswrapper[4755]: I0317 00:43:54.490555 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:43:57 crc kubenswrapper[4755]: I0317 00:43:57.043275 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9dn6z"] Mar 17 00:43:57 crc kubenswrapper[4755]: I0317 00:43:57.053020 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9dn6z"] Mar 17 00:43:57 crc kubenswrapper[4755]: I0317 00:43:57.146339 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 17 00:43:57 crc kubenswrapper[4755]: I0317 00:43:57.148291 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 17 00:43:58 crc kubenswrapper[4755]: I0317 00:43:58.104043 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 17 00:43:58 crc kubenswrapper[4755]: I0317 00:43:58.271686 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0759e57-466c-40b5-b756-0b4aad5c3089" path="/var/lib/kubelet/pods/b0759e57-466c-40b5-b756-0b4aad5c3089/volumes" Mar 17 00:43:58 crc kubenswrapper[4755]: I0317 00:43:58.932937 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-dvvpc" podUID="a6eae7bd-5007-4389-b4ab-7f296d0fa9ce" containerName="ovn-controller" probeResult="failure" output=< Mar 17 00:43:58 crc kubenswrapper[4755]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 17 00:43:58 crc kubenswrapper[4755]: > Mar 17 00:44:00 crc kubenswrapper[4755]: I0317 00:44:00.142541 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561804-jbzg7"] Mar 17 00:44:00 crc kubenswrapper[4755]: I0317 00:44:00.146677 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561804-jbzg7" Mar 17 00:44:00 crc kubenswrapper[4755]: I0317 00:44:00.148972 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 00:44:00 crc kubenswrapper[4755]: I0317 00:44:00.149340 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 00:44:00 crc kubenswrapper[4755]: I0317 00:44:00.149920 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 00:44:00 crc kubenswrapper[4755]: I0317 00:44:00.153090 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561804-jbzg7"] Mar 17 00:44:00 crc kubenswrapper[4755]: I0317 00:44:00.280067 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnnhv\" (UniqueName: \"kubernetes.io/projected/6ab9a2f0-b327-4c8a-a7ec-97930918a1ac-kube-api-access-qnnhv\") pod \"auto-csr-approver-29561804-jbzg7\" (UID: \"6ab9a2f0-b327-4c8a-a7ec-97930918a1ac\") " pod="openshift-infra/auto-csr-approver-29561804-jbzg7" Mar 17 00:44:00 crc kubenswrapper[4755]: I0317 00:44:00.382115 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnnhv\" (UniqueName: \"kubernetes.io/projected/6ab9a2f0-b327-4c8a-a7ec-97930918a1ac-kube-api-access-qnnhv\") pod \"auto-csr-approver-29561804-jbzg7\" (UID: \"6ab9a2f0-b327-4c8a-a7ec-97930918a1ac\") " pod="openshift-infra/auto-csr-approver-29561804-jbzg7" Mar 17 00:44:00 crc kubenswrapper[4755]: I0317 00:44:00.399714 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnnhv\" (UniqueName: \"kubernetes.io/projected/6ab9a2f0-b327-4c8a-a7ec-97930918a1ac-kube-api-access-qnnhv\") pod \"auto-csr-approver-29561804-jbzg7\" (UID: \"6ab9a2f0-b327-4c8a-a7ec-97930918a1ac\") " 
pod="openshift-infra/auto-csr-approver-29561804-jbzg7" Mar 17 00:44:00 crc kubenswrapper[4755]: I0317 00:44:00.489074 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561804-jbzg7" Mar 17 00:44:00 crc kubenswrapper[4755]: I0317 00:44:00.511663 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:44:00 crc kubenswrapper[4755]: I0317 00:44:00.811658 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 17 00:44:00 crc kubenswrapper[4755]: I0317 00:44:00.982023 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 17 00:44:00 crc kubenswrapper[4755]: I0317 00:44:00.982305 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerName="prometheus" containerID="cri-o://e04dbd8528cf206deadfac630c56d005dfbc5693cdfdbf060355d0b4fff00374" gracePeriod=600 Mar 17 00:44:00 crc kubenswrapper[4755]: I0317 00:44:00.982799 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerName="thanos-sidecar" containerID="cri-o://b37d8f2b6983093384b8cf6da747bb60219d87d1c97a68a55ae8828a900c69f0" gracePeriod=600 Mar 17 00:44:00 crc kubenswrapper[4755]: I0317 00:44:00.982856 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerName="config-reloader" containerID="cri-o://3a78a7c9f57b03d419c130d5896347893570c516e901c8ad01defb99516ef10d" gracePeriod=600 Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.065595 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/root-account-create-update-znbjn"] Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.068035 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-znbjn" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.070608 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.074652 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-znbjn"] Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.146177 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.129:9090/-/ready\": dial tcp 10.217.0.129:9090: connect: connection refused" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.167852 4755 generic.go:334] "Generic (PLEG): container finished" podID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerID="b37d8f2b6983093384b8cf6da747bb60219d87d1c97a68a55ae8828a900c69f0" exitCode=0 Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.167887 4755 generic.go:334] "Generic (PLEG): container finished" podID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerID="3a78a7c9f57b03d419c130d5896347893570c516e901c8ad01defb99516ef10d" exitCode=0 Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.167895 4755 generic.go:334] "Generic (PLEG): container finished" podID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerID="e04dbd8528cf206deadfac630c56d005dfbc5693cdfdbf060355d0b4fff00374" exitCode=0 Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.167912 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"e6625cad-73d0-4753-8a77-4d47344b7fad","Type":"ContainerDied","Data":"b37d8f2b6983093384b8cf6da747bb60219d87d1c97a68a55ae8828a900c69f0"} Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.167937 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6625cad-73d0-4753-8a77-4d47344b7fad","Type":"ContainerDied","Data":"3a78a7c9f57b03d419c130d5896347893570c516e901c8ad01defb99516ef10d"} Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.167947 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6625cad-73d0-4753-8a77-4d47344b7fad","Type":"ContainerDied","Data":"e04dbd8528cf206deadfac630c56d005dfbc5693cdfdbf060355d0b4fff00374"} Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.217515 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brq5f\" (UniqueName: \"kubernetes.io/projected/7a46d626-29c2-42a4-88a0-e01284c086fa-kube-api-access-brq5f\") pod \"root-account-create-update-znbjn\" (UID: \"7a46d626-29c2-42a4-88a0-e01284c086fa\") " pod="openstack/root-account-create-update-znbjn" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.217658 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a46d626-29c2-42a4-88a0-e01284c086fa-operator-scripts\") pod \"root-account-create-update-znbjn\" (UID: \"7a46d626-29c2-42a4-88a0-e01284c086fa\") " pod="openstack/root-account-create-update-znbjn" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.319117 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a46d626-29c2-42a4-88a0-e01284c086fa-operator-scripts\") pod \"root-account-create-update-znbjn\" (UID: \"7a46d626-29c2-42a4-88a0-e01284c086fa\") " 
pod="openstack/root-account-create-update-znbjn" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.319193 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brq5f\" (UniqueName: \"kubernetes.io/projected/7a46d626-29c2-42a4-88a0-e01284c086fa-kube-api-access-brq5f\") pod \"root-account-create-update-znbjn\" (UID: \"7a46d626-29c2-42a4-88a0-e01284c086fa\") " pod="openstack/root-account-create-update-znbjn" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.320088 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a46d626-29c2-42a4-88a0-e01284c086fa-operator-scripts\") pod \"root-account-create-update-znbjn\" (UID: \"7a46d626-29c2-42a4-88a0-e01284c086fa\") " pod="openstack/root-account-create-update-znbjn" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.349863 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brq5f\" (UniqueName: \"kubernetes.io/projected/7a46d626-29c2-42a4-88a0-e01284c086fa-kube-api-access-brq5f\") pod \"root-account-create-update-znbjn\" (UID: \"7a46d626-29c2-42a4-88a0-e01284c086fa\") " pod="openstack/root-account-create-update-znbjn" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.386469 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-znbjn" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.616480 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-nn54s"] Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.618673 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-nn54s" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.629891 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-nn54s"] Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.708413 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-35d1-account-create-update-tbps2"] Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.709969 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-35d1-account-create-update-tbps2" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.722297 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.728085 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzn45\" (UniqueName: \"kubernetes.io/projected/12fba3ab-a03b-40ab-8ed5-ce2b667003da-kube-api-access-wzn45\") pod \"heat-db-create-nn54s\" (UID: \"12fba3ab-a03b-40ab-8ed5-ce2b667003da\") " pod="openstack/heat-db-create-nn54s" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.728302 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12fba3ab-a03b-40ab-8ed5-ce2b667003da-operator-scripts\") pod \"heat-db-create-nn54s\" (UID: \"12fba3ab-a03b-40ab-8ed5-ce2b667003da\") " pod="openstack/heat-db-create-nn54s" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.729430 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-35d1-account-create-update-tbps2"] Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.806225 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-d1ca-account-create-update-wkcbb"] Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.807364 4755 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/cinder-d1ca-account-create-update-wkcbb" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.809247 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.825486 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d1ca-account-create-update-wkcbb"] Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.832830 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzn45\" (UniqueName: \"kubernetes.io/projected/12fba3ab-a03b-40ab-8ed5-ce2b667003da-kube-api-access-wzn45\") pod \"heat-db-create-nn54s\" (UID: \"12fba3ab-a03b-40ab-8ed5-ce2b667003da\") " pod="openstack/heat-db-create-nn54s" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.832865 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48efdbc2-4211-40f2-8f38-c7e2199852ba-operator-scripts\") pod \"heat-35d1-account-create-update-tbps2\" (UID: \"48efdbc2-4211-40f2-8f38-c7e2199852ba\") " pod="openstack/heat-35d1-account-create-update-tbps2" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.832920 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12fba3ab-a03b-40ab-8ed5-ce2b667003da-operator-scripts\") pod \"heat-db-create-nn54s\" (UID: \"12fba3ab-a03b-40ab-8ed5-ce2b667003da\") " pod="openstack/heat-db-create-nn54s" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.832967 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsz86\" (UniqueName: \"kubernetes.io/projected/48efdbc2-4211-40f2-8f38-c7e2199852ba-kube-api-access-bsz86\") pod \"heat-35d1-account-create-update-tbps2\" (UID: 
\"48efdbc2-4211-40f2-8f38-c7e2199852ba\") " pod="openstack/heat-35d1-account-create-update-tbps2" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.833920 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12fba3ab-a03b-40ab-8ed5-ce2b667003da-operator-scripts\") pod \"heat-db-create-nn54s\" (UID: \"12fba3ab-a03b-40ab-8ed5-ce2b667003da\") " pod="openstack/heat-db-create-nn54s" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.860766 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzn45\" (UniqueName: \"kubernetes.io/projected/12fba3ab-a03b-40ab-8ed5-ce2b667003da-kube-api-access-wzn45\") pod \"heat-db-create-nn54s\" (UID: \"12fba3ab-a03b-40ab-8ed5-ce2b667003da\") " pod="openstack/heat-db-create-nn54s" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.896796 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-28cfm"] Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.898149 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-28cfm" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.908983 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-28cfm"] Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.934612 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48efdbc2-4211-40f2-8f38-c7e2199852ba-operator-scripts\") pod \"heat-35d1-account-create-update-tbps2\" (UID: \"48efdbc2-4211-40f2-8f38-c7e2199852ba\") " pod="openstack/heat-35d1-account-create-update-tbps2" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.934899 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66bgr\" (UniqueName: \"kubernetes.io/projected/e5f02541-9d51-424d-b558-15bb417ad5b2-kube-api-access-66bgr\") pod \"cinder-d1ca-account-create-update-wkcbb\" (UID: \"e5f02541-9d51-424d-b558-15bb417ad5b2\") " pod="openstack/cinder-d1ca-account-create-update-wkcbb" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.935057 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsz86\" (UniqueName: \"kubernetes.io/projected/48efdbc2-4211-40f2-8f38-c7e2199852ba-kube-api-access-bsz86\") pod \"heat-35d1-account-create-update-tbps2\" (UID: \"48efdbc2-4211-40f2-8f38-c7e2199852ba\") " pod="openstack/heat-35d1-account-create-update-tbps2" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.935182 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5f02541-9d51-424d-b558-15bb417ad5b2-operator-scripts\") pod \"cinder-d1ca-account-create-update-wkcbb\" (UID: \"e5f02541-9d51-424d-b558-15bb417ad5b2\") " pod="openstack/cinder-d1ca-account-create-update-wkcbb" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.936027 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48efdbc2-4211-40f2-8f38-c7e2199852ba-operator-scripts\") pod \"heat-35d1-account-create-update-tbps2\" (UID: \"48efdbc2-4211-40f2-8f38-c7e2199852ba\") " pod="openstack/heat-35d1-account-create-update-tbps2" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.956658 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsz86\" (UniqueName: \"kubernetes.io/projected/48efdbc2-4211-40f2-8f38-c7e2199852ba-kube-api-access-bsz86\") pod \"heat-35d1-account-create-update-tbps2\" (UID: \"48efdbc2-4211-40f2-8f38-c7e2199852ba\") " pod="openstack/heat-35d1-account-create-update-tbps2" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.984207 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-w7h7z"] Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.985897 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-w7h7z" Mar 17 00:44:02 crc kubenswrapper[4755]: I0317 00:44:02.998378 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-hnc9f"] Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.000984 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-hnc9f" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.004274 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.004625 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8cwbq" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.004742 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.004903 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.013551 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-w7h7z"] Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.028127 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-hnc9f"] Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.033895 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-nn54s" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.037298 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5f02541-9d51-424d-b558-15bb417ad5b2-operator-scripts\") pod \"cinder-d1ca-account-create-update-wkcbb\" (UID: \"e5f02541-9d51-424d-b558-15bb417ad5b2\") " pod="openstack/cinder-d1ca-account-create-update-wkcbb" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.037340 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a5673ab-42bb-4268-a723-b2df9c13904b-operator-scripts\") pod \"cinder-db-create-28cfm\" (UID: \"2a5673ab-42bb-4268-a723-b2df9c13904b\") " pod="openstack/cinder-db-create-28cfm" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.037424 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66bgr\" (UniqueName: \"kubernetes.io/projected/e5f02541-9d51-424d-b558-15bb417ad5b2-kube-api-access-66bgr\") pod \"cinder-d1ca-account-create-update-wkcbb\" (UID: \"e5f02541-9d51-424d-b558-15bb417ad5b2\") " pod="openstack/cinder-d1ca-account-create-update-wkcbb" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.037455 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4p5h\" (UniqueName: \"kubernetes.io/projected/2a5673ab-42bb-4268-a723-b2df9c13904b-kube-api-access-k4p5h\") pod \"cinder-db-create-28cfm\" (UID: \"2a5673ab-42bb-4268-a723-b2df9c13904b\") " pod="openstack/cinder-db-create-28cfm" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.038128 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5f02541-9d51-424d-b558-15bb417ad5b2-operator-scripts\") pod 
\"cinder-d1ca-account-create-update-wkcbb\" (UID: \"e5f02541-9d51-424d-b558-15bb417ad5b2\") " pod="openstack/cinder-d1ca-account-create-update-wkcbb" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.044951 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-35d1-account-create-update-tbps2" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.061604 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66bgr\" (UniqueName: \"kubernetes.io/projected/e5f02541-9d51-424d-b558-15bb417ad5b2-kube-api-access-66bgr\") pod \"cinder-d1ca-account-create-update-wkcbb\" (UID: \"e5f02541-9d51-424d-b558-15bb417ad5b2\") " pod="openstack/cinder-d1ca-account-create-update-wkcbb" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.100642 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-e661-account-create-update-kxqpp"] Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.102873 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e661-account-create-update-kxqpp" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.105973 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.113673 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-qsd9p"] Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.115666 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qsd9p" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.128279 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e661-account-create-update-kxqpp"] Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.139299 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f10d407a-c50c-4f3e-955a-92b2f75d2fd6-operator-scripts\") pod \"barbican-db-create-w7h7z\" (UID: \"f10d407a-c50c-4f3e-955a-92b2f75d2fd6\") " pod="openstack/barbican-db-create-w7h7z" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.139356 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvjss\" (UniqueName: \"kubernetes.io/projected/f10d407a-c50c-4f3e-955a-92b2f75d2fd6-kube-api-access-fvjss\") pod \"barbican-db-create-w7h7z\" (UID: \"f10d407a-c50c-4f3e-955a-92b2f75d2fd6\") " pod="openstack/barbican-db-create-w7h7z" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.139381 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-config-data\") pod \"keystone-db-sync-hnc9f\" (UID: \"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca\") " pod="openstack/keystone-db-sync-hnc9f" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.139466 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-combined-ca-bundle\") pod \"keystone-db-sync-hnc9f\" (UID: \"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca\") " pod="openstack/keystone-db-sync-hnc9f" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.139504 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-9hdsq\" (UniqueName: \"kubernetes.io/projected/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-kube-api-access-9hdsq\") pod \"keystone-db-sync-hnc9f\" (UID: \"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca\") " pod="openstack/keystone-db-sync-hnc9f" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.139591 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a5673ab-42bb-4268-a723-b2df9c13904b-operator-scripts\") pod \"cinder-db-create-28cfm\" (UID: \"2a5673ab-42bb-4268-a723-b2df9c13904b\") " pod="openstack/cinder-db-create-28cfm" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.139749 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d1ca-account-create-update-wkcbb" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.140700 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a5673ab-42bb-4268-a723-b2df9c13904b-operator-scripts\") pod \"cinder-db-create-28cfm\" (UID: \"2a5673ab-42bb-4268-a723-b2df9c13904b\") " pod="openstack/cinder-db-create-28cfm" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.140851 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4p5h\" (UniqueName: \"kubernetes.io/projected/2a5673ab-42bb-4268-a723-b2df9c13904b-kube-api-access-k4p5h\") pod \"cinder-db-create-28cfm\" (UID: \"2a5673ab-42bb-4268-a723-b2df9c13904b\") " pod="openstack/cinder-db-create-28cfm" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.151449 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qsd9p"] Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.170199 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4p5h\" (UniqueName: 
\"kubernetes.io/projected/2a5673ab-42bb-4268-a723-b2df9c13904b-kube-api-access-k4p5h\") pod \"cinder-db-create-28cfm\" (UID: \"2a5673ab-42bb-4268-a723-b2df9c13904b\") " pod="openstack/cinder-db-create-28cfm" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.241519 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4234-account-create-update-czqrb"] Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.242765 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4234-account-create-update-czqrb" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.243459 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shbsh\" (UniqueName: \"kubernetes.io/projected/ab11e25f-07a9-431a-bdbd-bafb6d673e5c-kube-api-access-shbsh\") pod \"neutron-db-create-qsd9p\" (UID: \"ab11e25f-07a9-431a-bdbd-bafb6d673e5c\") " pod="openstack/neutron-db-create-qsd9p" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.243601 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddmg9\" (UniqueName: \"kubernetes.io/projected/722c192f-3110-4799-a25c-def078351bbc-kube-api-access-ddmg9\") pod \"barbican-e661-account-create-update-kxqpp\" (UID: \"722c192f-3110-4799-a25c-def078351bbc\") " pod="openstack/barbican-e661-account-create-update-kxqpp" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.243767 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-combined-ca-bundle\") pod \"keystone-db-sync-hnc9f\" (UID: \"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca\") " pod="openstack/keystone-db-sync-hnc9f" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.243816 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hdsq\" 
(UniqueName: \"kubernetes.io/projected/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-kube-api-access-9hdsq\") pod \"keystone-db-sync-hnc9f\" (UID: \"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca\") " pod="openstack/keystone-db-sync-hnc9f" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.243918 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/722c192f-3110-4799-a25c-def078351bbc-operator-scripts\") pod \"barbican-e661-account-create-update-kxqpp\" (UID: \"722c192f-3110-4799-a25c-def078351bbc\") " pod="openstack/barbican-e661-account-create-update-kxqpp" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.244009 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f10d407a-c50c-4f3e-955a-92b2f75d2fd6-operator-scripts\") pod \"barbican-db-create-w7h7z\" (UID: \"f10d407a-c50c-4f3e-955a-92b2f75d2fd6\") " pod="openstack/barbican-db-create-w7h7z" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.244177 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab11e25f-07a9-431a-bdbd-bafb6d673e5c-operator-scripts\") pod \"neutron-db-create-qsd9p\" (UID: \"ab11e25f-07a9-431a-bdbd-bafb6d673e5c\") " pod="openstack/neutron-db-create-qsd9p" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.244211 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvjss\" (UniqueName: \"kubernetes.io/projected/f10d407a-c50c-4f3e-955a-92b2f75d2fd6-kube-api-access-fvjss\") pod \"barbican-db-create-w7h7z\" (UID: \"f10d407a-c50c-4f3e-955a-92b2f75d2fd6\") " pod="openstack/barbican-db-create-w7h7z" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.244262 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-config-data\") pod \"keystone-db-sync-hnc9f\" (UID: \"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca\") " pod="openstack/keystone-db-sync-hnc9f" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.249987 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-config-data\") pod \"keystone-db-sync-hnc9f\" (UID: \"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca\") " pod="openstack/keystone-db-sync-hnc9f" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.272356 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.276580 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-28cfm" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.279774 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f10d407a-c50c-4f3e-955a-92b2f75d2fd6-operator-scripts\") pod \"barbican-db-create-w7h7z\" (UID: \"f10d407a-c50c-4f3e-955a-92b2f75d2fd6\") " pod="openstack/barbican-db-create-w7h7z" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.292447 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-combined-ca-bundle\") pod \"keystone-db-sync-hnc9f\" (UID: \"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca\") " pod="openstack/keystone-db-sync-hnc9f" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.302591 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hdsq\" (UniqueName: \"kubernetes.io/projected/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-kube-api-access-9hdsq\") pod \"keystone-db-sync-hnc9f\" (UID: 
\"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca\") " pod="openstack/keystone-db-sync-hnc9f" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.305674 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvjss\" (UniqueName: \"kubernetes.io/projected/f10d407a-c50c-4f3e-955a-92b2f75d2fd6-kube-api-access-fvjss\") pod \"barbican-db-create-w7h7z\" (UID: \"f10d407a-c50c-4f3e-955a-92b2f75d2fd6\") " pod="openstack/barbican-db-create-w7h7z" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.320458 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hnc9f" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.328671 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4234-account-create-update-czqrb"] Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.366374 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab11e25f-07a9-431a-bdbd-bafb6d673e5c-operator-scripts\") pod \"neutron-db-create-qsd9p\" (UID: \"ab11e25f-07a9-431a-bdbd-bafb6d673e5c\") " pod="openstack/neutron-db-create-qsd9p" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.366496 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shbsh\" (UniqueName: \"kubernetes.io/projected/ab11e25f-07a9-431a-bdbd-bafb6d673e5c-kube-api-access-shbsh\") pod \"neutron-db-create-qsd9p\" (UID: \"ab11e25f-07a9-431a-bdbd-bafb6d673e5c\") " pod="openstack/neutron-db-create-qsd9p" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.366526 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvbtg\" (UniqueName: \"kubernetes.io/projected/1aa0451d-3211-4eb1-86ab-ca7a573632fc-kube-api-access-tvbtg\") pod \"neutron-4234-account-create-update-czqrb\" (UID: \"1aa0451d-3211-4eb1-86ab-ca7a573632fc\") " 
pod="openstack/neutron-4234-account-create-update-czqrb" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.366553 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddmg9\" (UniqueName: \"kubernetes.io/projected/722c192f-3110-4799-a25c-def078351bbc-kube-api-access-ddmg9\") pod \"barbican-e661-account-create-update-kxqpp\" (UID: \"722c192f-3110-4799-a25c-def078351bbc\") " pod="openstack/barbican-e661-account-create-update-kxqpp" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.366718 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/722c192f-3110-4799-a25c-def078351bbc-operator-scripts\") pod \"barbican-e661-account-create-update-kxqpp\" (UID: \"722c192f-3110-4799-a25c-def078351bbc\") " pod="openstack/barbican-e661-account-create-update-kxqpp" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.366755 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa0451d-3211-4eb1-86ab-ca7a573632fc-operator-scripts\") pod \"neutron-4234-account-create-update-czqrb\" (UID: \"1aa0451d-3211-4eb1-86ab-ca7a573632fc\") " pod="openstack/neutron-4234-account-create-update-czqrb" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.367545 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab11e25f-07a9-431a-bdbd-bafb6d673e5c-operator-scripts\") pod \"neutron-db-create-qsd9p\" (UID: \"ab11e25f-07a9-431a-bdbd-bafb6d673e5c\") " pod="openstack/neutron-db-create-qsd9p" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.368602 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/722c192f-3110-4799-a25c-def078351bbc-operator-scripts\") pod 
\"barbican-e661-account-create-update-kxqpp\" (UID: \"722c192f-3110-4799-a25c-def078351bbc\") " pod="openstack/barbican-e661-account-create-update-kxqpp" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.403180 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddmg9\" (UniqueName: \"kubernetes.io/projected/722c192f-3110-4799-a25c-def078351bbc-kube-api-access-ddmg9\") pod \"barbican-e661-account-create-update-kxqpp\" (UID: \"722c192f-3110-4799-a25c-def078351bbc\") " pod="openstack/barbican-e661-account-create-update-kxqpp" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.407805 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shbsh\" (UniqueName: \"kubernetes.io/projected/ab11e25f-07a9-431a-bdbd-bafb6d673e5c-kube-api-access-shbsh\") pod \"neutron-db-create-qsd9p\" (UID: \"ab11e25f-07a9-431a-bdbd-bafb6d673e5c\") " pod="openstack/neutron-db-create-qsd9p" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.418912 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e661-account-create-update-kxqpp" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.440218 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qsd9p" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.469883 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa0451d-3211-4eb1-86ab-ca7a573632fc-operator-scripts\") pod \"neutron-4234-account-create-update-czqrb\" (UID: \"1aa0451d-3211-4eb1-86ab-ca7a573632fc\") " pod="openstack/neutron-4234-account-create-update-czqrb" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.470008 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvbtg\" (UniqueName: \"kubernetes.io/projected/1aa0451d-3211-4eb1-86ab-ca7a573632fc-kube-api-access-tvbtg\") pod \"neutron-4234-account-create-update-czqrb\" (UID: \"1aa0451d-3211-4eb1-86ab-ca7a573632fc\") " pod="openstack/neutron-4234-account-create-update-czqrb" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.470882 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa0451d-3211-4eb1-86ab-ca7a573632fc-operator-scripts\") pod \"neutron-4234-account-create-update-czqrb\" (UID: \"1aa0451d-3211-4eb1-86ab-ca7a573632fc\") " pod="openstack/neutron-4234-account-create-update-czqrb" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.524130 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvbtg\" (UniqueName: \"kubernetes.io/projected/1aa0451d-3211-4eb1-86ab-ca7a573632fc-kube-api-access-tvbtg\") pod \"neutron-4234-account-create-update-czqrb\" (UID: \"1aa0451d-3211-4eb1-86ab-ca7a573632fc\") " pod="openstack/neutron-4234-account-create-update-czqrb" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.604842 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-w7h7z" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.616781 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4234-account-create-update-czqrb" Mar 17 00:44:03 crc kubenswrapper[4755]: E0317 00:44:03.804686 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="quay.io/prometheus/mysqld-exporter:v0.15.1" Mar 17 00:44:03 crc kubenswrapper[4755]: E0317 00:44:03.804874 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:mysqld-exporter,Image:quay.io/prometheus/mysqld-exporter:v0.15.1,Command:[],Args:[--config.my-cnf=/etc/mysqld-exporter/config.cnf --collect.global_status --collect.global_variables --no-collect.auto_increment.columns --no-collect.binlog_size --no-collect.engine_innodb_status --no-collect.engine_tokudb_status --no-collect.heartbeat --no-collect.heartbeat.utc --no-collect.info_schema.clientstats --no-collect.info_schema.innodb_metrics --no-collect.info_schema.innodb_tablespaces --no-collect.info_schema.innodb_cmp --no-collect.info_schema.innodb_cmpmem --no-collect.info_schema.processlist --no-collect.info_schema.query_response_time --no-collect.info_schema.replica_host --no-collect.info_schema.tables --no-collect.info_schema.tablestats --no-collect.info_schema.schemastats --no-collect.info_schema.userstats --no-collect.mysql.user --no-collect.perf_schema.eventsstatements --no-collect.perf_schema.eventsstatementssum --no-collect.perf_schema.eventswaits --no-collect.perf_schema.file_events --no-collect.perf_schema.file_instances --no-collect.perf_schema.indexiowaits --no-collect.perf_schema.memory_events --no-collect.perf_schema.tableiowaits --no-collect.perf_schema.tablelocks --no-collect.perf_schema.replication_group_members --no-collect.perf_schema.replication_group_member_stats 
--no-collect.perf_schema.replication_applier_status_by_worker --no-collect.slave_status --no-collect.slave_hosts --no-collect.sys.user_summary],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n9bh77h79h77h7h79h5bdh5f7h58bh56h65dh94h5ffh659h545h569h5fbh5c8h547hfchcdh57ch5c6hdhf6h654h75h65fh678hf9h689h5f8q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/mysqld-exporter/config.cnf,SubPath:config.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/mysqld-exporter/web.cnf,SubPath:web.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wc249,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mysqld-exporter-0_openstack(42a2a1da-3480-4f1d-bba8-a725657e9fcd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled" logger="UnhandledError" Mar 17 00:44:03 crc kubenswrapper[4755]: E0317 00:44:03.806889 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysqld-exporter\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/mysqld-exporter-0" podUID="42a2a1da-3480-4f1d-bba8-a725657e9fcd" Mar 17 00:44:03 crc kubenswrapper[4755]: I0317 00:44:03.946962 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-dvvpc" podUID="a6eae7bd-5007-4389-b4ab-7f296d0fa9ce" containerName="ovn-controller" probeResult="failure" output=< Mar 17 00:44:03 crc kubenswrapper[4755]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 17 00:44:03 crc kubenswrapper[4755]: > Mar 17 00:44:04 crc kubenswrapper[4755]: E0317 00:44:04.279888 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysqld-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/prometheus/mysqld-exporter:v0.15.1\\\"\"" pod="openstack/mysqld-exporter-0" podUID="42a2a1da-3480-4f1d-bba8-a725657e9fcd" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.380069 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.496961 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-0\") pod \"e6625cad-73d0-4753-8a77-4d47344b7fad\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.497029 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-thanos-prometheus-http-client-file\") pod \"e6625cad-73d0-4753-8a77-4d47344b7fad\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.497054 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-2\") pod \"e6625cad-73d0-4753-8a77-4d47344b7fad\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.497125 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e6625cad-73d0-4753-8a77-4d47344b7fad-tls-assets\") pod \"e6625cad-73d0-4753-8a77-4d47344b7fad\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.497140 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"e6625cad-73d0-4753-8a77-4d47344b7fad\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 
00:44:04.497164 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-1\") pod \"e6625cad-73d0-4753-8a77-4d47344b7fad\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.497207 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-config\") pod \"e6625cad-73d0-4753-8a77-4d47344b7fad\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.497293 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg5vc\" (UniqueName: \"kubernetes.io/projected/e6625cad-73d0-4753-8a77-4d47344b7fad-kube-api-access-gg5vc\") pod \"e6625cad-73d0-4753-8a77-4d47344b7fad\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.497313 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e6625cad-73d0-4753-8a77-4d47344b7fad-config-out\") pod \"e6625cad-73d0-4753-8a77-4d47344b7fad\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.497370 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-web-config\") pod \"e6625cad-73d0-4753-8a77-4d47344b7fad\" (UID: \"e6625cad-73d0-4753-8a77-4d47344b7fad\") " Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.508597 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-0" 
(OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "e6625cad-73d0-4753-8a77-4d47344b7fad" (UID: "e6625cad-73d0-4753-8a77-4d47344b7fad"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.508920 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "e6625cad-73d0-4753-8a77-4d47344b7fad" (UID: "e6625cad-73d0-4753-8a77-4d47344b7fad"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.509185 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "e6625cad-73d0-4753-8a77-4d47344b7fad" (UID: "e6625cad-73d0-4753-8a77-4d47344b7fad"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.527541 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e6625cad-73d0-4753-8a77-4d47344b7fad" (UID: "e6625cad-73d0-4753-8a77-4d47344b7fad"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.540583 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-config" (OuterVolumeSpecName: "config") pod "e6625cad-73d0-4753-8a77-4d47344b7fad" (UID: "e6625cad-73d0-4753-8a77-4d47344b7fad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.542355 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6625cad-73d0-4753-8a77-4d47344b7fad-config-out" (OuterVolumeSpecName: "config-out") pod "e6625cad-73d0-4753-8a77-4d47344b7fad" (UID: "e6625cad-73d0-4753-8a77-4d47344b7fad"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.543967 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6625cad-73d0-4753-8a77-4d47344b7fad-kube-api-access-gg5vc" (OuterVolumeSpecName: "kube-api-access-gg5vc") pod "e6625cad-73d0-4753-8a77-4d47344b7fad" (UID: "e6625cad-73d0-4753-8a77-4d47344b7fad"). InnerVolumeSpecName "kube-api-access-gg5vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.552856 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6625cad-73d0-4753-8a77-4d47344b7fad-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e6625cad-73d0-4753-8a77-4d47344b7fad" (UID: "e6625cad-73d0-4753-8a77-4d47344b7fad"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.572175 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-web-config" (OuterVolumeSpecName: "web-config") pod "e6625cad-73d0-4753-8a77-4d47344b7fad" (UID: "e6625cad-73d0-4753-8a77-4d47344b7fad"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.581121 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "e6625cad-73d0-4753-8a77-4d47344b7fad" (UID: "e6625cad-73d0-4753-8a77-4d47344b7fad"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.599853 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg5vc\" (UniqueName: \"kubernetes.io/projected/e6625cad-73d0-4753-8a77-4d47344b7fad-kube-api-access-gg5vc\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.599896 4755 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e6625cad-73d0-4753-8a77-4d47344b7fad-config-out\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.599909 4755 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-web-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.599921 4755 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-0\") on 
node \"crc\" DevicePath \"\"" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.599938 4755 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.599951 4755 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.599963 4755 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e6625cad-73d0-4753-8a77-4d47344b7fad-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.600394 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.600418 4755 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e6625cad-73d0-4753-8a77-4d47344b7fad-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.600431 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6625cad-73d0-4753-8a77-4d47344b7fad-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.626574 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 17 00:44:04 crc kubenswrapper[4755]: 
I0317 00:44:04.701910 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.868958 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561804-jbzg7"] Mar 17 00:44:04 crc kubenswrapper[4755]: I0317 00:44:04.946229 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-hnc9f"] Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.268626 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" event={"ID":"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd","Type":"ContainerStarted","Data":"4a4d975184340c035241f9956f12740eafe39b9e26831f9a1fc826c3cfe177e4"} Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.271561 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.298215 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6625cad-73d0-4753-8a77-4d47344b7fad","Type":"ContainerDied","Data":"8b372c42c9e3c0d14bd0e7394e6abea5ec5c989461adbfcfb98cdde4104db347"} Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.298270 4755 scope.go:117] "RemoveContainer" containerID="b37d8f2b6983093384b8cf6da747bb60219d87d1c97a68a55ae8828a900c69f0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.298422 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.305327 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" podStartSLOduration=13.305298231 podStartE2EDuration="13.305298231s" podCreationTimestamp="2026-03-17 00:43:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:44:05.301392368 +0000 UTC m=+1320.060844651" watchObservedRunningTime="2026-03-17 00:44:05.305298231 +0000 UTC m=+1320.064750514" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.313193 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561804-jbzg7" event={"ID":"6ab9a2f0-b327-4c8a-a7ec-97930918a1ac","Type":"ContainerStarted","Data":"a68cd8434a75777fb54bbf1c6610a98c64d1eb6f25aeabea2e8ad678ec9bfabc"} Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.338563 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vjbjp" event={"ID":"0991527c-bb4b-498c-86b3-d303cee4eeb1","Type":"ContainerStarted","Data":"5c2ca20c1adbcd47f53c61b151002dd6fcfa14ce84372a1e1c2c02eb52aead30"} Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.354507 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hnc9f" event={"ID":"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca","Type":"ContainerStarted","Data":"de620c321b79dc5527e8c782ece2d43c2c3b937f1be69b8ccb1061c43f100a05"} Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.381102 4755 scope.go:117] "RemoveContainer" containerID="3a78a7c9f57b03d419c130d5896347893570c516e901c8ad01defb99516ef10d" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.387505 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.406617 4755 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.419957 4755 scope.go:117] "RemoveContainer" containerID="e04dbd8528cf206deadfac630c56d005dfbc5693cdfdbf060355d0b4fff00374" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.421989 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-vjbjp" podStartSLOduration=2.525440297 podStartE2EDuration="21.421966986s" podCreationTimestamp="2026-03-17 00:43:44 +0000 UTC" firstStartedPulling="2026-03-17 00:43:45.137383704 +0000 UTC m=+1299.896835987" lastFinishedPulling="2026-03-17 00:44:04.033910393 +0000 UTC m=+1318.793362676" observedRunningTime="2026-03-17 00:44:05.364866646 +0000 UTC m=+1320.124318929" watchObservedRunningTime="2026-03-17 00:44:05.421966986 +0000 UTC m=+1320.181419259" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.439668 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 17 00:44:05 crc kubenswrapper[4755]: E0317 00:44:05.440157 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerName="thanos-sidecar" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.440174 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerName="thanos-sidecar" Mar 17 00:44:05 crc kubenswrapper[4755]: E0317 00:44:05.440195 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerName="prometheus" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.440202 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerName="prometheus" Mar 17 00:44:05 crc kubenswrapper[4755]: E0317 00:44:05.440213 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerName="config-reloader" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.440220 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerName="config-reloader" Mar 17 00:44:05 crc kubenswrapper[4755]: E0317 00:44:05.440230 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerName="init-config-reloader" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.440236 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerName="init-config-reloader" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.440475 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerName="config-reloader" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.440495 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerName="prometheus" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.440505 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6625cad-73d0-4753-8a77-4d47344b7fad" containerName="thanos-sidecar" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.442216 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.444258 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.444532 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.444641 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.444756 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-kdfvt" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.448301 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.449258 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.449468 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.449620 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.450636 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.454976 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.500039 4755 scope.go:117] "RemoveContainer" 
containerID="4807d127d9a739befc515b172da1f6bb129a30b65aac8982d795fdc0e475d2f2" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.608629 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-w7h7z"] Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.620787 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c1ebdcce-406b-4668-a325-f1f4318b2d69-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.620847 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.620924 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c1ebdcce-406b-4668-a325-f1f4318b2d69-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.620948 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c1ebdcce-406b-4668-a325-f1f4318b2d69-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc 
kubenswrapper[4755]: I0317 00:44:05.620965 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c1ebdcce-406b-4668-a325-f1f4318b2d69-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.620996 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-config\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.621015 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.621042 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6zn9\" (UniqueName: \"kubernetes.io/projected/c1ebdcce-406b-4668-a325-f1f4318b2d69-kube-api-access-w6zn9\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.621077 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc 
kubenswrapper[4755]: I0317 00:44:05.621282 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.621364 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c1ebdcce-406b-4668-a325-f1f4318b2d69-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.621429 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.621684 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.633766 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-dvvpc-config-8p2rk"] Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.733532 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-config\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.733609 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.733649 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6zn9\" (UniqueName: \"kubernetes.io/projected/c1ebdcce-406b-4668-a325-f1f4318b2d69-kube-api-access-w6zn9\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.733693 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.733728 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " 
pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.733747 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c1ebdcce-406b-4668-a325-f1f4318b2d69-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.733771 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.733805 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.733845 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c1ebdcce-406b-4668-a325-f1f4318b2d69-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.733870 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.733926 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c1ebdcce-406b-4668-a325-f1f4318b2d69-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.733953 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c1ebdcce-406b-4668-a325-f1f4318b2d69-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.733977 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c1ebdcce-406b-4668-a325-f1f4318b2d69-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.759071 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-config\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.776112 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/c1ebdcce-406b-4668-a325-f1f4318b2d69-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.778099 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.797620 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.799099 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c1ebdcce-406b-4668-a325-f1f4318b2d69-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.799662 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c1ebdcce-406b-4668-a325-f1f4318b2d69-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.807417 
4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c1ebdcce-406b-4668-a325-f1f4318b2d69-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.808259 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.808766 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.822783 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4234-account-create-update-czqrb"] Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.830052 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c1ebdcce-406b-4668-a325-f1f4318b2d69-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.836147 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: W0317 00:44:05.845805 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddaff8a0c_71c2_4d19_b90d_98663b80ba85.slice/crio-8b445f7add1a28956a30ee04f43b0ca1c27dce6710f67acceae9413796957e0d WatchSource:0}: Error finding container 8b445f7add1a28956a30ee04f43b0ca1c27dce6710f67acceae9413796957e0d: Status 404 returned error can't find the container with id 8b445f7add1a28956a30ee04f43b0ca1c27dce6710f67acceae9413796957e0d Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.854317 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c1ebdcce-406b-4668-a325-f1f4318b2d69-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.876840 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6zn9\" (UniqueName: \"kubernetes.io/projected/c1ebdcce-406b-4668-a325-f1f4318b2d69-kube-api-access-w6zn9\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:05 crc kubenswrapper[4755]: I0317 00:44:05.987256 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"c1ebdcce-406b-4668-a325-f1f4318b2d69\") " pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.018573 4755 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d1ca-account-create-update-wkcbb"] Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.031616 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-nn54s"] Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.045614 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-35d1-account-create-update-tbps2"] Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.057974 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-28cfm"] Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.066203 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-znbjn"] Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.071500 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.084604 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e661-account-create-update-kxqpp"] Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.100528 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qsd9p"] Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.286556 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6625cad-73d0-4753-8a77-4d47344b7fad" path="/var/lib/kubelet/pods/e6625cad-73d0-4753-8a77-4d47344b7fad/volumes" Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.375740 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nn54s" event={"ID":"12fba3ab-a03b-40ab-8ed5-ce2b667003da","Type":"ContainerStarted","Data":"73a7ae161de7b18ac3f5e5c105c0c92c1d0d167afca0153118e7a1c9e1dfc2bf"} Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.377648 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/root-account-create-update-znbjn" event={"ID":"7a46d626-29c2-42a4-88a0-e01284c086fa","Type":"ContainerStarted","Data":"38821881f88cd687a7c774e8c25d26159688054996d2d5a13c14f80184025f23"} Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.378795 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dvvpc-config-8p2rk" event={"ID":"daff8a0c-71c2-4d19-b90d-98663b80ba85","Type":"ContainerStarted","Data":"8b445f7add1a28956a30ee04f43b0ca1c27dce6710f67acceae9413796957e0d"} Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.379895 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d1ca-account-create-update-wkcbb" event={"ID":"e5f02541-9d51-424d-b558-15bb417ad5b2","Type":"ContainerStarted","Data":"d0d0576b3e1aeedb7930827df19c198f8002948195c4b3ee0d07e1908cab2095"} Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.379978 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d1ca-account-create-update-wkcbb" event={"ID":"e5f02541-9d51-424d-b558-15bb417ad5b2","Type":"ContainerStarted","Data":"954c3b8aca5c73e6fc65358c1987df2a9c7e6e10fea06c8292a84568e4b72a5b"} Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.380745 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-28cfm" event={"ID":"2a5673ab-42bb-4268-a723-b2df9c13904b","Type":"ContainerStarted","Data":"3d5343ae74f653d6e5cfdb8d86c23ae191018f3034953db905a2d7569ff3316e"} Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.382201 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-35d1-account-create-update-tbps2" event={"ID":"48efdbc2-4211-40f2-8f38-c7e2199852ba","Type":"ContainerStarted","Data":"c76851ae019f38a6ec96b8b057c29ae227371bd82e63fd9f695aa2e7be2291be"} Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.382242 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-35d1-account-create-update-tbps2" 
event={"ID":"48efdbc2-4211-40f2-8f38-c7e2199852ba","Type":"ContainerStarted","Data":"7952621aae59f063c7453472150c225f9db2533812e69718b9db16a66b8c8392"} Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.386185 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e661-account-create-update-kxqpp" event={"ID":"722c192f-3110-4799-a25c-def078351bbc","Type":"ContainerStarted","Data":"45a84a0b3dd440350109f96cc5b3852cd8bec579ab4fee8df8dd027b4ba953b9"} Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.409292 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-w7h7z" event={"ID":"f10d407a-c50c-4f3e-955a-92b2f75d2fd6","Type":"ContainerStarted","Data":"48e4f39385e61999e762d4bb7949f3dc1469a7a0344c13b46474f658f18e93eb"} Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.409352 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-w7h7z" event={"ID":"f10d407a-c50c-4f3e-955a-92b2f75d2fd6","Type":"ContainerStarted","Data":"e16ed2814ebc59f5a3015b84ef76183ecf14d082bf79020ca548d4bde5e398f2"} Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.411150 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4234-account-create-update-czqrb" event={"ID":"1aa0451d-3211-4eb1-86ab-ca7a573632fc","Type":"ContainerStarted","Data":"4aa0ce8fdd369fc87c3bf86f5cd1ca2eb2f2d1bab6483326de8406125a5463cb"} Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.412270 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qsd9p" event={"ID":"ab11e25f-07a9-431a-bdbd-bafb6d673e5c","Type":"ContainerStarted","Data":"9952ec735398b41d2fff695a389fa19e3a418a6d198acf5109578a67f6e2fb53"} Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.490495 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-w7h7z" podStartSLOduration=4.490413342 podStartE2EDuration="4.490413342s" 
podCreationTimestamp="2026-03-17 00:44:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:44:06.48615744 +0000 UTC m=+1321.245609723" watchObservedRunningTime="2026-03-17 00:44:06.490413342 +0000 UTC m=+1321.249865615" Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.513416 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-35d1-account-create-update-tbps2" podStartSLOduration=4.513399075 podStartE2EDuration="4.513399075s" podCreationTimestamp="2026-03-17 00:44:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:44:06.509043491 +0000 UTC m=+1321.268495774" watchObservedRunningTime="2026-03-17 00:44:06.513399075 +0000 UTC m=+1321.272851348" Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.538129 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-d1ca-account-create-update-wkcbb" podStartSLOduration=4.538110195 podStartE2EDuration="4.538110195s" podCreationTimestamp="2026-03-17 00:44:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:44:06.534311115 +0000 UTC m=+1321.293763398" watchObservedRunningTime="2026-03-17 00:44:06.538110195 +0000 UTC m=+1321.297562478" Mar 17 00:44:06 crc kubenswrapper[4755]: I0317 00:44:06.905366 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 17 00:44:06 crc kubenswrapper[4755]: W0317 00:44:06.960192 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1ebdcce_406b_4668_a325_f1f4318b2d69.slice/crio-24b80ca29f5e0cbdc0b3c0aa0bbc0a91ddc00cc030d7f9f8daec91c007e55338 WatchSource:0}: Error finding container 
24b80ca29f5e0cbdc0b3c0aa0bbc0a91ddc00cc030d7f9f8daec91c007e55338: Status 404 returned error can't find the container with id 24b80ca29f5e0cbdc0b3c0aa0bbc0a91ddc00cc030d7f9f8daec91c007e55338 Mar 17 00:44:07 crc kubenswrapper[4755]: E0317 00:44:07.231226 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab11e25f_07a9_431a_bdbd_bafb6d673e5c.slice/crio-96573cdd374881bd73ad187aad6bb082632929bb78e3a62ee3b389968af611e7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a5673ab_42bb_4268_a723_b2df9c13904b.slice/crio-a96f4d3c3591fb53aeeeee5aca7f4fb83c7d2a1d7084b45fc6a7a7e573d11d39.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab11e25f_07a9_431a_bdbd_bafb6d673e5c.slice/crio-conmon-96573cdd374881bd73ad187aad6bb082632929bb78e3a62ee3b389968af611e7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12fba3ab_a03b_40ab_8ed5_ce2b667003da.slice/crio-88d14a489b5e70cb05fb7d650deb22168ba06563c3275b85289e89aba141bd6a.scope\": RecentStats: unable to find data in memory cache]" Mar 17 00:44:07 crc kubenswrapper[4755]: I0317 00:44:07.435457 4755 generic.go:334] "Generic (PLEG): container finished" podID="12fba3ab-a03b-40ab-8ed5-ce2b667003da" containerID="88d14a489b5e70cb05fb7d650deb22168ba06563c3275b85289e89aba141bd6a" exitCode=0 Mar 17 00:44:07 crc kubenswrapper[4755]: I0317 00:44:07.435532 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nn54s" event={"ID":"12fba3ab-a03b-40ab-8ed5-ce2b667003da","Type":"ContainerDied","Data":"88d14a489b5e70cb05fb7d650deb22168ba06563c3275b85289e89aba141bd6a"} Mar 17 00:44:07 crc kubenswrapper[4755]: I0317 00:44:07.447103 4755 generic.go:334] "Generic (PLEG): 
container finished" podID="f10d407a-c50c-4f3e-955a-92b2f75d2fd6" containerID="48e4f39385e61999e762d4bb7949f3dc1469a7a0344c13b46474f658f18e93eb" exitCode=0 Mar 17 00:44:07 crc kubenswrapper[4755]: I0317 00:44:07.447170 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-w7h7z" event={"ID":"f10d407a-c50c-4f3e-955a-92b2f75d2fd6","Type":"ContainerDied","Data":"48e4f39385e61999e762d4bb7949f3dc1469a7a0344c13b46474f658f18e93eb"} Mar 17 00:44:07 crc kubenswrapper[4755]: I0317 00:44:07.455567 4755 generic.go:334] "Generic (PLEG): container finished" podID="ab11e25f-07a9-431a-bdbd-bafb6d673e5c" containerID="96573cdd374881bd73ad187aad6bb082632929bb78e3a62ee3b389968af611e7" exitCode=0 Mar 17 00:44:07 crc kubenswrapper[4755]: I0317 00:44:07.455661 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qsd9p" event={"ID":"ab11e25f-07a9-431a-bdbd-bafb6d673e5c","Type":"ContainerDied","Data":"96573cdd374881bd73ad187aad6bb082632929bb78e3a62ee3b389968af611e7"} Mar 17 00:44:07 crc kubenswrapper[4755]: I0317 00:44:07.459339 4755 generic.go:334] "Generic (PLEG): container finished" podID="e5f02541-9d51-424d-b558-15bb417ad5b2" containerID="d0d0576b3e1aeedb7930827df19c198f8002948195c4b3ee0d07e1908cab2095" exitCode=0 Mar 17 00:44:07 crc kubenswrapper[4755]: I0317 00:44:07.459411 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d1ca-account-create-update-wkcbb" event={"ID":"e5f02541-9d51-424d-b558-15bb417ad5b2","Type":"ContainerDied","Data":"d0d0576b3e1aeedb7930827df19c198f8002948195c4b3ee0d07e1908cab2095"} Mar 17 00:44:07 crc kubenswrapper[4755]: I0317 00:44:07.471926 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c1ebdcce-406b-4668-a325-f1f4318b2d69","Type":"ContainerStarted","Data":"24b80ca29f5e0cbdc0b3c0aa0bbc0a91ddc00cc030d7f9f8daec91c007e55338"} Mar 17 00:44:07 crc kubenswrapper[4755]: I0317 00:44:07.475861 4755 
generic.go:334] "Generic (PLEG): container finished" podID="2a5673ab-42bb-4268-a723-b2df9c13904b" containerID="a96f4d3c3591fb53aeeeee5aca7f4fb83c7d2a1d7084b45fc6a7a7e573d11d39" exitCode=0 Mar 17 00:44:07 crc kubenswrapper[4755]: I0317 00:44:07.476042 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-28cfm" event={"ID":"2a5673ab-42bb-4268-a723-b2df9c13904b","Type":"ContainerDied","Data":"a96f4d3c3591fb53aeeeee5aca7f4fb83c7d2a1d7084b45fc6a7a7e573d11d39"} Mar 17 00:44:07 crc kubenswrapper[4755]: I0317 00:44:07.491788 4755 generic.go:334] "Generic (PLEG): container finished" podID="48efdbc2-4211-40f2-8f38-c7e2199852ba" containerID="c76851ae019f38a6ec96b8b057c29ae227371bd82e63fd9f695aa2e7be2291be" exitCode=0 Mar 17 00:44:07 crc kubenswrapper[4755]: I0317 00:44:07.492733 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-35d1-account-create-update-tbps2" event={"ID":"48efdbc2-4211-40f2-8f38-c7e2199852ba","Type":"ContainerDied","Data":"c76851ae019f38a6ec96b8b057c29ae227371bd82e63fd9f695aa2e7be2291be"} Mar 17 00:44:08 crc kubenswrapper[4755]: I0317 00:44:08.506894 4755 generic.go:334] "Generic (PLEG): container finished" podID="7a46d626-29c2-42a4-88a0-e01284c086fa" containerID="d6bb708814cc468246420939d8e13273cca9282f63af238e2fba0887d65641b9" exitCode=0 Mar 17 00:44:08 crc kubenswrapper[4755]: I0317 00:44:08.506947 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-znbjn" event={"ID":"7a46d626-29c2-42a4-88a0-e01284c086fa","Type":"ContainerDied","Data":"d6bb708814cc468246420939d8e13273cca9282f63af238e2fba0887d65641b9"} Mar 17 00:44:08 crc kubenswrapper[4755]: I0317 00:44:08.511299 4755 generic.go:334] "Generic (PLEG): container finished" podID="1aa0451d-3211-4eb1-86ab-ca7a573632fc" containerID="8d2decf981321ce435c9a278695392e690d26970478d9859a22c8e7bdb116afd" exitCode=0 Mar 17 00:44:08 crc kubenswrapper[4755]: I0317 00:44:08.511388 4755 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-4234-account-create-update-czqrb" event={"ID":"1aa0451d-3211-4eb1-86ab-ca7a573632fc","Type":"ContainerDied","Data":"8d2decf981321ce435c9a278695392e690d26970478d9859a22c8e7bdb116afd"} Mar 17 00:44:08 crc kubenswrapper[4755]: I0317 00:44:08.513672 4755 generic.go:334] "Generic (PLEG): container finished" podID="daff8a0c-71c2-4d19-b90d-98663b80ba85" containerID="ae14e1774c8730637bf84f356775ffbb7884d44f92f657ceed5908d174039c72" exitCode=0 Mar 17 00:44:08 crc kubenswrapper[4755]: I0317 00:44:08.513788 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dvvpc-config-8p2rk" event={"ID":"daff8a0c-71c2-4d19-b90d-98663b80ba85","Type":"ContainerDied","Data":"ae14e1774c8730637bf84f356775ffbb7884d44f92f657ceed5908d174039c72"} Mar 17 00:44:08 crc kubenswrapper[4755]: I0317 00:44:08.516418 4755 generic.go:334] "Generic (PLEG): container finished" podID="6ab9a2f0-b327-4c8a-a7ec-97930918a1ac" containerID="edf8e8d35128c3c0f643bc4f209edab536c9a45653b99c0ad5aa889e46b99df0" exitCode=0 Mar 17 00:44:08 crc kubenswrapper[4755]: I0317 00:44:08.516546 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561804-jbzg7" event={"ID":"6ab9a2f0-b327-4c8a-a7ec-97930918a1ac","Type":"ContainerDied","Data":"edf8e8d35128c3c0f643bc4f209edab536c9a45653b99c0ad5aa889e46b99df0"} Mar 17 00:44:08 crc kubenswrapper[4755]: I0317 00:44:08.525237 4755 generic.go:334] "Generic (PLEG): container finished" podID="722c192f-3110-4799-a25c-def078351bbc" containerID="b3e1540ba80b3dbf83910ead5a6a8ce7235bf895fd62eee767db77e469e91421" exitCode=0 Mar 17 00:44:08 crc kubenswrapper[4755]: I0317 00:44:08.525316 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e661-account-create-update-kxqpp" event={"ID":"722c192f-3110-4799-a25c-def078351bbc","Type":"ContainerDied","Data":"b3e1540ba80b3dbf83910ead5a6a8ce7235bf895fd62eee767db77e469e91421"} Mar 17 00:44:08 crc 
kubenswrapper[4755]: I0317 00:44:08.914264 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-dvvpc" Mar 17 00:44:10 crc kubenswrapper[4755]: I0317 00:44:10.553649 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c1ebdcce-406b-4668-a325-f1f4318b2d69","Type":"ContainerStarted","Data":"fe99e945fd2ee0f21f2f8df752e4c1e55b91389f967d077cee1d3d8978ba28da"} Mar 17 00:44:11 crc kubenswrapper[4755]: I0317 00:44:11.871894 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561804-jbzg7" Mar 17 00:44:11 crc kubenswrapper[4755]: I0317 00:44:11.878845 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d1ca-account-create-update-wkcbb" Mar 17 00:44:11 crc kubenswrapper[4755]: I0317 00:44:11.886911 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-28cfm" Mar 17 00:44:11 crc kubenswrapper[4755]: I0317 00:44:11.919319 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e661-account-create-update-kxqpp" Mar 17 00:44:11 crc kubenswrapper[4755]: I0317 00:44:11.939085 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:44:11 crc kubenswrapper[4755]: I0317 00:44:11.963760 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-w7h7z" Mar 17 00:44:11 crc kubenswrapper[4755]: I0317 00:44:11.973541 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-35d1-account-create-update-tbps2" Mar 17 00:44:11 crc kubenswrapper[4755]: I0317 00:44:11.986419 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4234-account-create-update-czqrb" Mar 17 00:44:11 crc kubenswrapper[4755]: I0317 00:44:11.993544 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nn54s" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.013364 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-znbjn" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.019661 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qsd9p" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.055041 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-log-ovn\") pod \"daff8a0c-71c2-4d19-b90d-98663b80ba85\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.055098 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/daff8a0c-71c2-4d19-b90d-98663b80ba85-additional-scripts\") pod \"daff8a0c-71c2-4d19-b90d-98663b80ba85\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.055143 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66bgr\" (UniqueName: \"kubernetes.io/projected/e5f02541-9d51-424d-b558-15bb417ad5b2-kube-api-access-66bgr\") pod \"e5f02541-9d51-424d-b558-15bb417ad5b2\" (UID: \"e5f02541-9d51-424d-b558-15bb417ad5b2\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.055200 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4p5h\" (UniqueName: \"kubernetes.io/projected/2a5673ab-42bb-4268-a723-b2df9c13904b-kube-api-access-k4p5h\") pod 
\"2a5673ab-42bb-4268-a723-b2df9c13904b\" (UID: \"2a5673ab-42bb-4268-a723-b2df9c13904b\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.055188 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "daff8a0c-71c2-4d19-b90d-98663b80ba85" (UID: "daff8a0c-71c2-4d19-b90d-98663b80ba85"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.055219 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-run\") pod \"daff8a0c-71c2-4d19-b90d-98663b80ba85\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.055266 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daff8a0c-71c2-4d19-b90d-98663b80ba85-scripts\") pod \"daff8a0c-71c2-4d19-b90d-98663b80ba85\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.055292 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnnhv\" (UniqueName: \"kubernetes.io/projected/6ab9a2f0-b327-4c8a-a7ec-97930918a1ac-kube-api-access-qnnhv\") pod \"6ab9a2f0-b327-4c8a-a7ec-97930918a1ac\" (UID: \"6ab9a2f0-b327-4c8a-a7ec-97930918a1ac\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.055329 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qwlz\" (UniqueName: \"kubernetes.io/projected/daff8a0c-71c2-4d19-b90d-98663b80ba85-kube-api-access-8qwlz\") pod \"daff8a0c-71c2-4d19-b90d-98663b80ba85\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 
00:44:12.055395 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a5673ab-42bb-4268-a723-b2df9c13904b-operator-scripts\") pod \"2a5673ab-42bb-4268-a723-b2df9c13904b\" (UID: \"2a5673ab-42bb-4268-a723-b2df9c13904b\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.055483 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-run-ovn\") pod \"daff8a0c-71c2-4d19-b90d-98663b80ba85\" (UID: \"daff8a0c-71c2-4d19-b90d-98663b80ba85\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.055516 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5f02541-9d51-424d-b558-15bb417ad5b2-operator-scripts\") pod \"e5f02541-9d51-424d-b558-15bb417ad5b2\" (UID: \"e5f02541-9d51-424d-b558-15bb417ad5b2\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.055538 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddmg9\" (UniqueName: \"kubernetes.io/projected/722c192f-3110-4799-a25c-def078351bbc-kube-api-access-ddmg9\") pod \"722c192f-3110-4799-a25c-def078351bbc\" (UID: \"722c192f-3110-4799-a25c-def078351bbc\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.055559 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/722c192f-3110-4799-a25c-def078351bbc-operator-scripts\") pod \"722c192f-3110-4799-a25c-def078351bbc\" (UID: \"722c192f-3110-4799-a25c-def078351bbc\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.055908 4755 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-log-ovn\") on node \"crc\" DevicePath 
\"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.056366 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/722c192f-3110-4799-a25c-def078351bbc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "722c192f-3110-4799-a25c-def078351bbc" (UID: "722c192f-3110-4799-a25c-def078351bbc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.056769 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a5673ab-42bb-4268-a723-b2df9c13904b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a5673ab-42bb-4268-a723-b2df9c13904b" (UID: "2a5673ab-42bb-4268-a723-b2df9c13904b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.056804 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "daff8a0c-71c2-4d19-b90d-98663b80ba85" (UID: "daff8a0c-71c2-4d19-b90d-98663b80ba85"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.057133 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f02541-9d51-424d-b558-15bb417ad5b2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5f02541-9d51-424d-b558-15bb417ad5b2" (UID: "e5f02541-9d51-424d-b558-15bb417ad5b2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.057412 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daff8a0c-71c2-4d19-b90d-98663b80ba85-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "daff8a0c-71c2-4d19-b90d-98663b80ba85" (UID: "daff8a0c-71c2-4d19-b90d-98663b80ba85"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.057797 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-run" (OuterVolumeSpecName: "var-run") pod "daff8a0c-71c2-4d19-b90d-98663b80ba85" (UID: "daff8a0c-71c2-4d19-b90d-98663b80ba85"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.059677 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daff8a0c-71c2-4d19-b90d-98663b80ba85-scripts" (OuterVolumeSpecName: "scripts") pod "daff8a0c-71c2-4d19-b90d-98663b80ba85" (UID: "daff8a0c-71c2-4d19-b90d-98663b80ba85"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.061131 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daff8a0c-71c2-4d19-b90d-98663b80ba85-kube-api-access-8qwlz" (OuterVolumeSpecName: "kube-api-access-8qwlz") pod "daff8a0c-71c2-4d19-b90d-98663b80ba85" (UID: "daff8a0c-71c2-4d19-b90d-98663b80ba85"). InnerVolumeSpecName "kube-api-access-8qwlz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.064035 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a5673ab-42bb-4268-a723-b2df9c13904b-kube-api-access-k4p5h" (OuterVolumeSpecName: "kube-api-access-k4p5h") pod "2a5673ab-42bb-4268-a723-b2df9c13904b" (UID: "2a5673ab-42bb-4268-a723-b2df9c13904b"). InnerVolumeSpecName "kube-api-access-k4p5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.069301 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ab9a2f0-b327-4c8a-a7ec-97930918a1ac-kube-api-access-qnnhv" (OuterVolumeSpecName: "kube-api-access-qnnhv") pod "6ab9a2f0-b327-4c8a-a7ec-97930918a1ac" (UID: "6ab9a2f0-b327-4c8a-a7ec-97930918a1ac"). InnerVolumeSpecName "kube-api-access-qnnhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.070664 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/722c192f-3110-4799-a25c-def078351bbc-kube-api-access-ddmg9" (OuterVolumeSpecName: "kube-api-access-ddmg9") pod "722c192f-3110-4799-a25c-def078351bbc" (UID: "722c192f-3110-4799-a25c-def078351bbc"). InnerVolumeSpecName "kube-api-access-ddmg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.087983 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f02541-9d51-424d-b558-15bb417ad5b2-kube-api-access-66bgr" (OuterVolumeSpecName: "kube-api-access-66bgr") pod "e5f02541-9d51-424d-b558-15bb417ad5b2" (UID: "e5f02541-9d51-424d-b558-15bb417ad5b2"). InnerVolumeSpecName "kube-api-access-66bgr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.156497 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a46d626-29c2-42a4-88a0-e01284c086fa-operator-scripts\") pod \"7a46d626-29c2-42a4-88a0-e01284c086fa\" (UID: \"7a46d626-29c2-42a4-88a0-e01284c086fa\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.156560 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12fba3ab-a03b-40ab-8ed5-ce2b667003da-operator-scripts\") pod \"12fba3ab-a03b-40ab-8ed5-ce2b667003da\" (UID: \"12fba3ab-a03b-40ab-8ed5-ce2b667003da\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.156613 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shbsh\" (UniqueName: \"kubernetes.io/projected/ab11e25f-07a9-431a-bdbd-bafb6d673e5c-kube-api-access-shbsh\") pod \"ab11e25f-07a9-431a-bdbd-bafb6d673e5c\" (UID: \"ab11e25f-07a9-431a-bdbd-bafb6d673e5c\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.156636 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzn45\" (UniqueName: \"kubernetes.io/projected/12fba3ab-a03b-40ab-8ed5-ce2b667003da-kube-api-access-wzn45\") pod \"12fba3ab-a03b-40ab-8ed5-ce2b667003da\" (UID: \"12fba3ab-a03b-40ab-8ed5-ce2b667003da\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.156683 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvjss\" (UniqueName: \"kubernetes.io/projected/f10d407a-c50c-4f3e-955a-92b2f75d2fd6-kube-api-access-fvjss\") pod \"f10d407a-c50c-4f3e-955a-92b2f75d2fd6\" (UID: \"f10d407a-c50c-4f3e-955a-92b2f75d2fd6\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.156710 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab11e25f-07a9-431a-bdbd-bafb6d673e5c-operator-scripts\") pod \"ab11e25f-07a9-431a-bdbd-bafb6d673e5c\" (UID: \"ab11e25f-07a9-431a-bdbd-bafb6d673e5c\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.156767 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa0451d-3211-4eb1-86ab-ca7a573632fc-operator-scripts\") pod \"1aa0451d-3211-4eb1-86ab-ca7a573632fc\" (UID: \"1aa0451d-3211-4eb1-86ab-ca7a573632fc\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.156807 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48efdbc2-4211-40f2-8f38-c7e2199852ba-operator-scripts\") pod \"48efdbc2-4211-40f2-8f38-c7e2199852ba\" (UID: \"48efdbc2-4211-40f2-8f38-c7e2199852ba\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.156824 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brq5f\" (UniqueName: \"kubernetes.io/projected/7a46d626-29c2-42a4-88a0-e01284c086fa-kube-api-access-brq5f\") pod \"7a46d626-29c2-42a4-88a0-e01284c086fa\" (UID: \"7a46d626-29c2-42a4-88a0-e01284c086fa\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.156855 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvbtg\" (UniqueName: \"kubernetes.io/projected/1aa0451d-3211-4eb1-86ab-ca7a573632fc-kube-api-access-tvbtg\") pod \"1aa0451d-3211-4eb1-86ab-ca7a573632fc\" (UID: \"1aa0451d-3211-4eb1-86ab-ca7a573632fc\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.156897 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsz86\" (UniqueName: \"kubernetes.io/projected/48efdbc2-4211-40f2-8f38-c7e2199852ba-kube-api-access-bsz86\") pod 
\"48efdbc2-4211-40f2-8f38-c7e2199852ba\" (UID: \"48efdbc2-4211-40f2-8f38-c7e2199852ba\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.156933 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f10d407a-c50c-4f3e-955a-92b2f75d2fd6-operator-scripts\") pod \"f10d407a-c50c-4f3e-955a-92b2f75d2fd6\" (UID: \"f10d407a-c50c-4f3e-955a-92b2f75d2fd6\") " Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.157273 4755 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.157287 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5f02541-9d51-424d-b558-15bb417ad5b2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.157299 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddmg9\" (UniqueName: \"kubernetes.io/projected/722c192f-3110-4799-a25c-def078351bbc-kube-api-access-ddmg9\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.157309 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/722c192f-3110-4799-a25c-def078351bbc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.157317 4755 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/daff8a0c-71c2-4d19-b90d-98663b80ba85-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.157327 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66bgr\" (UniqueName: 
\"kubernetes.io/projected/e5f02541-9d51-424d-b558-15bb417ad5b2-kube-api-access-66bgr\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.157337 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4p5h\" (UniqueName: \"kubernetes.io/projected/2a5673ab-42bb-4268-a723-b2df9c13904b-kube-api-access-k4p5h\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.157346 4755 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/daff8a0c-71c2-4d19-b90d-98663b80ba85-var-run\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.157354 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daff8a0c-71c2-4d19-b90d-98663b80ba85-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.157364 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnnhv\" (UniqueName: \"kubernetes.io/projected/6ab9a2f0-b327-4c8a-a7ec-97930918a1ac-kube-api-access-qnnhv\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.157372 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qwlz\" (UniqueName: \"kubernetes.io/projected/daff8a0c-71c2-4d19-b90d-98663b80ba85-kube-api-access-8qwlz\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.157380 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a5673ab-42bb-4268-a723-b2df9c13904b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.157672 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f10d407a-c50c-4f3e-955a-92b2f75d2fd6-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "f10d407a-c50c-4f3e-955a-92b2f75d2fd6" (UID: "f10d407a-c50c-4f3e-955a-92b2f75d2fd6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.157927 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aa0451d-3211-4eb1-86ab-ca7a573632fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1aa0451d-3211-4eb1-86ab-ca7a573632fc" (UID: "1aa0451d-3211-4eb1-86ab-ca7a573632fc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.158204 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12fba3ab-a03b-40ab-8ed5-ce2b667003da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "12fba3ab-a03b-40ab-8ed5-ce2b667003da" (UID: "12fba3ab-a03b-40ab-8ed5-ce2b667003da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.158213 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a46d626-29c2-42a4-88a0-e01284c086fa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a46d626-29c2-42a4-88a0-e01284c086fa" (UID: "7a46d626-29c2-42a4-88a0-e01284c086fa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.158290 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48efdbc2-4211-40f2-8f38-c7e2199852ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "48efdbc2-4211-40f2-8f38-c7e2199852ba" (UID: "48efdbc2-4211-40f2-8f38-c7e2199852ba"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.159078 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab11e25f-07a9-431a-bdbd-bafb6d673e5c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ab11e25f-07a9-431a-bdbd-bafb6d673e5c" (UID: "ab11e25f-07a9-431a-bdbd-bafb6d673e5c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.163869 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab11e25f-07a9-431a-bdbd-bafb6d673e5c-kube-api-access-shbsh" (OuterVolumeSpecName: "kube-api-access-shbsh") pod "ab11e25f-07a9-431a-bdbd-bafb6d673e5c" (UID: "ab11e25f-07a9-431a-bdbd-bafb6d673e5c"). InnerVolumeSpecName "kube-api-access-shbsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.163930 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48efdbc2-4211-40f2-8f38-c7e2199852ba-kube-api-access-bsz86" (OuterVolumeSpecName: "kube-api-access-bsz86") pod "48efdbc2-4211-40f2-8f38-c7e2199852ba" (UID: "48efdbc2-4211-40f2-8f38-c7e2199852ba"). InnerVolumeSpecName "kube-api-access-bsz86". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.163966 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12fba3ab-a03b-40ab-8ed5-ce2b667003da-kube-api-access-wzn45" (OuterVolumeSpecName: "kube-api-access-wzn45") pod "12fba3ab-a03b-40ab-8ed5-ce2b667003da" (UID: "12fba3ab-a03b-40ab-8ed5-ce2b667003da"). InnerVolumeSpecName "kube-api-access-wzn45". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.163982 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a46d626-29c2-42a4-88a0-e01284c086fa-kube-api-access-brq5f" (OuterVolumeSpecName: "kube-api-access-brq5f") pod "7a46d626-29c2-42a4-88a0-e01284c086fa" (UID: "7a46d626-29c2-42a4-88a0-e01284c086fa"). InnerVolumeSpecName "kube-api-access-brq5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.164014 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10d407a-c50c-4f3e-955a-92b2f75d2fd6-kube-api-access-fvjss" (OuterVolumeSpecName: "kube-api-access-fvjss") pod "f10d407a-c50c-4f3e-955a-92b2f75d2fd6" (UID: "f10d407a-c50c-4f3e-955a-92b2f75d2fd6"). InnerVolumeSpecName "kube-api-access-fvjss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.164267 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa0451d-3211-4eb1-86ab-ca7a573632fc-kube-api-access-tvbtg" (OuterVolumeSpecName: "kube-api-access-tvbtg") pod "1aa0451d-3211-4eb1-86ab-ca7a573632fc" (UID: "1aa0451d-3211-4eb1-86ab-ca7a573632fc"). InnerVolumeSpecName "kube-api-access-tvbtg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.258711 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shbsh\" (UniqueName: \"kubernetes.io/projected/ab11e25f-07a9-431a-bdbd-bafb6d673e5c-kube-api-access-shbsh\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.258744 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzn45\" (UniqueName: \"kubernetes.io/projected/12fba3ab-a03b-40ab-8ed5-ce2b667003da-kube-api-access-wzn45\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.258756 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvjss\" (UniqueName: \"kubernetes.io/projected/f10d407a-c50c-4f3e-955a-92b2f75d2fd6-kube-api-access-fvjss\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.258769 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab11e25f-07a9-431a-bdbd-bafb6d673e5c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.258781 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa0451d-3211-4eb1-86ab-ca7a573632fc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.258792 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48efdbc2-4211-40f2-8f38-c7e2199852ba-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.258802 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brq5f\" (UniqueName: \"kubernetes.io/projected/7a46d626-29c2-42a4-88a0-e01284c086fa-kube-api-access-brq5f\") on node \"crc\" DevicePath \"\"" 
Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.258815 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvbtg\" (UniqueName: \"kubernetes.io/projected/1aa0451d-3211-4eb1-86ab-ca7a573632fc-kube-api-access-tvbtg\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.258826 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsz86\" (UniqueName: \"kubernetes.io/projected/48efdbc2-4211-40f2-8f38-c7e2199852ba-kube-api-access-bsz86\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.258837 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f10d407a-c50c-4f3e-955a-92b2f75d2fd6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.258848 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a46d626-29c2-42a4-88a0-e01284c086fa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.258859 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12fba3ab-a03b-40ab-8ed5-ce2b667003da-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.573654 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-35d1-account-create-update-tbps2" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.574344 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-35d1-account-create-update-tbps2" event={"ID":"48efdbc2-4211-40f2-8f38-c7e2199852ba","Type":"ContainerDied","Data":"7952621aae59f063c7453472150c225f9db2533812e69718b9db16a66b8c8392"} Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.574392 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7952621aae59f063c7453472150c225f9db2533812e69718b9db16a66b8c8392" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.576397 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nn54s" event={"ID":"12fba3ab-a03b-40ab-8ed5-ce2b667003da","Type":"ContainerDied","Data":"73a7ae161de7b18ac3f5e5c105c0c92c1d0d167afca0153118e7a1c9e1dfc2bf"} Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.576429 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73a7ae161de7b18ac3f5e5c105c0c92c1d0d167afca0153118e7a1c9e1dfc2bf" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.576492 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nn54s" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.578923 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4234-account-create-update-czqrb" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.578936 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4234-account-create-update-czqrb" event={"ID":"1aa0451d-3211-4eb1-86ab-ca7a573632fc","Type":"ContainerDied","Data":"4aa0ce8fdd369fc87c3bf86f5cd1ca2eb2f2d1bab6483326de8406125a5463cb"} Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.578975 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4aa0ce8fdd369fc87c3bf86f5cd1ca2eb2f2d1bab6483326de8406125a5463cb" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.581401 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qsd9p" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.581423 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qsd9p" event={"ID":"ab11e25f-07a9-431a-bdbd-bafb6d673e5c","Type":"ContainerDied","Data":"9952ec735398b41d2fff695a389fa19e3a418a6d198acf5109578a67f6e2fb53"} Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.581488 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9952ec735398b41d2fff695a389fa19e3a418a6d198acf5109578a67f6e2fb53" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.583201 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dvvpc-config-8p2rk" event={"ID":"daff8a0c-71c2-4d19-b90d-98663b80ba85","Type":"ContainerDied","Data":"8b445f7add1a28956a30ee04f43b0ca1c27dce6710f67acceae9413796957e0d"} Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.583287 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b445f7add1a28956a30ee04f43b0ca1c27dce6710f67acceae9413796957e0d" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.583228 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dvvpc-config-8p2rk" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.585172 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561804-jbzg7" event={"ID":"6ab9a2f0-b327-4c8a-a7ec-97930918a1ac","Type":"ContainerDied","Data":"a68cd8434a75777fb54bbf1c6610a98c64d1eb6f25aeabea2e8ad678ec9bfabc"} Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.585252 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a68cd8434a75777fb54bbf1c6610a98c64d1eb6f25aeabea2e8ad678ec9bfabc" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.585257 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561804-jbzg7" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.586636 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-28cfm" event={"ID":"2a5673ab-42bb-4268-a723-b2df9c13904b","Type":"ContainerDied","Data":"3d5343ae74f653d6e5cfdb8d86c23ae191018f3034953db905a2d7569ff3316e"} Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.586676 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d5343ae74f653d6e5cfdb8d86c23ae191018f3034953db905a2d7569ff3316e" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.586729 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-28cfm" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.588072 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hnc9f" event={"ID":"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca","Type":"ContainerStarted","Data":"e841319fb482c895ce7406c93f46dab7d9a57025a6650e3cd6bc3cccf7dfa458"} Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.589825 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-znbjn" event={"ID":"7a46d626-29c2-42a4-88a0-e01284c086fa","Type":"ContainerDied","Data":"38821881f88cd687a7c774e8c25d26159688054996d2d5a13c14f80184025f23"} Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.589848 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38821881f88cd687a7c774e8c25d26159688054996d2d5a13c14f80184025f23" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.589888 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-znbjn" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.594238 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-w7h7z" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.594276 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-w7h7z" event={"ID":"f10d407a-c50c-4f3e-955a-92b2f75d2fd6","Type":"ContainerDied","Data":"e16ed2814ebc59f5a3015b84ef76183ecf14d082bf79020ca548d4bde5e398f2"} Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.594330 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e16ed2814ebc59f5a3015b84ef76183ecf14d082bf79020ca548d4bde5e398f2" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.596304 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d1ca-account-create-update-wkcbb" event={"ID":"e5f02541-9d51-424d-b558-15bb417ad5b2","Type":"ContainerDied","Data":"954c3b8aca5c73e6fc65358c1987df2a9c7e6e10fea06c8292a84568e4b72a5b"} Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.596342 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d1ca-account-create-update-wkcbb" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.596345 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="954c3b8aca5c73e6fc65358c1987df2a9c7e6e10fea06c8292a84568e4b72a5b" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.598015 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e661-account-create-update-kxqpp" event={"ID":"722c192f-3110-4799-a25c-def078351bbc","Type":"ContainerDied","Data":"45a84a0b3dd440350109f96cc5b3852cd8bec579ab4fee8df8dd027b4ba953b9"} Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.598045 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45a84a0b3dd440350109f96cc5b3852cd8bec579ab4fee8df8dd027b4ba953b9" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.598033 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e661-account-create-update-kxqpp" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.623637 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-hnc9f" podStartSLOduration=3.873469256 podStartE2EDuration="10.62361923s" podCreationTimestamp="2026-03-17 00:44:02 +0000 UTC" firstStartedPulling="2026-03-17 00:44:04.971657597 +0000 UTC m=+1319.731109870" lastFinishedPulling="2026-03-17 00:44:11.721807521 +0000 UTC m=+1326.481259844" observedRunningTime="2026-03-17 00:44:12.616241697 +0000 UTC m=+1327.375693980" watchObservedRunningTime="2026-03-17 00:44:12.62361923 +0000 UTC m=+1327.383071523" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.631588 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.717823 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-m5788"] Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.718387 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-m5788" podUID="2cd96a96-5f56-4b8a-a198-8d7ad6b81018" containerName="dnsmasq-dns" containerID="cri-o://0397474b9cd3a6c94041118cb8807a1a81d17e0cab0ac50a0acfa7ea81b1c471" gracePeriod=10 Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.977909 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561798-htkqk"] Mar 17 00:44:12 crc kubenswrapper[4755]: I0317 00:44:12.995524 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561798-htkqk"] Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.197150 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dvvpc-config-8p2rk"] Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.205503 4755 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dvvpc-config-8p2rk"] Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.279540 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.393489 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdsdf\" (UniqueName: \"kubernetes.io/projected/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-kube-api-access-zdsdf\") pod \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.393551 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-ovsdbserver-nb\") pod \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.393585 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-config\") pod \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.393666 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-dns-svc\") pod \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\" (UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.393721 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-ovsdbserver-sb\") pod \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\" 
(UID: \"2cd96a96-5f56-4b8a-a198-8d7ad6b81018\") " Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.410627 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-kube-api-access-zdsdf" (OuterVolumeSpecName: "kube-api-access-zdsdf") pod "2cd96a96-5f56-4b8a-a198-8d7ad6b81018" (UID: "2cd96a96-5f56-4b8a-a198-8d7ad6b81018"). InnerVolumeSpecName "kube-api-access-zdsdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.448094 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-config" (OuterVolumeSpecName: "config") pod "2cd96a96-5f56-4b8a-a198-8d7ad6b81018" (UID: "2cd96a96-5f56-4b8a-a198-8d7ad6b81018"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.487925 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2cd96a96-5f56-4b8a-a198-8d7ad6b81018" (UID: "2cd96a96-5f56-4b8a-a198-8d7ad6b81018"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.498403 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdsdf\" (UniqueName: \"kubernetes.io/projected/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-kube-api-access-zdsdf\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.498625 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.498683 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.519908 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2cd96a96-5f56-4b8a-a198-8d7ad6b81018" (UID: "2cd96a96-5f56-4b8a-a198-8d7ad6b81018"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.570886 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2cd96a96-5f56-4b8a-a198-8d7ad6b81018" (UID: "2cd96a96-5f56-4b8a-a198-8d7ad6b81018"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.599788 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.599818 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cd96a96-5f56-4b8a-a198-8d7ad6b81018-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.608058 4755 generic.go:334] "Generic (PLEG): container finished" podID="2cd96a96-5f56-4b8a-a198-8d7ad6b81018" containerID="0397474b9cd3a6c94041118cb8807a1a81d17e0cab0ac50a0acfa7ea81b1c471" exitCode=0 Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.608127 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-m5788" Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.608144 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-m5788" event={"ID":"2cd96a96-5f56-4b8a-a198-8d7ad6b81018","Type":"ContainerDied","Data":"0397474b9cd3a6c94041118cb8807a1a81d17e0cab0ac50a0acfa7ea81b1c471"} Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.608189 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-m5788" event={"ID":"2cd96a96-5f56-4b8a-a198-8d7ad6b81018","Type":"ContainerDied","Data":"aef963831347f5acb50e6bcc64371f90ea725f7dd4bb90b68d19a60fc2849f29"} Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.608206 4755 scope.go:117] "RemoveContainer" containerID="0397474b9cd3a6c94041118cb8807a1a81d17e0cab0ac50a0acfa7ea81b1c471" Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.611026 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="0991527c-bb4b-498c-86b3-d303cee4eeb1" containerID="5c2ca20c1adbcd47f53c61b151002dd6fcfa14ce84372a1e1c2c02eb52aead30" exitCode=0 Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.611489 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vjbjp" event={"ID":"0991527c-bb4b-498c-86b3-d303cee4eeb1","Type":"ContainerDied","Data":"5c2ca20c1adbcd47f53c61b151002dd6fcfa14ce84372a1e1c2c02eb52aead30"} Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.694741 4755 scope.go:117] "RemoveContainer" containerID="854a13dc0243c1a36144f4cd7a164e8362b87da4201abf7e704c7364339a85a6" Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.698546 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-m5788"] Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.705450 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-m5788"] Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.715760 4755 scope.go:117] "RemoveContainer" containerID="0397474b9cd3a6c94041118cb8807a1a81d17e0cab0ac50a0acfa7ea81b1c471" Mar 17 00:44:13 crc kubenswrapper[4755]: E0317 00:44:13.716249 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0397474b9cd3a6c94041118cb8807a1a81d17e0cab0ac50a0acfa7ea81b1c471\": container with ID starting with 0397474b9cd3a6c94041118cb8807a1a81d17e0cab0ac50a0acfa7ea81b1c471 not found: ID does not exist" containerID="0397474b9cd3a6c94041118cb8807a1a81d17e0cab0ac50a0acfa7ea81b1c471" Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.716310 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0397474b9cd3a6c94041118cb8807a1a81d17e0cab0ac50a0acfa7ea81b1c471"} err="failed to get container status \"0397474b9cd3a6c94041118cb8807a1a81d17e0cab0ac50a0acfa7ea81b1c471\": rpc error: code = NotFound desc = could not find container 
\"0397474b9cd3a6c94041118cb8807a1a81d17e0cab0ac50a0acfa7ea81b1c471\": container with ID starting with 0397474b9cd3a6c94041118cb8807a1a81d17e0cab0ac50a0acfa7ea81b1c471 not found: ID does not exist" Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.716333 4755 scope.go:117] "RemoveContainer" containerID="854a13dc0243c1a36144f4cd7a164e8362b87da4201abf7e704c7364339a85a6" Mar 17 00:44:13 crc kubenswrapper[4755]: E0317 00:44:13.716613 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"854a13dc0243c1a36144f4cd7a164e8362b87da4201abf7e704c7364339a85a6\": container with ID starting with 854a13dc0243c1a36144f4cd7a164e8362b87da4201abf7e704c7364339a85a6 not found: ID does not exist" containerID="854a13dc0243c1a36144f4cd7a164e8362b87da4201abf7e704c7364339a85a6" Mar 17 00:44:13 crc kubenswrapper[4755]: I0317 00:44:13.716657 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"854a13dc0243c1a36144f4cd7a164e8362b87da4201abf7e704c7364339a85a6"} err="failed to get container status \"854a13dc0243c1a36144f4cd7a164e8362b87da4201abf7e704c7364339a85a6\": rpc error: code = NotFound desc = could not find container \"854a13dc0243c1a36144f4cd7a164e8362b87da4201abf7e704c7364339a85a6\": container with ID starting with 854a13dc0243c1a36144f4cd7a164e8362b87da4201abf7e704c7364339a85a6 not found: ID does not exist" Mar 17 00:44:14 crc kubenswrapper[4755]: I0317 00:44:14.263553 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd96a96-5f56-4b8a-a198-8d7ad6b81018" path="/var/lib/kubelet/pods/2cd96a96-5f56-4b8a-a198-8d7ad6b81018/volumes" Mar 17 00:44:14 crc kubenswrapper[4755]: I0317 00:44:14.265888 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aaedc1e-542e-4f33-acf6-a5de25bdedef" path="/var/lib/kubelet/pods/5aaedc1e-542e-4f33-acf6-a5de25bdedef/volumes" Mar 17 00:44:14 crc kubenswrapper[4755]: I0317 00:44:14.267544 
4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daff8a0c-71c2-4d19-b90d-98663b80ba85" path="/var/lib/kubelet/pods/daff8a0c-71c2-4d19-b90d-98663b80ba85/volumes" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.133501 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vjbjp" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.227595 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-combined-ca-bundle\") pod \"0991527c-bb4b-498c-86b3-d303cee4eeb1\" (UID: \"0991527c-bb4b-498c-86b3-d303cee4eeb1\") " Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.227876 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-db-sync-config-data\") pod \"0991527c-bb4b-498c-86b3-d303cee4eeb1\" (UID: \"0991527c-bb4b-498c-86b3-d303cee4eeb1\") " Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.227925 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-config-data\") pod \"0991527c-bb4b-498c-86b3-d303cee4eeb1\" (UID: \"0991527c-bb4b-498c-86b3-d303cee4eeb1\") " Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.228024 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp2kv\" (UniqueName: \"kubernetes.io/projected/0991527c-bb4b-498c-86b3-d303cee4eeb1-kube-api-access-wp2kv\") pod \"0991527c-bb4b-498c-86b3-d303cee4eeb1\" (UID: \"0991527c-bb4b-498c-86b3-d303cee4eeb1\") " Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.233332 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0991527c-bb4b-498c-86b3-d303cee4eeb1" (UID: "0991527c-bb4b-498c-86b3-d303cee4eeb1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.233632 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0991527c-bb4b-498c-86b3-d303cee4eeb1-kube-api-access-wp2kv" (OuterVolumeSpecName: "kube-api-access-wp2kv") pod "0991527c-bb4b-498c-86b3-d303cee4eeb1" (UID: "0991527c-bb4b-498c-86b3-d303cee4eeb1"). InnerVolumeSpecName "kube-api-access-wp2kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.260653 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0991527c-bb4b-498c-86b3-d303cee4eeb1" (UID: "0991527c-bb4b-498c-86b3-d303cee4eeb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.299069 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-config-data" (OuterVolumeSpecName: "config-data") pod "0991527c-bb4b-498c-86b3-d303cee4eeb1" (UID: "0991527c-bb4b-498c-86b3-d303cee4eeb1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.329812 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.329846 4755 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.329856 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0991527c-bb4b-498c-86b3-d303cee4eeb1-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.329864 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp2kv\" (UniqueName: \"kubernetes.io/projected/0991527c-bb4b-498c-86b3-d303cee4eeb1-kube-api-access-wp2kv\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.634980 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vjbjp" event={"ID":"0991527c-bb4b-498c-86b3-d303cee4eeb1","Type":"ContainerDied","Data":"f3e23c5a374c4247f2fbf603d289192f5fe30af07b2adf5eb318790b081b9b11"} Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.635207 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3e23c5a374c4247f2fbf603d289192f5fe30af07b2adf5eb318790b081b9b11" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.635026 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vjbjp" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.637505 4755 generic.go:334] "Generic (PLEG): container finished" podID="ca2dc7f6-f91c-4e3c-a360-a464608fd8ca" containerID="e841319fb482c895ce7406c93f46dab7d9a57025a6650e3cd6bc3cccf7dfa458" exitCode=0 Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.637628 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hnc9f" event={"ID":"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca","Type":"ContainerDied","Data":"e841319fb482c895ce7406c93f46dab7d9a57025a6650e3cd6bc3cccf7dfa458"} Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.639627 4755 generic.go:334] "Generic (PLEG): container finished" podID="c1ebdcce-406b-4668-a325-f1f4318b2d69" containerID="fe99e945fd2ee0f21f2f8df752e4c1e55b91389f967d077cee1d3d8978ba28da" exitCode=0 Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.639666 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c1ebdcce-406b-4668-a325-f1f4318b2d69","Type":"ContainerDied","Data":"fe99e945fd2ee0f21f2f8df752e4c1e55b91389f967d077cee1d3d8978ba28da"} Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.962797 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-652qb"] Mar 17 00:44:15 crc kubenswrapper[4755]: E0317 00:44:15.963414 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0991527c-bb4b-498c-86b3-d303cee4eeb1" containerName="glance-db-sync" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963426 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0991527c-bb4b-498c-86b3-d303cee4eeb1" containerName="glance-db-sync" Mar 17 00:44:15 crc kubenswrapper[4755]: E0317 00:44:15.963449 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab11e25f-07a9-431a-bdbd-bafb6d673e5c" containerName="mariadb-database-create" Mar 17 00:44:15 crc 
kubenswrapper[4755]: I0317 00:44:15.963455 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab11e25f-07a9-431a-bdbd-bafb6d673e5c" containerName="mariadb-database-create" Mar 17 00:44:15 crc kubenswrapper[4755]: E0317 00:44:15.963466 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a46d626-29c2-42a4-88a0-e01284c086fa" containerName="mariadb-account-create-update" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963472 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a46d626-29c2-42a4-88a0-e01284c086fa" containerName="mariadb-account-create-update" Mar 17 00:44:15 crc kubenswrapper[4755]: E0317 00:44:15.963485 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd96a96-5f56-4b8a-a198-8d7ad6b81018" containerName="init" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963490 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd96a96-5f56-4b8a-a198-8d7ad6b81018" containerName="init" Mar 17 00:44:15 crc kubenswrapper[4755]: E0317 00:44:15.963496 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10d407a-c50c-4f3e-955a-92b2f75d2fd6" containerName="mariadb-database-create" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963503 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10d407a-c50c-4f3e-955a-92b2f75d2fd6" containerName="mariadb-database-create" Mar 17 00:44:15 crc kubenswrapper[4755]: E0317 00:44:15.963518 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="722c192f-3110-4799-a25c-def078351bbc" containerName="mariadb-account-create-update" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963525 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="722c192f-3110-4799-a25c-def078351bbc" containerName="mariadb-account-create-update" Mar 17 00:44:15 crc kubenswrapper[4755]: E0317 00:44:15.963537 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f02541-9d51-424d-b558-15bb417ad5b2" 
containerName="mariadb-account-create-update" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963543 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f02541-9d51-424d-b558-15bb417ad5b2" containerName="mariadb-account-create-update" Mar 17 00:44:15 crc kubenswrapper[4755]: E0317 00:44:15.963557 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd96a96-5f56-4b8a-a198-8d7ad6b81018" containerName="dnsmasq-dns" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963564 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd96a96-5f56-4b8a-a198-8d7ad6b81018" containerName="dnsmasq-dns" Mar 17 00:44:15 crc kubenswrapper[4755]: E0317 00:44:15.963572 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa0451d-3211-4eb1-86ab-ca7a573632fc" containerName="mariadb-account-create-update" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963577 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa0451d-3211-4eb1-86ab-ca7a573632fc" containerName="mariadb-account-create-update" Mar 17 00:44:15 crc kubenswrapper[4755]: E0317 00:44:15.963587 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab9a2f0-b327-4c8a-a7ec-97930918a1ac" containerName="oc" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963593 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab9a2f0-b327-4c8a-a7ec-97930918a1ac" containerName="oc" Mar 17 00:44:15 crc kubenswrapper[4755]: E0317 00:44:15.963606 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a5673ab-42bb-4268-a723-b2df9c13904b" containerName="mariadb-database-create" Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963612 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a5673ab-42bb-4268-a723-b2df9c13904b" containerName="mariadb-database-create" Mar 17 00:44:15 crc kubenswrapper[4755]: E0317 00:44:15.963623 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="daff8a0c-71c2-4d19-b90d-98663b80ba85" containerName="ovn-config"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963628 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="daff8a0c-71c2-4d19-b90d-98663b80ba85" containerName="ovn-config"
Mar 17 00:44:15 crc kubenswrapper[4755]: E0317 00:44:15.963641 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12fba3ab-a03b-40ab-8ed5-ce2b667003da" containerName="mariadb-database-create"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963647 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="12fba3ab-a03b-40ab-8ed5-ce2b667003da" containerName="mariadb-database-create"
Mar 17 00:44:15 crc kubenswrapper[4755]: E0317 00:44:15.963656 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48efdbc2-4211-40f2-8f38-c7e2199852ba" containerName="mariadb-account-create-update"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963662 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="48efdbc2-4211-40f2-8f38-c7e2199852ba" containerName="mariadb-account-create-update"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963846 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="722c192f-3110-4799-a25c-def078351bbc" containerName="mariadb-account-create-update"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963857 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a5673ab-42bb-4268-a723-b2df9c13904b" containerName="mariadb-database-create"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963870 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd96a96-5f56-4b8a-a198-8d7ad6b81018" containerName="dnsmasq-dns"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963879 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="12fba3ab-a03b-40ab-8ed5-ce2b667003da" containerName="mariadb-database-create"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963888 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="48efdbc2-4211-40f2-8f38-c7e2199852ba" containerName="mariadb-account-create-update"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963895 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="0991527c-bb4b-498c-86b3-d303cee4eeb1" containerName="glance-db-sync"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963902 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa0451d-3211-4eb1-86ab-ca7a573632fc" containerName="mariadb-account-create-update"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963909 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="daff8a0c-71c2-4d19-b90d-98663b80ba85" containerName="ovn-config"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963916 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10d407a-c50c-4f3e-955a-92b2f75d2fd6" containerName="mariadb-database-create"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963925 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab9a2f0-b327-4c8a-a7ec-97930918a1ac" containerName="oc"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963935 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f02541-9d51-424d-b558-15bb417ad5b2" containerName="mariadb-account-create-update"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963945 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a46d626-29c2-42a4-88a0-e01284c086fa" containerName="mariadb-account-create-update"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.963957 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab11e25f-07a9-431a-bdbd-bafb6d673e5c" containerName="mariadb-database-create"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.964967 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:15 crc kubenswrapper[4755]: I0317 00:44:15.984849 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-652qb"]
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.151764 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.153235 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.153386 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-config\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.153623 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.153779 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.153977 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcdmp\" (UniqueName: \"kubernetes.io/projected/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-kube-api-access-hcdmp\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.255149 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.255226 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.255277 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcdmp\" (UniqueName: \"kubernetes.io/projected/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-kube-api-access-hcdmp\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.255295 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.255340 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.256083 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.256083 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.256174 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.256222 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-config\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.256769 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.256821 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-config\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.282615 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcdmp\" (UniqueName: \"kubernetes.io/projected/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-kube-api-access-hcdmp\") pod \"dnsmasq-dns-7ff5475cc9-652qb\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.313681 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-652qb"
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.650665 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c1ebdcce-406b-4668-a325-f1f4318b2d69","Type":"ContainerStarted","Data":"a477261dd3ac0a281f0aab862b7caf6b88d97a861634127285102f2470e5b1a0"}
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.898745 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-652qb"]
Mar 17 00:44:16 crc kubenswrapper[4755]: W0317 00:44:16.910215 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2092b5ef_18db_44ce_b8f4_e642cb6c5ae9.slice/crio-d30f866f3842eef0c73a2caef2c1a2b57cb803815ce2575dbc2e9dc9188f8210 WatchSource:0}: Error finding container d30f866f3842eef0c73a2caef2c1a2b57cb803815ce2575dbc2e9dc9188f8210: Status 404 returned error can't find the container with id d30f866f3842eef0c73a2caef2c1a2b57cb803815ce2575dbc2e9dc9188f8210
Mar 17 00:44:16 crc kubenswrapper[4755]: I0317 00:44:16.977031 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hnc9f"
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.071235 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hdsq\" (UniqueName: \"kubernetes.io/projected/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-kube-api-access-9hdsq\") pod \"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca\" (UID: \"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca\") "
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.071289 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-combined-ca-bundle\") pod \"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca\" (UID: \"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca\") "
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.071512 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-config-data\") pod \"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca\" (UID: \"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca\") "
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.077734 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-kube-api-access-9hdsq" (OuterVolumeSpecName: "kube-api-access-9hdsq") pod "ca2dc7f6-f91c-4e3c-a360-a464608fd8ca" (UID: "ca2dc7f6-f91c-4e3c-a360-a464608fd8ca"). InnerVolumeSpecName "kube-api-access-9hdsq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.097566 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca2dc7f6-f91c-4e3c-a360-a464608fd8ca" (UID: "ca2dc7f6-f91c-4e3c-a360-a464608fd8ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.119583 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-config-data" (OuterVolumeSpecName: "config-data") pod "ca2dc7f6-f91c-4e3c-a360-a464608fd8ca" (UID: "ca2dc7f6-f91c-4e3c-a360-a464608fd8ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.173582 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-config-data\") on node \"crc\" DevicePath \"\""
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.173610 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hdsq\" (UniqueName: \"kubernetes.io/projected/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-kube-api-access-9hdsq\") on node \"crc\" DevicePath \"\""
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.173622 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 17 00:44:17 crc kubenswrapper[4755]: E0317 00:44:17.461202 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2092b5ef_18db_44ce_b8f4_e642cb6c5ae9.slice/crio-ec1e25950e5cde93f7822c5d45eb00853358a6f8f3fc01c3675384df9af4cbb3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2092b5ef_18db_44ce_b8f4_e642cb6c5ae9.slice/crio-conmon-ec1e25950e5cde93f7822c5d45eb00853358a6f8f3fc01c3675384df9af4cbb3.scope\": RecentStats: unable to find data in memory cache]"
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.659420 4755 generic.go:334] "Generic (PLEG): container finished" podID="2092b5ef-18db-44ce-b8f4-e642cb6c5ae9" containerID="ec1e25950e5cde93f7822c5d45eb00853358a6f8f3fc01c3675384df9af4cbb3" exitCode=0
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.659533 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-652qb" event={"ID":"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9","Type":"ContainerDied","Data":"ec1e25950e5cde93f7822c5d45eb00853358a6f8f3fc01c3675384df9af4cbb3"}
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.659559 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-652qb" event={"ID":"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9","Type":"ContainerStarted","Data":"d30f866f3842eef0c73a2caef2c1a2b57cb803815ce2575dbc2e9dc9188f8210"}
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.660972 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hnc9f" event={"ID":"ca2dc7f6-f91c-4e3c-a360-a464608fd8ca","Type":"ContainerDied","Data":"de620c321b79dc5527e8c782ece2d43c2c3b937f1be69b8ccb1061c43f100a05"}
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.660991 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de620c321b79dc5527e8c782ece2d43c2c3b937f1be69b8ccb1061c43f100a05"
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.661078 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hnc9f"
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.915251 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-652qb"]
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.944528 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"]
Mar 17 00:44:17 crc kubenswrapper[4755]: E0317 00:44:17.944981 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2dc7f6-f91c-4e3c-a360-a464608fd8ca" containerName="keystone-db-sync"
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.944998 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2dc7f6-f91c-4e3c-a360-a464608fd8ca" containerName="keystone-db-sync"
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.945200 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca2dc7f6-f91c-4e3c-a360-a464608fd8ca" containerName="keystone-db-sync"
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.946240 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:17 crc kubenswrapper[4755]: I0317 00:44:17.990931 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"]
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.024206 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jp67d"]
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.029159 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.035122 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.035697 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.035704 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8cwbq"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.037621 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jp67d"]
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.038954 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.039039 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.104485 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.104562 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-config\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.104588 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.104609 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.104690 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.104714 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p6xh\" (UniqueName: \"kubernetes.io/projected/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-kube-api-access-5p6xh\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.136380 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-gdjsh"]
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.138216 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-gdjsh"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.141837 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-srx5n"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.141997 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.170782 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-gdjsh"]
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.215981 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-config-data\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.216053 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.216119 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-config\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.216144 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.216162 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.216196 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-combined-ca-bundle\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.216218 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-fernet-keys\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.216243 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-credential-keys\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.216261 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-scripts\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.216277 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnjd9\" (UniqueName: \"kubernetes.io/projected/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-kube-api-access-fnjd9\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.216301 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.216321 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p6xh\" (UniqueName: \"kubernetes.io/projected/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-kube-api-access-5p6xh\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.217316 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.218076 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-config\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.218806 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.219554 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.220272 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.254752 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p6xh\" (UniqueName: \"kubernetes.io/projected/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-kube-api-access-5p6xh\") pod \"dnsmasq-dns-5c5cc7c5ff-2x8kg\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.267200 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-v7jsc"]
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.268789 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-v7jsc"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.286468 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.286688 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nlshq"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.287184 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-4xd8r"]
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.288476 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4xd8r"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.301138 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.318486 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-v7jsc"]
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.319667 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.319674 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2315f493-9035-4185-b615-e7eed6a246ea-config-data\") pod \"heat-db-sync-gdjsh\" (UID: \"2315f493-9035-4185-b615-e7eed6a246ea\") " pod="openstack/heat-db-sync-gdjsh"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.319904 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.320021 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-combined-ca-bundle\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.320047 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-fernet-keys\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.320095 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-credential-keys\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.320124 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-scripts\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.320144 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnjd9\" (UniqueName: \"kubernetes.io/projected/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-kube-api-access-fnjd9\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.320233 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5t76\" (UniqueName: \"kubernetes.io/projected/2315f493-9035-4185-b615-e7eed6a246ea-kube-api-access-w5t76\") pod \"heat-db-sync-gdjsh\" (UID: \"2315f493-9035-4185-b615-e7eed6a246ea\") " pod="openstack/heat-db-sync-gdjsh"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.320256 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-config-data\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.320323 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2315f493-9035-4185-b615-e7eed6a246ea-combined-ca-bundle\") pod \"heat-db-sync-gdjsh\" (UID: \"2315f493-9035-4185-b615-e7eed6a246ea\") " pod="openstack/heat-db-sync-gdjsh"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.326121 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2hp9f"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.326326 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.334267 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-combined-ca-bundle\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.335079 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-credential-keys\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.337852 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-scripts\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.338553 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-config-data\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.344525 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4xd8r"]
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.370222 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-fernet-keys\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.376069 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnjd9\" (UniqueName: \"kubernetes.io/projected/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-kube-api-access-fnjd9\") pod \"keystone-bootstrap-jp67d\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " pod="openstack/keystone-bootstrap-jp67d"
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.392996 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-xksp4"]
Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.394298 4755 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/barbican-db-sync-xksp4" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.399376 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ll7r8" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.399858 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.409599 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xksp4"] Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.423467 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqx68\" (UniqueName: \"kubernetes.io/projected/404d7b5a-9c59-4c63-b3be-740554b83374-kube-api-access-vqx68\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.423602 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-db-sync-config-data\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.423652 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2315f493-9035-4185-b615-e7eed6a246ea-combined-ca-bundle\") pod \"heat-db-sync-gdjsh\" (UID: \"2315f493-9035-4185-b615-e7eed6a246ea\") " pod="openstack/heat-db-sync-gdjsh" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.423752 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/24bf6c90-0673-42ed-b463-d0510425117d-combined-ca-bundle\") pod \"neutron-db-sync-4xd8r\" (UID: \"24bf6c90-0673-42ed-b463-d0510425117d\") " pod="openstack/neutron-db-sync-4xd8r" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.423781 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-config-data\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.423813 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2315f493-9035-4185-b615-e7eed6a246ea-config-data\") pod \"heat-db-sync-gdjsh\" (UID: \"2315f493-9035-4185-b615-e7eed6a246ea\") " pod="openstack/heat-db-sync-gdjsh" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.423917 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkf46\" (UniqueName: \"kubernetes.io/projected/24bf6c90-0673-42ed-b463-d0510425117d-kube-api-access-jkf46\") pod \"neutron-db-sync-4xd8r\" (UID: \"24bf6c90-0673-42ed-b463-d0510425117d\") " pod="openstack/neutron-db-sync-4xd8r" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.424107 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/24bf6c90-0673-42ed-b463-d0510425117d-config\") pod \"neutron-db-sync-4xd8r\" (UID: \"24bf6c90-0673-42ed-b463-d0510425117d\") " pod="openstack/neutron-db-sync-4xd8r" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.424168 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-combined-ca-bundle\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.424205 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-scripts\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.424307 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/404d7b5a-9c59-4c63-b3be-740554b83374-etc-machine-id\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.424405 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5t76\" (UniqueName: \"kubernetes.io/projected/2315f493-9035-4185-b615-e7eed6a246ea-kube-api-access-w5t76\") pod \"heat-db-sync-gdjsh\" (UID: \"2315f493-9035-4185-b615-e7eed6a246ea\") " pod="openstack/heat-db-sync-gdjsh" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.436951 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2315f493-9035-4185-b615-e7eed6a246ea-config-data\") pod \"heat-db-sync-gdjsh\" (UID: \"2315f493-9035-4185-b615-e7eed6a246ea\") " pod="openstack/heat-db-sync-gdjsh" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.471875 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"] Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.492151 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jp67d" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.525723 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-db-sync-config-data\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.525801 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24bf6c90-0673-42ed-b463-d0510425117d-combined-ca-bundle\") pod \"neutron-db-sync-4xd8r\" (UID: \"24bf6c90-0673-42ed-b463-d0510425117d\") " pod="openstack/neutron-db-sync-4xd8r" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.525822 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-config-data\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.525881 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-combined-ca-bundle\") pod \"barbican-db-sync-xksp4\" (UID: \"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f\") " pod="openstack/barbican-db-sync-xksp4" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.525907 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkf46\" (UniqueName: \"kubernetes.io/projected/24bf6c90-0673-42ed-b463-d0510425117d-kube-api-access-jkf46\") pod \"neutron-db-sync-4xd8r\" (UID: \"24bf6c90-0673-42ed-b463-d0510425117d\") " pod="openstack/neutron-db-sync-4xd8r" Mar 17 00:44:18 crc 
kubenswrapper[4755]: I0317 00:44:18.525955 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-db-sync-config-data\") pod \"barbican-db-sync-xksp4\" (UID: \"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f\") " pod="openstack/barbican-db-sync-xksp4" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.525976 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/24bf6c90-0673-42ed-b463-d0510425117d-config\") pod \"neutron-db-sync-4xd8r\" (UID: \"24bf6c90-0673-42ed-b463-d0510425117d\") " pod="openstack/neutron-db-sync-4xd8r" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.526006 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld7pz\" (UniqueName: \"kubernetes.io/projected/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-kube-api-access-ld7pz\") pod \"barbican-db-sync-xksp4\" (UID: \"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f\") " pod="openstack/barbican-db-sync-xksp4" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.526036 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-combined-ca-bundle\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.526058 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-scripts\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.526090 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/404d7b5a-9c59-4c63-b3be-740554b83374-etc-machine-id\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.526129 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqx68\" (UniqueName: \"kubernetes.io/projected/404d7b5a-9c59-4c63-b3be-740554b83374-kube-api-access-vqx68\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.527304 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-bdnb6"] Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.528899 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-bdnb6" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.530822 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/404d7b5a-9c59-4c63-b3be-740554b83374-etc-machine-id\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.553042 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.553241 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8chfh" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.553932 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24bf6c90-0673-42ed-b463-d0510425117d-combined-ca-bundle\") pod \"neutron-db-sync-4xd8r\" (UID: 
\"24bf6c90-0673-42ed-b463-d0510425117d\") " pod="openstack/neutron-db-sync-4xd8r" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.554414 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2315f493-9035-4185-b615-e7eed6a246ea-combined-ca-bundle\") pod \"heat-db-sync-gdjsh\" (UID: \"2315f493-9035-4185-b615-e7eed6a246ea\") " pod="openstack/heat-db-sync-gdjsh" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.554664 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.559210 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/24bf6c90-0673-42ed-b463-d0510425117d-config\") pod \"neutron-db-sync-4xd8r\" (UID: \"24bf6c90-0673-42ed-b463-d0510425117d\") " pod="openstack/neutron-db-sync-4xd8r" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.564641 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5t76\" (UniqueName: \"kubernetes.io/projected/2315f493-9035-4185-b615-e7eed6a246ea-kube-api-access-w5t76\") pod \"heat-db-sync-gdjsh\" (UID: \"2315f493-9035-4185-b615-e7eed6a246ea\") " pod="openstack/heat-db-sync-gdjsh" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.564713 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-db-sync-config-data\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.586103 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-bdnb6"] Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.591107 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-jkf46\" (UniqueName: \"kubernetes.io/projected/24bf6c90-0673-42ed-b463-d0510425117d-kube-api-access-jkf46\") pod \"neutron-db-sync-4xd8r\" (UID: \"24bf6c90-0673-42ed-b463-d0510425117d\") " pod="openstack/neutron-db-sync-4xd8r" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.599366 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqx68\" (UniqueName: \"kubernetes.io/projected/404d7b5a-9c59-4c63-b3be-740554b83374-kube-api-access-vqx68\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.631112 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-scripts\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.631208 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-82q9v"] Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.631770 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-config-data\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.631952 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-combined-ca-bundle\") pod \"cinder-db-sync-v7jsc\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.632757 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.632808 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-combined-ca-bundle\") pod \"barbican-db-sync-xksp4\" (UID: \"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f\") " pod="openstack/barbican-db-sync-xksp4" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.633955 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbb8013b-627b-4894-945e-178871516870-logs\") pod \"placement-db-sync-bdnb6\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " pod="openstack/placement-db-sync-bdnb6" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.634049 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s82d\" (UniqueName: \"kubernetes.io/projected/bbb8013b-627b-4894-945e-178871516870-kube-api-access-8s82d\") pod \"placement-db-sync-bdnb6\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " pod="openstack/placement-db-sync-bdnb6" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.634117 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-db-sync-config-data\") pod \"barbican-db-sync-xksp4\" (UID: \"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f\") " pod="openstack/barbican-db-sync-xksp4" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.634190 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld7pz\" (UniqueName: \"kubernetes.io/projected/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-kube-api-access-ld7pz\") pod \"barbican-db-sync-xksp4\" (UID: \"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f\") " 
pod="openstack/barbican-db-sync-xksp4" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.634326 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-combined-ca-bundle\") pod \"placement-db-sync-bdnb6\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " pod="openstack/placement-db-sync-bdnb6" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.634393 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-config-data\") pod \"placement-db-sync-bdnb6\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " pod="openstack/placement-db-sync-bdnb6" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.634491 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-scripts\") pod \"placement-db-sync-bdnb6\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " pod="openstack/placement-db-sync-bdnb6" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.646451 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-82q9v"] Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.657239 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-db-sync-config-data\") pod \"barbican-db-sync-xksp4\" (UID: \"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f\") " pod="openstack/barbican-db-sync-xksp4" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.662263 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-combined-ca-bundle\") pod \"barbican-db-sync-xksp4\" (UID: \"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f\") " pod="openstack/barbican-db-sync-xksp4" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.678621 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld7pz\" (UniqueName: \"kubernetes.io/projected/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-kube-api-access-ld7pz\") pod \"barbican-db-sync-xksp4\" (UID: \"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f\") " pod="openstack/barbican-db-sync-xksp4" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.740895 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-652qb" event={"ID":"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9","Type":"ContainerStarted","Data":"084a197b2db993b338223a1c51983cf24773751e13f663e65f1aa7e2abfc3bec"} Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.741301 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff5475cc9-652qb" podUID="2092b5ef-18db-44ce-b8f4-e642cb6c5ae9" containerName="dnsmasq-dns" containerID="cri-o://084a197b2db993b338223a1c51983cf24773751e13f663e65f1aa7e2abfc3bec" gracePeriod=10 Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.741780 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5475cc9-652qb" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.743229 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-combined-ca-bundle\") pod \"placement-db-sync-bdnb6\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " pod="openstack/placement-db-sync-bdnb6" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.743266 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-config-data\") pod \"placement-db-sync-bdnb6\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " pod="openstack/placement-db-sync-bdnb6" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.743294 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.743351 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-config\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.743375 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-scripts\") pod \"placement-db-sync-bdnb6\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " pod="openstack/placement-db-sync-bdnb6" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.743415 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb6ks\" (UniqueName: \"kubernetes.io/projected/d1684140-79c2-4f36-9755-7127141107ee-kube-api-access-mb6ks\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.743496 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.743521 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbb8013b-627b-4894-945e-178871516870-logs\") pod \"placement-db-sync-bdnb6\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " pod="openstack/placement-db-sync-bdnb6" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.743558 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.743609 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.743639 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s82d\" (UniqueName: \"kubernetes.io/projected/bbb8013b-627b-4894-945e-178871516870-kube-api-access-8s82d\") pod \"placement-db-sync-bdnb6\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " pod="openstack/placement-db-sync-bdnb6" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.745599 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbb8013b-627b-4894-945e-178871516870-logs\") pod 
\"placement-db-sync-bdnb6\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " pod="openstack/placement-db-sync-bdnb6" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.747599 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-combined-ca-bundle\") pod \"placement-db-sync-bdnb6\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " pod="openstack/placement-db-sync-bdnb6" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.749949 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-config-data\") pod \"placement-db-sync-bdnb6\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " pod="openstack/placement-db-sync-bdnb6" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.755653 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-scripts\") pod \"placement-db-sync-bdnb6\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " pod="openstack/placement-db-sync-bdnb6" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.770965 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s82d\" (UniqueName: \"kubernetes.io/projected/bbb8013b-627b-4894-945e-178871516870-kube-api-access-8s82d\") pod \"placement-db-sync-bdnb6\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " pod="openstack/placement-db-sync-bdnb6" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.787115 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5475cc9-652qb" podStartSLOduration=3.7870930940000003 podStartE2EDuration="3.787093094s" podCreationTimestamp="2026-03-17 00:44:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-17 00:44:18.783848239 +0000 UTC m=+1333.543300532" watchObservedRunningTime="2026-03-17 00:44:18.787093094 +0000 UTC m=+1333.546545377" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.820760 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-gdjsh" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.831550 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.844801 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.844997 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-config\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.845090 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb6ks\" (UniqueName: \"kubernetes.io/projected/d1684140-79c2-4f36-9755-7127141107ee-kube-api-access-mb6ks\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.845188 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: 
\"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.845261 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.845336 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.846373 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.847041 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.847637 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-config\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: 
I0317 00:44:18.847861 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.848788 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.887946 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb6ks\" (UniqueName: \"kubernetes.io/projected/d1684140-79c2-4f36-9755-7127141107ee-kube-api-access-mb6ks\") pod \"dnsmasq-dns-8b5c85b87-82q9v\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.896393 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4xd8r" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.917259 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xksp4" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.934380 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-bdnb6" Mar 17 00:44:18 crc kubenswrapper[4755]: I0317 00:44:18.941339 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.020196 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.023434 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.027914 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.028135 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.034307 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.069311 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.069517 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr9wr\" (UniqueName: \"kubernetes.io/projected/756a984f-fd52-4215-b64e-ecd7c9f2851e-kube-api-access-hr9wr\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.069665 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-scripts\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 
00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.069689 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.069808 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/756a984f-fd52-4215-b64e-ecd7c9f2851e-log-httpd\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.069943 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-config-data\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.069991 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/756a984f-fd52-4215-b64e-ecd7c9f2851e-run-httpd\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.175551 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.175617 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hr9wr\" (UniqueName: \"kubernetes.io/projected/756a984f-fd52-4215-b64e-ecd7c9f2851e-kube-api-access-hr9wr\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.175674 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-scripts\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.175691 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.175732 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/756a984f-fd52-4215-b64e-ecd7c9f2851e-log-httpd\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.175790 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-config-data\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.175807 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/756a984f-fd52-4215-b64e-ecd7c9f2851e-run-httpd\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc 
kubenswrapper[4755]: I0317 00:44:19.176300 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/756a984f-fd52-4215-b64e-ecd7c9f2851e-run-httpd\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.177172 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/756a984f-fd52-4215-b64e-ecd7c9f2851e-log-httpd\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.182050 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.182161 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-scripts\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.185159 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.197028 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr9wr\" (UniqueName: \"kubernetes.io/projected/756a984f-fd52-4215-b64e-ecd7c9f2851e-kube-api-access-hr9wr\") pod \"ceilometer-0\" (UID: 
\"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.214324 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"] Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.215605 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-config-data\") pod \"ceilometer-0\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.358952 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jp67d"] Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.369181 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.573454 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-652qb" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.586116 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-dns-svc\") pod \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.586178 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-dns-swift-storage-0\") pod \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.586254 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcdmp\" (UniqueName: \"kubernetes.io/projected/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-kube-api-access-hcdmp\") pod \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.586272 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-ovsdbserver-sb\") pod \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.586293 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-config\") pod \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.586316 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-ovsdbserver-nb\") pod \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\" (UID: \"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9\") " Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.598466 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-kube-api-access-hcdmp" (OuterVolumeSpecName: "kube-api-access-hcdmp") pod "2092b5ef-18db-44ce-b8f4-e642cb6c5ae9" (UID: "2092b5ef-18db-44ce-b8f4-e642cb6c5ae9"). InnerVolumeSpecName "kube-api-access-hcdmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.655269 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2092b5ef-18db-44ce-b8f4-e642cb6c5ae9" (UID: "2092b5ef-18db-44ce-b8f4-e642cb6c5ae9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.670346 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2092b5ef-18db-44ce-b8f4-e642cb6c5ae9" (UID: "2092b5ef-18db-44ce-b8f4-e642cb6c5ae9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.686560 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2092b5ef-18db-44ce-b8f4-e642cb6c5ae9" (UID: "2092b5ef-18db-44ce-b8f4-e642cb6c5ae9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.687997 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.688022 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.688033 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcdmp\" (UniqueName: \"kubernetes.io/projected/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-kube-api-access-hcdmp\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.688042 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.722144 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2092b5ef-18db-44ce-b8f4-e642cb6c5ae9" (UID: "2092b5ef-18db-44ce-b8f4-e642cb6c5ae9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.733561 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-config" (OuterVolumeSpecName: "config") pod "2092b5ef-18db-44ce-b8f4-e642cb6c5ae9" (UID: "2092b5ef-18db-44ce-b8f4-e642cb6c5ae9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.750401 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"42a2a1da-3480-4f1d-bba8-a725657e9fcd","Type":"ContainerStarted","Data":"e8a560b3af01b330cccfc8d71e871639271e3c56d0bd1e7ee86c8d821f29d807"} Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.752775 4755 generic.go:334] "Generic (PLEG): container finished" podID="23f06fa2-6b51-4ae8-b8a4-967a920fed1a" containerID="7d18850943f1dea7a5412e03b1e2825b651383de89f5bc3719e4574fd66c73a1" exitCode=0 Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.752838 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg" event={"ID":"23f06fa2-6b51-4ae8-b8a4-967a920fed1a","Type":"ContainerDied","Data":"7d18850943f1dea7a5412e03b1e2825b651383de89f5bc3719e4574fd66c73a1"} Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.752863 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg" event={"ID":"23f06fa2-6b51-4ae8-b8a4-967a920fed1a","Type":"ContainerStarted","Data":"f641ce2e49e549386d104b49c9d221709285b3ffa1ab1bf53a9ea97f51042a6d"} Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.784648 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=3.163267348 podStartE2EDuration="28.784627857s" podCreationTimestamp="2026-03-17 00:43:51 +0000 UTC" firstStartedPulling="2026-03-17 00:43:52.364017755 +0000 UTC m=+1307.123470038" lastFinishedPulling="2026-03-17 00:44:17.985378264 +0000 UTC m=+1332.744830547" observedRunningTime="2026-03-17 00:44:19.775842386 +0000 UTC m=+1334.535294669" watchObservedRunningTime="2026-03-17 00:44:19.784627857 +0000 UTC m=+1334.544080140" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.785562 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="2092b5ef-18db-44ce-b8f4-e642cb6c5ae9" containerID="084a197b2db993b338223a1c51983cf24773751e13f663e65f1aa7e2abfc3bec" exitCode=0 Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.785629 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-652qb" event={"ID":"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9","Type":"ContainerDied","Data":"084a197b2db993b338223a1c51983cf24773751e13f663e65f1aa7e2abfc3bec"} Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.785674 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-652qb" event={"ID":"2092b5ef-18db-44ce-b8f4-e642cb6c5ae9","Type":"ContainerDied","Data":"d30f866f3842eef0c73a2caef2c1a2b57cb803815ce2575dbc2e9dc9188f8210"} Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.785692 4755 scope.go:117] "RemoveContainer" containerID="084a197b2db993b338223a1c51983cf24773751e13f663e65f1aa7e2abfc3bec" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.785783 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-652qb" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.789866 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.789896 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.805392 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jp67d" event={"ID":"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5","Type":"ContainerStarted","Data":"08b6bb7a007c080b90071dcb45e28cc4462af50803fba3119c51b010bd64c444"} Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.817269 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c1ebdcce-406b-4668-a325-f1f4318b2d69","Type":"ContainerStarted","Data":"2da07f9684967d5fe023313cf7403008e2fddbec46f4ac9df371534e451a74f6"} Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.817311 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c1ebdcce-406b-4668-a325-f1f4318b2d69","Type":"ContainerStarted","Data":"2e65008893f7803dde60370c6bda3151a89d29838892788f11d407829bf60745"} Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.826108 4755 scope.go:117] "RemoveContainer" containerID="ec1e25950e5cde93f7822c5d45eb00853358a6f8f3fc01c3675384df9af4cbb3" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.841104 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jp67d" podStartSLOduration=2.84108239 podStartE2EDuration="2.84108239s" podCreationTimestamp="2026-03-17 00:44:17 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:44:19.825762848 +0000 UTC m=+1334.585215131" watchObservedRunningTime="2026-03-17 00:44:19.84108239 +0000 UTC m=+1334.600534673" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.889714 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=14.889692428 podStartE2EDuration="14.889692428s" podCreationTimestamp="2026-03-17 00:44:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:44:19.870219626 +0000 UTC m=+1334.629671899" watchObservedRunningTime="2026-03-17 00:44:19.889692428 +0000 UTC m=+1334.649144711" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.892269 4755 scope.go:117] "RemoveContainer" containerID="084a197b2db993b338223a1c51983cf24773751e13f663e65f1aa7e2abfc3bec" Mar 17 00:44:19 crc kubenswrapper[4755]: E0317 00:44:19.894182 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"084a197b2db993b338223a1c51983cf24773751e13f663e65f1aa7e2abfc3bec\": container with ID starting with 084a197b2db993b338223a1c51983cf24773751e13f663e65f1aa7e2abfc3bec not found: ID does not exist" containerID="084a197b2db993b338223a1c51983cf24773751e13f663e65f1aa7e2abfc3bec" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.894217 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"084a197b2db993b338223a1c51983cf24773751e13f663e65f1aa7e2abfc3bec"} err="failed to get container status \"084a197b2db993b338223a1c51983cf24773751e13f663e65f1aa7e2abfc3bec\": rpc error: code = NotFound desc = could not find container \"084a197b2db993b338223a1c51983cf24773751e13f663e65f1aa7e2abfc3bec\": container with ID starting with 
084a197b2db993b338223a1c51983cf24773751e13f663e65f1aa7e2abfc3bec not found: ID does not exist" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.894261 4755 scope.go:117] "RemoveContainer" containerID="ec1e25950e5cde93f7822c5d45eb00853358a6f8f3fc01c3675384df9af4cbb3" Mar 17 00:44:19 crc kubenswrapper[4755]: E0317 00:44:19.897344 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec1e25950e5cde93f7822c5d45eb00853358a6f8f3fc01c3675384df9af4cbb3\": container with ID starting with ec1e25950e5cde93f7822c5d45eb00853358a6f8f3fc01c3675384df9af4cbb3 not found: ID does not exist" containerID="ec1e25950e5cde93f7822c5d45eb00853358a6f8f3fc01c3675384df9af4cbb3" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.897372 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec1e25950e5cde93f7822c5d45eb00853358a6f8f3fc01c3675384df9af4cbb3"} err="failed to get container status \"ec1e25950e5cde93f7822c5d45eb00853358a6f8f3fc01c3675384df9af4cbb3\": rpc error: code = NotFound desc = could not find container \"ec1e25950e5cde93f7822c5d45eb00853358a6f8f3fc01c3675384df9af4cbb3\": container with ID starting with ec1e25950e5cde93f7822c5d45eb00853358a6f8f3fc01c3675384df9af4cbb3 not found: ID does not exist" Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.917595 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-652qb"] Mar 17 00:44:19 crc kubenswrapper[4755]: I0317 00:44:19.924667 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-652qb"] Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.015816 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xksp4"] Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.031178 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-82q9v"] Mar 17 00:44:20 
crc kubenswrapper[4755]: I0317 00:44:20.042177 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4xd8r"] Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.057266 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-gdjsh"] Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.070628 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-v7jsc"] Mar 17 00:44:20 crc kubenswrapper[4755]: W0317 00:44:20.072598 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2315f493_9035_4185_b615_e7eed6a246ea.slice/crio-06db9a6034f41e78d6a1d01023adc6401c1f6369d9b94be59a5e4b99ab9882b5 WatchSource:0}: Error finding container 06db9a6034f41e78d6a1d01023adc6401c1f6369d9b94be59a5e4b99ab9882b5: Status 404 returned error can't find the container with id 06db9a6034f41e78d6a1d01023adc6401c1f6369d9b94be59a5e4b99ab9882b5 Mar 17 00:44:20 crc kubenswrapper[4755]: W0317 00:44:20.072901 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod404d7b5a_9c59_4c63_b3be_740554b83374.slice/crio-268ee6dc89dc8b8cbaec776063d10eb60476b4cbc658a2b8e2c792c9facaa274 WatchSource:0}: Error finding container 268ee6dc89dc8b8cbaec776063d10eb60476b4cbc658a2b8e2c792c9facaa274: Status 404 returned error can't find the container with id 268ee6dc89dc8b8cbaec776063d10eb60476b4cbc658a2b8e2c792c9facaa274 Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.145045 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-bdnb6"] Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.283417 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2092b5ef-18db-44ce-b8f4-e642cb6c5ae9" path="/var/lib/kubelet/pods/2092b5ef-18db-44ce-b8f4-e642cb6c5ae9/volumes" Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 
00:44:20.284221 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.410389 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg" Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.513252 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-config\") pod \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.513546 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-ovsdbserver-sb\") pod \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.513594 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-ovsdbserver-nb\") pod \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.513736 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-dns-svc\") pod \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.513773 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-dns-swift-storage-0\") pod \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\" (UID: 
\"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.513805 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p6xh\" (UniqueName: \"kubernetes.io/projected/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-kube-api-access-5p6xh\") pod \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\" (UID: \"23f06fa2-6b51-4ae8-b8a4-967a920fed1a\") " Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.590103 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "23f06fa2-6b51-4ae8-b8a4-967a920fed1a" (UID: "23f06fa2-6b51-4ae8-b8a4-967a920fed1a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.590147 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "23f06fa2-6b51-4ae8-b8a4-967a920fed1a" (UID: "23f06fa2-6b51-4ae8-b8a4-967a920fed1a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.590227 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-config" (OuterVolumeSpecName: "config") pod "23f06fa2-6b51-4ae8-b8a4-967a920fed1a" (UID: "23f06fa2-6b51-4ae8-b8a4-967a920fed1a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.590282 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-kube-api-access-5p6xh" (OuterVolumeSpecName: "kube-api-access-5p6xh") pod "23f06fa2-6b51-4ae8-b8a4-967a920fed1a" (UID: "23f06fa2-6b51-4ae8-b8a4-967a920fed1a"). InnerVolumeSpecName "kube-api-access-5p6xh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.590342 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "23f06fa2-6b51-4ae8-b8a4-967a920fed1a" (UID: "23f06fa2-6b51-4ae8-b8a4-967a920fed1a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.592059 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "23f06fa2-6b51-4ae8-b8a4-967a920fed1a" (UID: "23f06fa2-6b51-4ae8-b8a4-967a920fed1a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.616309 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.616347 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.616361 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.616371 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.616383 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.616396 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p6xh\" (UniqueName: \"kubernetes.io/projected/23f06fa2-6b51-4ae8-b8a4-967a920fed1a-kube-api-access-5p6xh\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.832371 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-gdjsh" event={"ID":"2315f493-9035-4185-b615-e7eed6a246ea","Type":"ContainerStarted","Data":"06db9a6034f41e78d6a1d01023adc6401c1f6369d9b94be59a5e4b99ab9882b5"} Mar 17 00:44:20 crc 
kubenswrapper[4755]: I0317 00:44:20.834450 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"756a984f-fd52-4215-b64e-ecd7c9f2851e","Type":"ContainerStarted","Data":"841268a98b5c8a03ce02bf4f62f9ed61f1d9d6b616eb7360d12d76e9017b0a91"} Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.837345 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-v7jsc" event={"ID":"404d7b5a-9c59-4c63-b3be-740554b83374","Type":"ContainerStarted","Data":"268ee6dc89dc8b8cbaec776063d10eb60476b4cbc658a2b8e2c792c9facaa274"} Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.841125 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4xd8r" event={"ID":"24bf6c90-0673-42ed-b463-d0510425117d","Type":"ContainerStarted","Data":"48ac9b2cd5f688088f4bf4c0c38a31fb4e48e4d147b948f0e0c6967341c23560"} Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.841161 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4xd8r" event={"ID":"24bf6c90-0673-42ed-b463-d0510425117d","Type":"ContainerStarted","Data":"d44f49e80987b7100031208e9a9be6a968b28309fab1a77d90553aa96b57f47b"} Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.845266 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bdnb6" event={"ID":"bbb8013b-627b-4894-945e-178871516870","Type":"ContainerStarted","Data":"5f7d35f68a57b286a1d5ae442c28510b553469eb3980045507f9966883091541"} Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.867115 4755 generic.go:334] "Generic (PLEG): container finished" podID="d1684140-79c2-4f36-9755-7127141107ee" containerID="f35b6cab4b5eb56e0e5e5a52eb208f5ac9f82ca8071a9ae07422d623a4ab5712" exitCode=0 Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.867323 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" 
event={"ID":"d1684140-79c2-4f36-9755-7127141107ee","Type":"ContainerDied","Data":"f35b6cab4b5eb56e0e5e5a52eb208f5ac9f82ca8071a9ae07422d623a4ab5712"} Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.867381 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" event={"ID":"d1684140-79c2-4f36-9755-7127141107ee","Type":"ContainerStarted","Data":"7062e66d8277c94d137d5e13f22df99f1ee5f726ae527c793909f794994ea37f"} Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.871042 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-4xd8r" podStartSLOduration=2.8710275750000003 podStartE2EDuration="2.871027575s" podCreationTimestamp="2026-03-17 00:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:44:20.861304009 +0000 UTC m=+1335.620756292" watchObservedRunningTime="2026-03-17 00:44:20.871027575 +0000 UTC m=+1335.630479858" Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.883064 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jp67d" event={"ID":"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5","Type":"ContainerStarted","Data":"eb9c88561ae24687302fdb4089df66605538ffbc96a8570b9ac9bd2667fec2ee"} Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.902834 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg" event={"ID":"23f06fa2-6b51-4ae8-b8a4-967a920fed1a","Type":"ContainerDied","Data":"f641ce2e49e549386d104b49c9d221709285b3ffa1ab1bf53a9ea97f51042a6d"} Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.902882 4755 scope.go:117] "RemoveContainer" containerID="7d18850943f1dea7a5412e03b1e2825b651383de89f5bc3719e4574fd66c73a1" Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.902946 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg" Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.913454 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xksp4" event={"ID":"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f","Type":"ContainerStarted","Data":"d31022857ea381fe449069f0192303d656d712ce81a8abb8cfe64bdeca6d9ef4"} Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.979312 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"] Mar 17 00:44:20 crc kubenswrapper[4755]: I0317 00:44:20.991567 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-2x8kg"] Mar 17 00:44:21 crc kubenswrapper[4755]: I0317 00:44:21.072931 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:21 crc kubenswrapper[4755]: I0317 00:44:21.072967 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:21 crc kubenswrapper[4755]: I0317 00:44:21.084805 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:21 crc kubenswrapper[4755]: I0317 00:44:21.742769 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:44:21 crc kubenswrapper[4755]: I0317 00:44:21.940893 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" event={"ID":"d1684140-79c2-4f36-9755-7127141107ee","Type":"ContainerStarted","Data":"78243512a1cf8d78e7a81e1b1b7902bad2f234d3b62d75a925ff890636b4fa09"} Mar 17 00:44:21 crc kubenswrapper[4755]: I0317 00:44:21.941036 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:21 crc kubenswrapper[4755]: I0317 00:44:21.967933 4755 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 17 00:44:21 crc kubenswrapper[4755]: I0317 00:44:21.968542 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" podStartSLOduration=3.968523594 podStartE2EDuration="3.968523594s" podCreationTimestamp="2026-03-17 00:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:44:21.966327727 +0000 UTC m=+1336.725780010" watchObservedRunningTime="2026-03-17 00:44:21.968523594 +0000 UTC m=+1336.727975887" Mar 17 00:44:22 crc kubenswrapper[4755]: I0317 00:44:22.269325 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23f06fa2-6b51-4ae8-b8a4-967a920fed1a" path="/var/lib/kubelet/pods/23f06fa2-6b51-4ae8-b8a4-967a920fed1a/volumes" Mar 17 00:44:23 crc kubenswrapper[4755]: I0317 00:44:23.999634 4755 generic.go:334] "Generic (PLEG): container finished" podID="69fb04fb-3dd0-4589-b46d-a34a1f0a19b5" containerID="eb9c88561ae24687302fdb4089df66605538ffbc96a8570b9ac9bd2667fec2ee" exitCode=0 Mar 17 00:44:23 crc kubenswrapper[4755]: I0317 00:44:23.999730 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jp67d" event={"ID":"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5","Type":"ContainerDied","Data":"eb9c88561ae24687302fdb4089df66605538ffbc96a8570b9ac9bd2667fec2ee"} Mar 17 00:44:24 crc kubenswrapper[4755]: I0317 00:44:24.981541 4755 scope.go:117] "RemoveContainer" containerID="c026cc0fcb16c816bcd27db3d03977063a39b40bdead9e159d5e0f9d731a67cb" Mar 17 00:44:26 crc kubenswrapper[4755]: I0317 00:44:26.808063 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jp67d" Mar 17 00:44:26 crc kubenswrapper[4755]: I0317 00:44:26.919094 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-scripts\") pod \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " Mar 17 00:44:26 crc kubenswrapper[4755]: I0317 00:44:26.919166 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-fernet-keys\") pod \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " Mar 17 00:44:26 crc kubenswrapper[4755]: I0317 00:44:26.925206 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "69fb04fb-3dd0-4589-b46d-a34a1f0a19b5" (UID: "69fb04fb-3dd0-4589-b46d-a34a1f0a19b5"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:44:26 crc kubenswrapper[4755]: I0317 00:44:26.928495 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnjd9\" (UniqueName: \"kubernetes.io/projected/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-kube-api-access-fnjd9\") pod \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " Mar 17 00:44:26 crc kubenswrapper[4755]: I0317 00:44:26.928786 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-credential-keys\") pod \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " Mar 17 00:44:26 crc kubenswrapper[4755]: I0317 00:44:26.928930 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-combined-ca-bundle\") pod \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " Mar 17 00:44:26 crc kubenswrapper[4755]: I0317 00:44:26.930064 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-config-data\") pod \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\" (UID: \"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5\") " Mar 17 00:44:26 crc kubenswrapper[4755]: I0317 00:44:26.935063 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "69fb04fb-3dd0-4589-b46d-a34a1f0a19b5" (UID: "69fb04fb-3dd0-4589-b46d-a34a1f0a19b5"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:44:26 crc kubenswrapper[4755]: I0317 00:44:26.935423 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-kube-api-access-fnjd9" (OuterVolumeSpecName: "kube-api-access-fnjd9") pod "69fb04fb-3dd0-4589-b46d-a34a1f0a19b5" (UID: "69fb04fb-3dd0-4589-b46d-a34a1f0a19b5"). InnerVolumeSpecName "kube-api-access-fnjd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:26 crc kubenswrapper[4755]: I0317 00:44:26.941785 4755 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:26 crc kubenswrapper[4755]: I0317 00:44:26.941817 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnjd9\" (UniqueName: \"kubernetes.io/projected/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-kube-api-access-fnjd9\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:26 crc kubenswrapper[4755]: I0317 00:44:26.941828 4755 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:26 crc kubenswrapper[4755]: I0317 00:44:26.942331 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-scripts" (OuterVolumeSpecName: "scripts") pod "69fb04fb-3dd0-4589-b46d-a34a1f0a19b5" (UID: "69fb04fb-3dd0-4589-b46d-a34a1f0a19b5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:44:26 crc kubenswrapper[4755]: I0317 00:44:26.973548 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69fb04fb-3dd0-4589-b46d-a34a1f0a19b5" (UID: "69fb04fb-3dd0-4589-b46d-a34a1f0a19b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:44:26 crc kubenswrapper[4755]: I0317 00:44:26.988271 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-config-data" (OuterVolumeSpecName: "config-data") pod "69fb04fb-3dd0-4589-b46d-a34a1f0a19b5" (UID: "69fb04fb-3dd0-4589-b46d-a34a1f0a19b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:44:27 crc kubenswrapper[4755]: I0317 00:44:27.036729 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jp67d" event={"ID":"69fb04fb-3dd0-4589-b46d-a34a1f0a19b5","Type":"ContainerDied","Data":"08b6bb7a007c080b90071dcb45e28cc4462af50803fba3119c51b010bd64c444"} Mar 17 00:44:27 crc kubenswrapper[4755]: I0317 00:44:27.036776 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08b6bb7a007c080b90071dcb45e28cc4462af50803fba3119c51b010bd64c444" Mar 17 00:44:27 crc kubenswrapper[4755]: I0317 00:44:27.036804 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jp67d" Mar 17 00:44:27 crc kubenswrapper[4755]: I0317 00:44:27.043576 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:27 crc kubenswrapper[4755]: I0317 00:44:27.043603 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:27 crc kubenswrapper[4755]: I0317 00:44:27.043616 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:27 crc kubenswrapper[4755]: I0317 00:44:27.908121 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jp67d"] Mar 17 00:44:27 crc kubenswrapper[4755]: I0317 00:44:27.918873 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jp67d"] Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.001034 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6jk7m"] Mar 17 00:44:28 crc kubenswrapper[4755]: E0317 00:44:28.001449 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2092b5ef-18db-44ce-b8f4-e642cb6c5ae9" containerName="init" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.001465 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2092b5ef-18db-44ce-b8f4-e642cb6c5ae9" containerName="init" Mar 17 00:44:28 crc kubenswrapper[4755]: E0317 00:44:28.001477 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23f06fa2-6b51-4ae8-b8a4-967a920fed1a" containerName="init" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.001482 4755 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="23f06fa2-6b51-4ae8-b8a4-967a920fed1a" containerName="init" Mar 17 00:44:28 crc kubenswrapper[4755]: E0317 00:44:28.001502 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69fb04fb-3dd0-4589-b46d-a34a1f0a19b5" containerName="keystone-bootstrap" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.001510 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="69fb04fb-3dd0-4589-b46d-a34a1f0a19b5" containerName="keystone-bootstrap" Mar 17 00:44:28 crc kubenswrapper[4755]: E0317 00:44:28.001524 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2092b5ef-18db-44ce-b8f4-e642cb6c5ae9" containerName="dnsmasq-dns" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.001529 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2092b5ef-18db-44ce-b8f4-e642cb6c5ae9" containerName="dnsmasq-dns" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.001709 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="23f06fa2-6b51-4ae8-b8a4-967a920fed1a" containerName="init" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.001726 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2092b5ef-18db-44ce-b8f4-e642cb6c5ae9" containerName="dnsmasq-dns" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.001741 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="69fb04fb-3dd0-4589-b46d-a34a1f0a19b5" containerName="keystone-bootstrap" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.002322 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.005942 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.006166 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8cwbq" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.006309 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.006555 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.009268 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.019058 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6jk7m"] Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.166285 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-fernet-keys\") pod \"keystone-bootstrap-6jk7m\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.166362 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-config-data\") pod \"keystone-bootstrap-6jk7m\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.166508 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-scripts\") pod \"keystone-bootstrap-6jk7m\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.166681 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-credential-keys\") pod \"keystone-bootstrap-6jk7m\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.166842 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd69k\" (UniqueName: \"kubernetes.io/projected/cc260978-d229-43c0-b836-d1bb5a308c48-kube-api-access-sd69k\") pod \"keystone-bootstrap-6jk7m\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.166888 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-combined-ca-bundle\") pod \"keystone-bootstrap-6jk7m\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.260066 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69fb04fb-3dd0-4589-b46d-a34a1f0a19b5" path="/var/lib/kubelet/pods/69fb04fb-3dd0-4589-b46d-a34a1f0a19b5/volumes" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.268682 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-fernet-keys\") pod \"keystone-bootstrap-6jk7m\" (UID: 
\"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.268737 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-config-data\") pod \"keystone-bootstrap-6jk7m\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.268803 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-scripts\") pod \"keystone-bootstrap-6jk7m\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.269530 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-credential-keys\") pod \"keystone-bootstrap-6jk7m\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.269590 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd69k\" (UniqueName: \"kubernetes.io/projected/cc260978-d229-43c0-b836-d1bb5a308c48-kube-api-access-sd69k\") pod \"keystone-bootstrap-6jk7m\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.269610 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-combined-ca-bundle\") pod \"keystone-bootstrap-6jk7m\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 
crc kubenswrapper[4755]: I0317 00:44:28.274204 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-credential-keys\") pod \"keystone-bootstrap-6jk7m\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.274688 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-scripts\") pod \"keystone-bootstrap-6jk7m\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.280582 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-config-data\") pod \"keystone-bootstrap-6jk7m\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.285506 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-combined-ca-bundle\") pod \"keystone-bootstrap-6jk7m\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.289078 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd69k\" (UniqueName: \"kubernetes.io/projected/cc260978-d229-43c0-b836-d1bb5a308c48-kube-api-access-sd69k\") pod \"keystone-bootstrap-6jk7m\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.291604 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-fernet-keys\") pod \"keystone-bootstrap-6jk7m\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.326548 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:44:28 crc kubenswrapper[4755]: I0317 00:44:28.956592 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:44:29 crc kubenswrapper[4755]: I0317 00:44:29.026446 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-kpgvs"] Mar 17 00:44:29 crc kubenswrapper[4755]: I0317 00:44:29.026661 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" podUID="aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" containerName="dnsmasq-dns" containerID="cri-o://4a4d975184340c035241f9956f12740eafe39b9e26831f9a1fc826c3cfe177e4" gracePeriod=10 Mar 17 00:44:30 crc kubenswrapper[4755]: I0317 00:44:30.085816 4755 generic.go:334] "Generic (PLEG): container finished" podID="aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" containerID="4a4d975184340c035241f9956f12740eafe39b9e26831f9a1fc826c3cfe177e4" exitCode=0 Mar 17 00:44:30 crc kubenswrapper[4755]: I0317 00:44:30.085864 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" event={"ID":"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd","Type":"ContainerDied","Data":"4a4d975184340c035241f9956f12740eafe39b9e26831f9a1fc826c3cfe177e4"} Mar 17 00:44:32 crc kubenswrapper[4755]: I0317 00:44:32.630920 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" podUID="aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.155:5353: connect: connection refused" Mar 17 
00:44:35 crc kubenswrapper[4755]: E0317 00:44:35.933684 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 17 00:44:35 crc kubenswrapper[4755]: E0317 00:44:35.934372 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ld7pz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePoli
cy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-xksp4_openstack(c5b76ade-1f20-43e7-bca3-cc0c70a05d4f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 00:44:35 crc kubenswrapper[4755]: E0317 00:44:35.935920 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-xksp4" podUID="c5b76ade-1f20-43e7-bca3-cc0c70a05d4f" Mar 17 00:44:36 crc kubenswrapper[4755]: E0317 00:44:36.158724 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-xksp4" podUID="c5b76ade-1f20-43e7-bca3-cc0c70a05d4f" Mar 17 00:44:37 crc kubenswrapper[4755]: I0317 00:44:37.169592 4755 generic.go:334] "Generic (PLEG): container finished" podID="24bf6c90-0673-42ed-b463-d0510425117d" containerID="48ac9b2cd5f688088f4bf4c0c38a31fb4e48e4d147b948f0e0c6967341c23560" exitCode=0 Mar 17 00:44:37 crc kubenswrapper[4755]: I0317 00:44:37.169668 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4xd8r" event={"ID":"24bf6c90-0673-42ed-b463-d0510425117d","Type":"ContainerDied","Data":"48ac9b2cd5f688088f4bf4c0c38a31fb4e48e4d147b948f0e0c6967341c23560"} Mar 17 00:44:37 crc kubenswrapper[4755]: E0317 00:44:37.563222 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Mar 17 00:44:37 crc kubenswrapper[4755]: E0317 00:44:37.563423 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8s82d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]Vol
umeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-bdnb6_openstack(bbb8013b-627b-4894-945e-178871516870): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 00:44:37 crc kubenswrapper[4755]: E0317 00:44:37.564768 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-bdnb6" podUID="bbb8013b-627b-4894-945e-178871516870" Mar 17 00:44:37 crc kubenswrapper[4755]: I0317 00:44:37.631197 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" podUID="aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.155:5353: connect: connection refused" Mar 17 00:44:38 crc kubenswrapper[4755]: E0317 00:44:38.109689 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 17 00:44:38 crc kubenswrapper[4755]: E0317 00:44:38.110549 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n654h76hd8hbdh578h559h57fh576h7fh5f5h5fch5bfh566hc5h695h4hcdh9fh58bhddh94h94hf9h595h5b9h66dhb9hbch56bh58dh54bhb8q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hr9wr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(756a984f-fd52-4215-b64e-ecd7c9f2851e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 00:44:38 crc kubenswrapper[4755]: E0317 00:44:38.206760 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-bdnb6" podUID="bbb8013b-627b-4894-945e-178871516870" Mar 17 00:44:47 crc kubenswrapper[4755]: I0317 00:44:47.631528 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" podUID="aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.155:5353: i/o timeout" Mar 17 00:44:47 crc kubenswrapper[4755]: I0317 00:44:47.632563 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:44:49 crc kubenswrapper[4755]: E0317 00:44:49.245517 4755 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Mar 17 00:44:49 crc kubenswrapper[4755]: E0317 00:44:49.245828 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w5t76,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-gdjsh_openstack(2315f493-9035-4185-b615-e7eed6a246ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 00:44:49 crc kubenswrapper[4755]: E0317 00:44:49.247013 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-gdjsh" podUID="2315f493-9035-4185-b615-e7eed6a246ea" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.340579 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" event={"ID":"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd","Type":"ContainerDied","Data":"3ca547f19f0a36e76febb96d7b8409532b8aa1fa060803762ae5db872868a23a"} Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.340895 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ca547f19f0a36e76febb96d7b8409532b8aa1fa060803762ae5db872868a23a" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.343062 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4xd8r" event={"ID":"24bf6c90-0673-42ed-b463-d0510425117d","Type":"ContainerDied","Data":"d44f49e80987b7100031208e9a9be6a968b28309fab1a77d90553aa96b57f47b"} Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.343099 4755 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d44f49e80987b7100031208e9a9be6a968b28309fab1a77d90553aa96b57f47b" Mar 17 00:44:49 crc kubenswrapper[4755]: E0317 00:44:49.346551 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-gdjsh" podUID="2315f493-9035-4185-b615-e7eed6a246ea" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.382402 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.383082 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4xd8r" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.457483 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-ovsdbserver-sb\") pod \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.457557 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-dns-swift-storage-0\") pod \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.457619 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjrjb\" (UniqueName: \"kubernetes.io/projected/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-kube-api-access-rjrjb\") pod \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.457652 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/24bf6c90-0673-42ed-b463-d0510425117d-config\") pod \"24bf6c90-0673-42ed-b463-d0510425117d\" (UID: \"24bf6c90-0673-42ed-b463-d0510425117d\") " Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.457747 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkf46\" (UniqueName: \"kubernetes.io/projected/24bf6c90-0673-42ed-b463-d0510425117d-kube-api-access-jkf46\") pod \"24bf6c90-0673-42ed-b463-d0510425117d\" (UID: \"24bf6c90-0673-42ed-b463-d0510425117d\") " Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.457794 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24bf6c90-0673-42ed-b463-d0510425117d-combined-ca-bundle\") pod \"24bf6c90-0673-42ed-b463-d0510425117d\" (UID: \"24bf6c90-0673-42ed-b463-d0510425117d\") " Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.457838 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-ovsdbserver-nb\") pod \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.457870 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-config\") pod \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\" (UID: \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.457895 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-dns-svc\") pod \"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\" (UID: 
\"aa4b3ea9-fa93-4bac-bcd3-76686701dbbd\") " Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.495831 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24bf6c90-0673-42ed-b463-d0510425117d-kube-api-access-jkf46" (OuterVolumeSpecName: "kube-api-access-jkf46") pod "24bf6c90-0673-42ed-b463-d0510425117d" (UID: "24bf6c90-0673-42ed-b463-d0510425117d"). InnerVolumeSpecName "kube-api-access-jkf46". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.504578 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-kube-api-access-rjrjb" (OuterVolumeSpecName: "kube-api-access-rjrjb") pod "aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" (UID: "aa4b3ea9-fa93-4bac-bcd3-76686701dbbd"). InnerVolumeSpecName "kube-api-access-rjrjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.560305 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjrjb\" (UniqueName: \"kubernetes.io/projected/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-kube-api-access-rjrjb\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.560335 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkf46\" (UniqueName: \"kubernetes.io/projected/24bf6c90-0673-42ed-b463-d0510425117d-kube-api-access-jkf46\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.580460 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bf6c90-0673-42ed-b463-d0510425117d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24bf6c90-0673-42ed-b463-d0510425117d" (UID: "24bf6c90-0673-42ed-b463-d0510425117d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.603096 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" (UID: "aa4b3ea9-fa93-4bac-bcd3-76686701dbbd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.604532 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" (UID: "aa4b3ea9-fa93-4bac-bcd3-76686701dbbd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.618572 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bf6c90-0673-42ed-b463-d0510425117d-config" (OuterVolumeSpecName: "config") pod "24bf6c90-0673-42ed-b463-d0510425117d" (UID: "24bf6c90-0673-42ed-b463-d0510425117d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.622002 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" (UID: "aa4b3ea9-fa93-4bac-bcd3-76686701dbbd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.640410 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-config" (OuterVolumeSpecName: "config") pod "aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" (UID: "aa4b3ea9-fa93-4bac-bcd3-76686701dbbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.641350 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" (UID: "aa4b3ea9-fa93-4bac-bcd3-76686701dbbd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.662373 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.662408 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/24bf6c90-0673-42ed-b463-d0510425117d-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.662418 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24bf6c90-0673-42ed-b463-d0510425117d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.662429 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:49 
crc kubenswrapper[4755]: I0317 00:44:49.662454 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.662462 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:49 crc kubenswrapper[4755]: I0317 00:44:49.662472 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.356735 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4xd8r" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.356669 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.406777 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-kpgvs"] Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.416799 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-kpgvs"] Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.687940 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-ddx2p"] Mar 17 00:44:50 crc kubenswrapper[4755]: E0317 00:44:50.688625 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24bf6c90-0673-42ed-b463-d0510425117d" containerName="neutron-db-sync" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.688643 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="24bf6c90-0673-42ed-b463-d0510425117d" containerName="neutron-db-sync" Mar 17 00:44:50 crc kubenswrapper[4755]: E0317 00:44:50.688663 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" containerName="init" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.688672 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" containerName="init" Mar 17 00:44:50 crc kubenswrapper[4755]: E0317 00:44:50.688684 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" containerName="dnsmasq-dns" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.688691 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" containerName="dnsmasq-dns" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.688876 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" containerName="dnsmasq-dns" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.688887 4755 
memory_manager.go:354] "RemoveStaleState removing state" podUID="24bf6c90-0673-42ed-b463-d0510425117d" containerName="neutron-db-sync" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.689876 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.705264 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-ddx2p"] Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.784135 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-config\") pod \"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.784211 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.784289 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlfnh\" (UniqueName: \"kubernetes.io/projected/ba263b61-80d7-4745-9bc1-e190a16d59b3-kube-api-access-nlfnh\") pod \"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.784314 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-dns-swift-storage-0\") pod 
\"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.784344 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.784367 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.819088 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-df477f8d4-2cfqn"] Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.821896 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.832249 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-df477f8d4-2cfqn"] Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.832814 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2hp9f" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.833260 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.834017 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.834156 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.886549 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.886619 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.886685 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-config\") pod \"neutron-df477f8d4-2cfqn\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") " 
pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.886739 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-ovndb-tls-certs\") pod \"neutron-df477f8d4-2cfqn\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") " pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.886767 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-config\") pod \"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.886829 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-combined-ca-bundle\") pod \"neutron-df477f8d4-2cfqn\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") " pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.886861 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.886898 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4htr\" (UniqueName: \"kubernetes.io/projected/ab92471f-7493-4924-a5d7-c194d62e821f-kube-api-access-l4htr\") pod \"neutron-df477f8d4-2cfqn\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") " 
pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.886966 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlfnh\" (UniqueName: \"kubernetes.io/projected/ba263b61-80d7-4745-9bc1-e190a16d59b3-kube-api-access-nlfnh\") pod \"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.887008 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.887041 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-httpd-config\") pod \"neutron-df477f8d4-2cfqn\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") " pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.888409 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.888529 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 
crc kubenswrapper[4755]: I0317 00:44:50.888690 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.889275 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-config\") pod \"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.889321 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.922268 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlfnh\" (UniqueName: \"kubernetes.io/projected/ba263b61-80d7-4745-9bc1-e190a16d59b3-kube-api-access-nlfnh\") pod \"dnsmasq-dns-84b966f6c9-ddx2p\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.988564 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-combined-ca-bundle\") pod \"neutron-df477f8d4-2cfqn\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") " pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.988619 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l4htr\" (UniqueName: \"kubernetes.io/projected/ab92471f-7493-4924-a5d7-c194d62e821f-kube-api-access-l4htr\") pod \"neutron-df477f8d4-2cfqn\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") " pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.988697 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-httpd-config\") pod \"neutron-df477f8d4-2cfqn\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") " pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.988758 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-config\") pod \"neutron-df477f8d4-2cfqn\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") " pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.988804 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-ovndb-tls-certs\") pod \"neutron-df477f8d4-2cfqn\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") " pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.994321 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-combined-ca-bundle\") pod \"neutron-df477f8d4-2cfqn\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") " pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.995882 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-config\") pod \"neutron-df477f8d4-2cfqn\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") " pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:50 crc kubenswrapper[4755]: I0317 00:44:50.996066 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-httpd-config\") pod \"neutron-df477f8d4-2cfqn\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") " pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:51 crc kubenswrapper[4755]: I0317 00:44:51.000618 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-ovndb-tls-certs\") pod \"neutron-df477f8d4-2cfqn\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") " pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:51 crc kubenswrapper[4755]: I0317 00:44:51.008991 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4htr\" (UniqueName: \"kubernetes.io/projected/ab92471f-7493-4924-a5d7-c194d62e821f-kube-api-access-l4htr\") pod \"neutron-df477f8d4-2cfqn\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") " pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:51 crc kubenswrapper[4755]: I0317 00:44:51.034350 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:51 crc kubenswrapper[4755]: I0317 00:44:51.153825 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:51 crc kubenswrapper[4755]: E0317 00:44:51.181583 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 17 00:44:51 crc kubenswrapper[4755]: E0317 00:44:51.181755 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls
-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqx68,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-v7jsc_openstack(404d7b5a-9c59-4c63-b3be-740554b83374): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 00:44:51 crc kubenswrapper[4755]: E0317 00:44:51.183021 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-v7jsc" podUID="404d7b5a-9c59-4c63-b3be-740554b83374" Mar 17 00:44:51 crc kubenswrapper[4755]: E0317 00:44:51.383806 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-v7jsc" podUID="404d7b5a-9c59-4c63-b3be-740554b83374" Mar 17 00:44:51 crc kubenswrapper[4755]: I0317 00:44:51.621956 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6jk7m"] Mar 17 00:44:51 crc kubenswrapper[4755]: I0317 00:44:51.770805 4755 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-ddx2p"] Mar 17 00:44:52 crc kubenswrapper[4755]: I0317 00:44:52.260515 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" path="/var/lib/kubelet/pods/aa4b3ea9-fa93-4bac-bcd3-76686701dbbd/volumes" Mar 17 00:44:52 crc kubenswrapper[4755]: I0317 00:44:52.633190 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-kpgvs" podUID="aa4b3ea9-fa93-4bac-bcd3-76686701dbbd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.155:5353: i/o timeout" Mar 17 00:44:52 crc kubenswrapper[4755]: I0317 00:44:52.834386 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6c9c7f6769-wqdz8"] Mar 17 00:44:52 crc kubenswrapper[4755]: I0317 00:44:52.836365 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:52 crc kubenswrapper[4755]: I0317 00:44:52.838549 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 17 00:44:52 crc kubenswrapper[4755]: I0317 00:44:52.841101 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 17 00:44:52 crc kubenswrapper[4755]: I0317 00:44:52.845179 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c9c7f6769-wqdz8"] Mar 17 00:44:52 crc kubenswrapper[4755]: I0317 00:44:52.931845 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-public-tls-certs\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:52 crc kubenswrapper[4755]: I0317 00:44:52.931894 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-combined-ca-bundle\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:52 crc kubenswrapper[4755]: I0317 00:44:52.932153 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-config\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:52 crc kubenswrapper[4755]: I0317 00:44:52.932256 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgnfg\" (UniqueName: \"kubernetes.io/projected/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-kube-api-access-rgnfg\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:52 crc kubenswrapper[4755]: I0317 00:44:52.932305 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-ovndb-tls-certs\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:52 crc kubenswrapper[4755]: I0317 00:44:52.932326 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-httpd-config\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:52 crc kubenswrapper[4755]: I0317 00:44:52.932470 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-internal-tls-certs\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:52 crc kubenswrapper[4755]: W0317 00:44:52.960764 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba263b61_80d7_4745_9bc1_e190a16d59b3.slice/crio-85cbd4bcb247893eb0af3732d16187e313a00fe300aa6bc054ea07d48613eae8 WatchSource:0}: Error finding container 85cbd4bcb247893eb0af3732d16187e313a00fe300aa6bc054ea07d48613eae8: Status 404 returned error can't find the container with id 85cbd4bcb247893eb0af3732d16187e313a00fe300aa6bc054ea07d48613eae8 Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.034590 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-internal-tls-certs\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.034696 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-public-tls-certs\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.034720 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-combined-ca-bundle\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " 
pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.034781 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-config\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.034814 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgnfg\" (UniqueName: \"kubernetes.io/projected/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-kube-api-access-rgnfg\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.034831 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-ovndb-tls-certs\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.034849 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-httpd-config\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.039751 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-internal-tls-certs\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.051361 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-ovndb-tls-certs\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.051824 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-httpd-config\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.052508 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-combined-ca-bundle\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.052985 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-public-tls-certs\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.053408 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-config\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.061157 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgnfg\" (UniqueName: 
\"kubernetes.io/projected/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-kube-api-access-rgnfg\") pod \"neutron-6c9c7f6769-wqdz8\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.211591 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.406174 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6jk7m" event={"ID":"cc260978-d229-43c0-b836-d1bb5a308c48","Type":"ContainerStarted","Data":"8c9662de16e2566b8f1665037d701d89f58326fbb3a2df5a70f250121d7c8b78"} Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.408103 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" event={"ID":"ba263b61-80d7-4745-9bc1-e190a16d59b3","Type":"ContainerStarted","Data":"85cbd4bcb247893eb0af3732d16187e313a00fe300aa6bc054ea07d48613eae8"} Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.413704 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bdnb6" event={"ID":"bbb8013b-627b-4894-945e-178871516870","Type":"ContainerStarted","Data":"690cced1a5a0dbfb02077ba065f80fc5822805a01aceb0b9a44613b0512fb953"} Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.448215 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-bdnb6" podStartSLOduration=4.411044454 podStartE2EDuration="35.448196302s" podCreationTimestamp="2026-03-17 00:44:18 +0000 UTC" firstStartedPulling="2026-03-17 00:44:20.215588528 +0000 UTC m=+1334.975040801" lastFinishedPulling="2026-03-17 00:44:51.252740356 +0000 UTC m=+1366.012192649" observedRunningTime="2026-03-17 00:44:53.431023043 +0000 UTC m=+1368.190475326" watchObservedRunningTime="2026-03-17 00:44:53.448196302 +0000 UTC m=+1368.207648585" Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 
00:44:53.643917 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-df477f8d4-2cfqn"] Mar 17 00:44:53 crc kubenswrapper[4755]: I0317 00:44:53.882751 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c9c7f6769-wqdz8"] Mar 17 00:44:53 crc kubenswrapper[4755]: W0317 00:44:53.883286 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9929a6e1_b83a_4fc4_b904_9a3acc9d2c5c.slice/crio-b3934471cfc14bbc60a6409f5bd09aa29948d6ce09f22b23377314bc5d42246f WatchSource:0}: Error finding container b3934471cfc14bbc60a6409f5bd09aa29948d6ce09f22b23377314bc5d42246f: Status 404 returned error can't find the container with id b3934471cfc14bbc60a6409f5bd09aa29948d6ce09f22b23377314bc5d42246f Mar 17 00:44:54 crc kubenswrapper[4755]: I0317 00:44:54.423854 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xksp4" event={"ID":"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f","Type":"ContainerStarted","Data":"354c32f219635da720beb2dcc98b1faf3ad99b7ef375f78c3d4fac7e53ba25eb"} Mar 17 00:44:54 crc kubenswrapper[4755]: I0317 00:44:54.425808 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"756a984f-fd52-4215-b64e-ecd7c9f2851e","Type":"ContainerStarted","Data":"b1a359514561a04c523bd8235649c07830a6fbfb9f33145acd2bfbc107475ac3"} Mar 17 00:44:54 crc kubenswrapper[4755]: I0317 00:44:54.427640 4755 generic.go:334] "Generic (PLEG): container finished" podID="ba263b61-80d7-4745-9bc1-e190a16d59b3" containerID="5794438b53ba7b71d207bd1f05fe59c48d176d53292bf71cd10cfaf5854f70a7" exitCode=0 Mar 17 00:44:54 crc kubenswrapper[4755]: I0317 00:44:54.427683 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" event={"ID":"ba263b61-80d7-4745-9bc1-e190a16d59b3","Type":"ContainerDied","Data":"5794438b53ba7b71d207bd1f05fe59c48d176d53292bf71cd10cfaf5854f70a7"} Mar 17 
00:44:54 crc kubenswrapper[4755]: I0317 00:44:54.430818 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c9c7f6769-wqdz8" event={"ID":"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c","Type":"ContainerStarted","Data":"cc0b7779412c1b389c33e1b1cb2bf9ce8d07899ce52e202204ccb9bc3f30d08f"} Mar 17 00:44:54 crc kubenswrapper[4755]: I0317 00:44:54.430846 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c9c7f6769-wqdz8" event={"ID":"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c","Type":"ContainerStarted","Data":"b3934471cfc14bbc60a6409f5bd09aa29948d6ce09f22b23377314bc5d42246f"} Mar 17 00:44:54 crc kubenswrapper[4755]: I0317 00:44:54.439611 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6jk7m" event={"ID":"cc260978-d229-43c0-b836-d1bb5a308c48","Type":"ContainerStarted","Data":"d57445204ca1a1169e05f3848feb0d6df6d59be07dea604f588b48c49fc4ff21"} Mar 17 00:44:54 crc kubenswrapper[4755]: I0317 00:44:54.449137 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-df477f8d4-2cfqn" event={"ID":"ab92471f-7493-4924-a5d7-c194d62e821f","Type":"ContainerStarted","Data":"b18159828b0ef85d52a55272de389f2449a3c17f45057e6a855ea454c1f3650f"} Mar 17 00:44:54 crc kubenswrapper[4755]: I0317 00:44:54.449176 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-df477f8d4-2cfqn" event={"ID":"ab92471f-7493-4924-a5d7-c194d62e821f","Type":"ContainerStarted","Data":"317b054ee6ff7b48069e0a2dc3ddaa653d4ac7a15ea409f9ca61c69c8644f611"} Mar 17 00:44:54 crc kubenswrapper[4755]: I0317 00:44:54.449185 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-df477f8d4-2cfqn" event={"ID":"ab92471f-7493-4924-a5d7-c194d62e821f","Type":"ContainerStarted","Data":"36af6a2b8d72f78a16c5a90b9e7851607dc9aaee285458654617420c2ca2a353"} Mar 17 00:44:54 crc kubenswrapper[4755]: I0317 00:44:54.450283 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:44:54 crc kubenswrapper[4755]: I0317 00:44:54.473890 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-xksp4" podStartSLOduration=3.488409437 podStartE2EDuration="36.473874572s" podCreationTimestamp="2026-03-17 00:44:18 +0000 UTC" firstStartedPulling="2026-03-17 00:44:20.017875414 +0000 UTC m=+1334.777327697" lastFinishedPulling="2026-03-17 00:44:53.003340559 +0000 UTC m=+1367.762792832" observedRunningTime="2026-03-17 00:44:54.440839917 +0000 UTC m=+1369.200292200" watchObservedRunningTime="2026-03-17 00:44:54.473874572 +0000 UTC m=+1369.233326855" Mar 17 00:44:54 crc kubenswrapper[4755]: I0317 00:44:54.489325 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6jk7m" podStartSLOduration=27.489302715 podStartE2EDuration="27.489302715s" podCreationTimestamp="2026-03-17 00:44:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:44:54.480276484 +0000 UTC m=+1369.239728767" watchObservedRunningTime="2026-03-17 00:44:54.489302715 +0000 UTC m=+1369.248754998" Mar 17 00:44:55 crc kubenswrapper[4755]: I0317 00:44:55.466949 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c9c7f6769-wqdz8" event={"ID":"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c","Type":"ContainerStarted","Data":"cea18063de57a4df658846bca1217e0e32d576007cf94e1c19149387f4f61063"} Mar 17 00:44:55 crc kubenswrapper[4755]: I0317 00:44:55.467214 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:44:55 crc kubenswrapper[4755]: I0317 00:44:55.472470 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" 
event={"ID":"ba263b61-80d7-4745-9bc1-e190a16d59b3","Type":"ContainerStarted","Data":"54b5718f55fedebeb1eaeef4e559b4c027b418d4a774f246d88051896ccdb3ef"} Mar 17 00:44:55 crc kubenswrapper[4755]: I0317 00:44:55.501914 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6c9c7f6769-wqdz8" podStartSLOduration=3.501894854 podStartE2EDuration="3.501894854s" podCreationTimestamp="2026-03-17 00:44:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:44:55.496097009 +0000 UTC m=+1370.255549292" watchObservedRunningTime="2026-03-17 00:44:55.501894854 +0000 UTC m=+1370.261347137" Mar 17 00:44:55 crc kubenswrapper[4755]: I0317 00:44:55.506561 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-df477f8d4-2cfqn" podStartSLOduration=5.50655058 podStartE2EDuration="5.50655058s" podCreationTimestamp="2026-03-17 00:44:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:44:54.505552901 +0000 UTC m=+1369.265005184" watchObservedRunningTime="2026-03-17 00:44:55.50655058 +0000 UTC m=+1370.266002873" Mar 17 00:44:55 crc kubenswrapper[4755]: I0317 00:44:55.520466 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" podStartSLOduration=5.520451151 podStartE2EDuration="5.520451151s" podCreationTimestamp="2026-03-17 00:44:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:44:55.518022397 +0000 UTC m=+1370.277474680" watchObservedRunningTime="2026-03-17 00:44:55.520451151 +0000 UTC m=+1370.279903434" Mar 17 00:44:56 crc kubenswrapper[4755]: I0317 00:44:56.034687 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:44:57 crc kubenswrapper[4755]: I0317 00:44:57.498366 4755 generic.go:334] "Generic (PLEG): container finished" podID="c5b76ade-1f20-43e7-bca3-cc0c70a05d4f" containerID="354c32f219635da720beb2dcc98b1faf3ad99b7ef375f78c3d4fac7e53ba25eb" exitCode=0 Mar 17 00:44:57 crc kubenswrapper[4755]: I0317 00:44:57.498481 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xksp4" event={"ID":"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f","Type":"ContainerDied","Data":"354c32f219635da720beb2dcc98b1faf3ad99b7ef375f78c3d4fac7e53ba25eb"} Mar 17 00:44:57 crc kubenswrapper[4755]: I0317 00:44:57.502800 4755 generic.go:334] "Generic (PLEG): container finished" podID="bbb8013b-627b-4894-945e-178871516870" containerID="690cced1a5a0dbfb02077ba065f80fc5822805a01aceb0b9a44613b0512fb953" exitCode=0 Mar 17 00:44:57 crc kubenswrapper[4755]: I0317 00:44:57.502844 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bdnb6" event={"ID":"bbb8013b-627b-4894-945e-178871516870","Type":"ContainerDied","Data":"690cced1a5a0dbfb02077ba065f80fc5822805a01aceb0b9a44613b0512fb953"} Mar 17 00:44:57 crc kubenswrapper[4755]: I0317 00:44:57.504528 4755 generic.go:334] "Generic (PLEG): container finished" podID="cc260978-d229-43c0-b836-d1bb5a308c48" containerID="d57445204ca1a1169e05f3848feb0d6df6d59be07dea604f588b48c49fc4ff21" exitCode=0 Mar 17 00:44:57 crc kubenswrapper[4755]: I0317 00:44:57.505327 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6jk7m" event={"ID":"cc260978-d229-43c0-b836-d1bb5a308c48","Type":"ContainerDied","Data":"d57445204ca1a1169e05f3848feb0d6df6d59be07dea604f588b48c49fc4ff21"} Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.145823 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2"] Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.148556 4755 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.152049 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.156520 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.162120 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2"] Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.293523 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b8ead77-0763-41b7-8a6f-96f43f4d202b-config-volume\") pod \"collect-profiles-29561805-zrbx2\" (UID: \"2b8ead77-0763-41b7-8a6f-96f43f4d202b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.293725 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b8ead77-0763-41b7-8a6f-96f43f4d202b-secret-volume\") pod \"collect-profiles-29561805-zrbx2\" (UID: \"2b8ead77-0763-41b7-8a6f-96f43f4d202b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.293820 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wptv9\" (UniqueName: \"kubernetes.io/projected/2b8ead77-0763-41b7-8a6f-96f43f4d202b-kube-api-access-wptv9\") pod \"collect-profiles-29561805-zrbx2\" (UID: \"2b8ead77-0763-41b7-8a6f-96f43f4d202b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.395465 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b8ead77-0763-41b7-8a6f-96f43f4d202b-config-volume\") pod \"collect-profiles-29561805-zrbx2\" (UID: \"2b8ead77-0763-41b7-8a6f-96f43f4d202b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.395949 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b8ead77-0763-41b7-8a6f-96f43f4d202b-secret-volume\") pod \"collect-profiles-29561805-zrbx2\" (UID: \"2b8ead77-0763-41b7-8a6f-96f43f4d202b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.396041 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wptv9\" (UniqueName: \"kubernetes.io/projected/2b8ead77-0763-41b7-8a6f-96f43f4d202b-kube-api-access-wptv9\") pod \"collect-profiles-29561805-zrbx2\" (UID: \"2b8ead77-0763-41b7-8a6f-96f43f4d202b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.396725 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b8ead77-0763-41b7-8a6f-96f43f4d202b-config-volume\") pod \"collect-profiles-29561805-zrbx2\" (UID: \"2b8ead77-0763-41b7-8a6f-96f43f4d202b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.403187 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2b8ead77-0763-41b7-8a6f-96f43f4d202b-secret-volume\") pod \"collect-profiles-29561805-zrbx2\" (UID: \"2b8ead77-0763-41b7-8a6f-96f43f4d202b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.413983 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wptv9\" (UniqueName: \"kubernetes.io/projected/2b8ead77-0763-41b7-8a6f-96f43f4d202b-kube-api-access-wptv9\") pod \"collect-profiles-29561805-zrbx2\" (UID: \"2b8ead77-0763-41b7-8a6f-96f43f4d202b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.474611 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.535740 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.542001 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xksp4" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.547671 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6jk7m" event={"ID":"cc260978-d229-43c0-b836-d1bb5a308c48","Type":"ContainerDied","Data":"8c9662de16e2566b8f1665037d701d89f58326fbb3a2df5a70f250121d7c8b78"} Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.547710 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c9662de16e2566b8f1665037d701d89f58326fbb3a2df5a70f250121d7c8b78" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.547720 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6jk7m" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.551653 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xksp4" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.551665 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xksp4" event={"ID":"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f","Type":"ContainerDied","Data":"d31022857ea381fe449069f0192303d656d712ce81a8abb8cfe64bdeca6d9ef4"} Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.551945 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d31022857ea381fe449069f0192303d656d712ce81a8abb8cfe64bdeca6d9ef4" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.553756 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-bdnb6" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.554361 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bdnb6" event={"ID":"bbb8013b-627b-4894-945e-178871516870","Type":"ContainerDied","Data":"5f7d35f68a57b286a1d5ae442c28510b553469eb3980045507f9966883091541"} Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.554805 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f7d35f68a57b286a1d5ae442c28510b553469eb3980045507f9966883091541" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.701955 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbb8013b-627b-4894-945e-178871516870-logs\") pod \"bbb8013b-627b-4894-945e-178871516870\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.702356 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bbb8013b-627b-4894-945e-178871516870-logs" (OuterVolumeSpecName: "logs") pod "bbb8013b-627b-4894-945e-178871516870" (UID: "bbb8013b-627b-4894-945e-178871516870"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.702800 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-credential-keys\") pod \"cc260978-d229-43c0-b836-d1bb5a308c48\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.702833 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-combined-ca-bundle\") pod \"cc260978-d229-43c0-b836-d1bb5a308c48\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.702915 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-scripts\") pod \"cc260978-d229-43c0-b836-d1bb5a308c48\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.702978 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s82d\" (UniqueName: \"kubernetes.io/projected/bbb8013b-627b-4894-945e-178871516870-kube-api-access-8s82d\") pod \"bbb8013b-627b-4894-945e-178871516870\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.703017 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-fernet-keys\") pod \"cc260978-d229-43c0-b836-d1bb5a308c48\" (UID: 
\"cc260978-d229-43c0-b836-d1bb5a308c48\") " Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.703044 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-combined-ca-bundle\") pod \"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f\" (UID: \"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f\") " Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.703095 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-combined-ca-bundle\") pod \"bbb8013b-627b-4894-945e-178871516870\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.703148 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-db-sync-config-data\") pod \"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f\" (UID: \"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f\") " Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.703325 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-scripts\") pod \"bbb8013b-627b-4894-945e-178871516870\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.703383 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld7pz\" (UniqueName: \"kubernetes.io/projected/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-kube-api-access-ld7pz\") pod \"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f\" (UID: \"c5b76ade-1f20-43e7-bca3-cc0c70a05d4f\") " Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.703455 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd69k\" 
(UniqueName: \"kubernetes.io/projected/cc260978-d229-43c0-b836-d1bb5a308c48-kube-api-access-sd69k\") pod \"cc260978-d229-43c0-b836-d1bb5a308c48\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.703500 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-config-data\") pod \"bbb8013b-627b-4894-945e-178871516870\" (UID: \"bbb8013b-627b-4894-945e-178871516870\") " Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.703524 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-config-data\") pod \"cc260978-d229-43c0-b836-d1bb5a308c48\" (UID: \"cc260978-d229-43c0-b836-d1bb5a308c48\") " Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.708819 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-scripts" (OuterVolumeSpecName: "scripts") pod "cc260978-d229-43c0-b836-d1bb5a308c48" (UID: "cc260978-d229-43c0-b836-d1bb5a308c48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.774854 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbb8013b-627b-4894-945e-178871516870-kube-api-access-8s82d" (OuterVolumeSpecName: "kube-api-access-8s82d") pod "bbb8013b-627b-4894-945e-178871516870" (UID: "bbb8013b-627b-4894-945e-178871516870"). InnerVolumeSpecName "kube-api-access-8s82d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.774965 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cc260978-d229-43c0-b836-d1bb5a308c48" (UID: "cc260978-d229-43c0-b836-d1bb5a308c48"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.774959 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cc260978-d229-43c0-b836-d1bb5a308c48" (UID: "cc260978-d229-43c0-b836-d1bb5a308c48"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.778682 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c5b76ade-1f20-43e7-bca3-cc0c70a05d4f" (UID: "c5b76ade-1f20-43e7-bca3-cc0c70a05d4f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.782500 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc260978-d229-43c0-b836-d1bb5a308c48-kube-api-access-sd69k" (OuterVolumeSpecName: "kube-api-access-sd69k") pod "cc260978-d229-43c0-b836-d1bb5a308c48" (UID: "cc260978-d229-43c0-b836-d1bb5a308c48"). InnerVolumeSpecName "kube-api-access-sd69k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.783873 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-scripts" (OuterVolumeSpecName: "scripts") pod "bbb8013b-627b-4894-945e-178871516870" (UID: "bbb8013b-627b-4894-945e-178871516870"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.784693 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-kube-api-access-ld7pz" (OuterVolumeSpecName: "kube-api-access-ld7pz") pod "c5b76ade-1f20-43e7-bca3-cc0c70a05d4f" (UID: "c5b76ade-1f20-43e7-bca3-cc0c70a05d4f"). InnerVolumeSpecName "kube-api-access-ld7pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.806533 4755 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.806573 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.806589 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s82d\" (UniqueName: \"kubernetes.io/projected/bbb8013b-627b-4894-945e-178871516870-kube-api-access-8s82d\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.806604 4755 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 17 
00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.806616 4755 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.806627 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.806639 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld7pz\" (UniqueName: \"kubernetes.io/projected/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-kube-api-access-ld7pz\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.806651 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd69k\" (UniqueName: \"kubernetes.io/projected/cc260978-d229-43c0-b836-d1bb5a308c48-kube-api-access-sd69k\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.806662 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbb8013b-627b-4894-945e-178871516870-logs\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.812717 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-config-data" (OuterVolumeSpecName: "config-data") pod "bbb8013b-627b-4894-945e-178871516870" (UID: "bbb8013b-627b-4894-945e-178871516870"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.814773 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5b76ade-1f20-43e7-bca3-cc0c70a05d4f" (UID: "c5b76ade-1f20-43e7-bca3-cc0c70a05d4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.821422 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc260978-d229-43c0-b836-d1bb5a308c48" (UID: "cc260978-d229-43c0-b836-d1bb5a308c48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.825018 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-config-data" (OuterVolumeSpecName: "config-data") pod "cc260978-d229-43c0-b836-d1bb5a308c48" (UID: "cc260978-d229-43c0-b836-d1bb5a308c48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.826177 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbb8013b-627b-4894-945e-178871516870" (UID: "bbb8013b-627b-4894-945e-178871516870"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.834001 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2"] Mar 17 00:45:00 crc kubenswrapper[4755]: W0317 00:45:00.836592 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b8ead77_0763_41b7_8a6f_96f43f4d202b.slice/crio-66a646c52385a5c5c170d1273f7b22c22bb2b2ca40145626b31b11139c5e6de2 WatchSource:0}: Error finding container 66a646c52385a5c5c170d1273f7b22c22bb2b2ca40145626b31b11139c5e6de2: Status 404 returned error can't find the container with id 66a646c52385a5c5c170d1273f7b22c22bb2b2ca40145626b31b11139c5e6de2 Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.909913 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.909941 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.909950 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc260978-d229-43c0-b836-d1bb5a308c48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.909961 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:00 crc kubenswrapper[4755]: I0317 00:45:00.909986 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/bbb8013b-627b-4894-945e-178871516870-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.036563 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.096262 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-82q9v"] Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.096725 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" podUID="d1684140-79c2-4f36-9755-7127141107ee" containerName="dnsmasq-dns" containerID="cri-o://78243512a1cf8d78e7a81e1b1b7902bad2f234d3b62d75a925ff890636b4fa09" gracePeriod=10 Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.567425 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"756a984f-fd52-4215-b64e-ecd7c9f2851e","Type":"ContainerStarted","Data":"3a0a7c6aacdf09177864ed3651a016dfe98266ce57c6f5071a272a3c087592ba"} Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.570076 4755 generic.go:334] "Generic (PLEG): container finished" podID="d1684140-79c2-4f36-9755-7127141107ee" containerID="78243512a1cf8d78e7a81e1b1b7902bad2f234d3b62d75a925ff890636b4fa09" exitCode=0 Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.570160 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" event={"ID":"d1684140-79c2-4f36-9755-7127141107ee","Type":"ContainerDied","Data":"78243512a1cf8d78e7a81e1b1b7902bad2f234d3b62d75a925ff890636b4fa09"} Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.573970 4755 generic.go:334] "Generic (PLEG): container finished" podID="2b8ead77-0763-41b7-8a6f-96f43f4d202b" containerID="d0c51436242d644ae2d3f6df75d794cb078867d9ca0c8872b3fb8770f317caa5" exitCode=0 Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 
00:45:01.574085 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-bdnb6" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.585038 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2" event={"ID":"2b8ead77-0763-41b7-8a6f-96f43f4d202b","Type":"ContainerDied","Data":"d0c51436242d644ae2d3f6df75d794cb078867d9ca0c8872b3fb8770f317caa5"} Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.585088 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2" event={"ID":"2b8ead77-0763-41b7-8a6f-96f43f4d202b","Type":"ContainerStarted","Data":"66a646c52385a5c5c170d1273f7b22c22bb2b2ca40145626b31b11139c5e6de2"} Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.691467 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.755376 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-57d7fc6d98-smgzp"] Mar 17 00:45:01 crc kubenswrapper[4755]: E0317 00:45:01.756249 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb8013b-627b-4894-945e-178871516870" containerName="placement-db-sync" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.756282 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb8013b-627b-4894-945e-178871516870" containerName="placement-db-sync" Mar 17 00:45:01 crc kubenswrapper[4755]: E0317 00:45:01.756302 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b76ade-1f20-43e7-bca3-cc0c70a05d4f" containerName="barbican-db-sync" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.756311 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b76ade-1f20-43e7-bca3-cc0c70a05d4f" containerName="barbican-db-sync" Mar 17 00:45:01 crc 
kubenswrapper[4755]: E0317 00:45:01.756339 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc260978-d229-43c0-b836-d1bb5a308c48" containerName="keystone-bootstrap" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.756738 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc260978-d229-43c0-b836-d1bb5a308c48" containerName="keystone-bootstrap" Mar 17 00:45:01 crc kubenswrapper[4755]: E0317 00:45:01.756775 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1684140-79c2-4f36-9755-7127141107ee" containerName="init" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.756785 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1684140-79c2-4f36-9755-7127141107ee" containerName="init" Mar 17 00:45:01 crc kubenswrapper[4755]: E0317 00:45:01.756802 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1684140-79c2-4f36-9755-7127141107ee" containerName="dnsmasq-dns" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.756812 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1684140-79c2-4f36-9755-7127141107ee" containerName="dnsmasq-dns" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.757052 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b76ade-1f20-43e7-bca3-cc0c70a05d4f" containerName="barbican-db-sync" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.757081 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbb8013b-627b-4894-945e-178871516870" containerName="placement-db-sync" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.757099 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1684140-79c2-4f36-9755-7127141107ee" containerName="dnsmasq-dns" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.757109 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc260978-d229-43c0-b836-d1bb5a308c48" containerName="keystone-bootstrap" Mar 17 00:45:01 crc 
kubenswrapper[4755]: I0317 00:45:01.760918 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.772830 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-57d7fc6d98-smgzp"] Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.773769 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8cwbq" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.773990 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.774105 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.774210 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.774307 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.774404 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.827124 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb6ks\" (UniqueName: \"kubernetes.io/projected/d1684140-79c2-4f36-9755-7127141107ee-kube-api-access-mb6ks\") pod \"d1684140-79c2-4f36-9755-7127141107ee\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.827341 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-ovsdbserver-sb\") pod \"d1684140-79c2-4f36-9755-7127141107ee\" (UID: 
\"d1684140-79c2-4f36-9755-7127141107ee\") " Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.827395 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-dns-swift-storage-0\") pod \"d1684140-79c2-4f36-9755-7127141107ee\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.827457 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-config\") pod \"d1684140-79c2-4f36-9755-7127141107ee\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.827498 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-ovsdbserver-nb\") pod \"d1684140-79c2-4f36-9755-7127141107ee\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.827543 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-dns-svc\") pod \"d1684140-79c2-4f36-9755-7127141107ee\" (UID: \"d1684140-79c2-4f36-9755-7127141107ee\") " Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.841615 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-55ff8ddfc6-wbxxp"] Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.877029 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1684140-79c2-4f36-9755-7127141107ee-kube-api-access-mb6ks" (OuterVolumeSpecName: "kube-api-access-mb6ks") pod "d1684140-79c2-4f36-9755-7127141107ee" (UID: "d1684140-79c2-4f36-9755-7127141107ee"). 
InnerVolumeSpecName "kube-api-access-mb6ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.904968 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.914940 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.915281 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.915507 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.915662 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.921392 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55ff8ddfc6-wbxxp"] Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.931716 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8chfh" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.932291 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-fernet-keys\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.932334 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vms2h\" (UniqueName: \"kubernetes.io/projected/b8c11156-3bb6-45fb-aea6-c00316f50ef4-kube-api-access-vms2h\") pod 
\"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.932410 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-internal-tls-certs\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.938725 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-credential-keys\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.947835 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-combined-ca-bundle\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.947922 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-public-tls-certs\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.947987 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-scripts\") 
pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.948006 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-config-data\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:01 crc kubenswrapper[4755]: I0317 00:45:01.994295 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-config" (OuterVolumeSpecName: "config") pod "d1684140-79c2-4f36-9755-7127141107ee" (UID: "d1684140-79c2-4f36-9755-7127141107ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:01.998919 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d1684140-79c2-4f36-9755-7127141107ee" (UID: "d1684140-79c2-4f36-9755-7127141107ee"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.000875 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7967fcdf5d-jqr46"] Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.002124 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d1684140-79c2-4f36-9755-7127141107ee" (UID: "d1684140-79c2-4f36-9755-7127141107ee"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.004115 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb6ks\" (UniqueName: \"kubernetes.io/projected/d1684140-79c2-4f36-9755-7127141107ee-kube-api-access-mb6ks\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.023273 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.050686 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7579bb5c6c-sb22k"] Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.056620 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.057091 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.057258 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ll7r8" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.067755 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d1684140-79c2-4f36-9755-7127141107ee" (UID: "d1684140-79c2-4f36-9755-7127141107ee"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.067792 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d1684140-79c2-4f36-9755-7127141107ee" (UID: "d1684140-79c2-4f36-9755-7127141107ee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.087654 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.089245 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.108236 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-fernet-keys\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.108282 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vms2h\" (UniqueName: \"kubernetes.io/projected/b8c11156-3bb6-45fb-aea6-c00316f50ef4-kube-api-access-vms2h\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.108308 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9dlj\" (UniqueName: \"kubernetes.io/projected/820d6f40-e3d1-4675-868b-7432d4b65006-kube-api-access-l9dlj\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " 
pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.108333 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-combined-ca-bundle\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.108350 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-internal-tls-certs\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.108380 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820d6f40-e3d1-4675-868b-7432d4b65006-logs\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.108403 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-internal-tls-certs\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.108423 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-credential-keys\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " 
pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.108458 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-config-data\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.108494 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-combined-ca-bundle\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.108524 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-public-tls-certs\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.108553 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-scripts\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.108574 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-config-data\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 
00:45:02.108715 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7967fcdf5d-jqr46"] Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.117629 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-scripts\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.117705 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-public-tls-certs\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.117824 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.118046 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.118070 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.118084 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 
00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.118096 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1684140-79c2-4f36-9755-7127141107ee-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.123541 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7579bb5c6c-sb22k"] Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.134495 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-scripts\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.136333 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-public-tls-certs\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.136353 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-config-data\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.139010 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-combined-ca-bundle\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.142924 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-credential-keys\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.143707 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-fernet-keys\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.144140 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8c11156-3bb6-45fb-aea6-c00316f50ef4-internal-tls-certs\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.161368 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vms2h\" (UniqueName: \"kubernetes.io/projected/b8c11156-3bb6-45fb-aea6-c00316f50ef4-kube-api-access-vms2h\") pod \"keystone-57d7fc6d98-smgzp\" (UID: \"b8c11156-3bb6-45fb-aea6-c00316f50ef4\") " pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.179533 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-nbd46"] Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.181261 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.202504 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-79df59b454-zg6ts"] Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.204305 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.219314 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-combined-ca-bundle\") pod \"barbican-worker-7579bb5c6c-sb22k\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.219354 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1c0eb85-d144-4980-a7c4-bdcd5f515053-logs\") pod \"barbican-keystone-listener-7967fcdf5d-jqr46\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.219381 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820d6f40-e3d1-4675-868b-7432d4b65006-logs\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.219418 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c98c94a5-456b-4b83-ab36-2d56dce423cf-logs\") pod \"barbican-worker-7579bb5c6c-sb22k\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " 
pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.219480 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-config-data\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.219506 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-config-data-custom\") pod \"barbican-keystone-listener-7967fcdf5d-jqr46\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.219532 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tt6v\" (UniqueName: \"kubernetes.io/projected/e1c0eb85-d144-4980-a7c4-bdcd5f515053-kube-api-access-7tt6v\") pod \"barbican-keystone-listener-7967fcdf5d-jqr46\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.219555 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-config-data-custom\") pod \"barbican-worker-7579bb5c6c-sb22k\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.219579 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-config-data\") pod \"barbican-worker-7579bb5c6c-sb22k\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.219638 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-scripts\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.219656 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-config-data\") pod \"barbican-keystone-listener-7967fcdf5d-jqr46\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.219671 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twtgw\" (UniqueName: \"kubernetes.io/projected/c98c94a5-456b-4b83-ab36-2d56dce423cf-kube-api-access-twtgw\") pod \"barbican-worker-7579bb5c6c-sb22k\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.219690 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-public-tls-certs\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.219711 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-combined-ca-bundle\") pod \"barbican-keystone-listener-7967fcdf5d-jqr46\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.219749 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9dlj\" (UniqueName: \"kubernetes.io/projected/820d6f40-e3d1-4675-868b-7432d4b65006-kube-api-access-l9dlj\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.219767 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-combined-ca-bundle\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.219782 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-internal-tls-certs\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.222301 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820d6f40-e3d1-4675-868b-7432d4b65006-logs\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.238004 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-scripts\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.239310 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7b59c459f5-btmbt"] Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.240996 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7b59c459f5-btmbt" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.244414 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-internal-tls-certs\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.253800 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9dlj\" (UniqueName: \"kubernetes.io/projected/820d6f40-e3d1-4675-868b-7432d4b65006-kube-api-access-l9dlj\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.270190 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-public-tls-certs\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.271197 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-combined-ca-bundle\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: 
\"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.271877 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-config-data\") pod \"placement-55ff8ddfc6-wbxxp\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.304937 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-79df59b454-zg6ts"] Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.312580 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-nbd46"] Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327027 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-config-data\") pod \"barbican-worker-7579bb5c6c-sb22k\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327072 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327124 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08712ad9-353f-4d69-aa69-87586a0b9ee3-config-data-custom\") pod \"barbican-keystone-listener-79df59b454-zg6ts\" (UID: \"08712ad9-353f-4d69-aa69-87586a0b9ee3\") " 
pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327149 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bxkq\" (UniqueName: \"kubernetes.io/projected/08712ad9-353f-4d69-aa69-87586a0b9ee3-kube-api-access-5bxkq\") pod \"barbican-keystone-listener-79df59b454-zg6ts\" (UID: \"08712ad9-353f-4d69-aa69-87586a0b9ee3\") " pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327177 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-config-data\") pod \"barbican-keystone-listener-7967fcdf5d-jqr46\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327195 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twtgw\" (UniqueName: \"kubernetes.io/projected/c98c94a5-456b-4b83-ab36-2d56dce423cf-kube-api-access-twtgw\") pod \"barbican-worker-7579bb5c6c-sb22k\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327209 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08712ad9-353f-4d69-aa69-87586a0b9ee3-logs\") pod \"barbican-keystone-listener-79df59b454-zg6ts\" (UID: \"08712ad9-353f-4d69-aa69-87586a0b9ee3\") " pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327228 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/08712ad9-353f-4d69-aa69-87586a0b9ee3-config-data\") pod \"barbican-keystone-listener-79df59b454-zg6ts\" (UID: \"08712ad9-353f-4d69-aa69-87586a0b9ee3\") " pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327254 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-combined-ca-bundle\") pod \"barbican-keystone-listener-7967fcdf5d-jqr46\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327271 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327287 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08712ad9-353f-4d69-aa69-87586a0b9ee3-combined-ca-bundle\") pod \"barbican-keystone-listener-79df59b454-zg6ts\" (UID: \"08712ad9-353f-4d69-aa69-87586a0b9ee3\") " pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327316 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s77d\" (UniqueName: \"kubernetes.io/projected/2f456494-50c9-440c-b099-15b315cb246d-kube-api-access-7s77d\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327348 
4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-combined-ca-bundle\") pod \"barbican-worker-7579bb5c6c-sb22k\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327364 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1c0eb85-d144-4980-a7c4-bdcd5f515053-logs\") pod \"barbican-keystone-listener-7967fcdf5d-jqr46\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327384 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-config\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327405 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c98c94a5-456b-4b83-ab36-2d56dce423cf-logs\") pod \"barbican-worker-7579bb5c6c-sb22k\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327482 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-config-data-custom\") pod \"barbican-keystone-listener-7967fcdf5d-jqr46\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327507 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327530 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tt6v\" (UniqueName: \"kubernetes.io/projected/e1c0eb85-d144-4980-a7c4-bdcd5f515053-kube-api-access-7tt6v\") pod \"barbican-keystone-listener-7967fcdf5d-jqr46\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327548 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.327566 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-config-data-custom\") pod \"barbican-worker-7579bb5c6c-sb22k\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.352237 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1c0eb85-d144-4980-a7c4-bdcd5f515053-logs\") pod \"barbican-keystone-listener-7967fcdf5d-jqr46\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:02 crc 
kubenswrapper[4755]: I0317 00:45:02.353933 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-combined-ca-bundle\") pod \"barbican-worker-7579bb5c6c-sb22k\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.355336 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-config-data-custom\") pod \"barbican-keystone-listener-7967fcdf5d-jqr46\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.356177 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-config-data\") pod \"barbican-worker-7579bb5c6c-sb22k\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.374874 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b59c459f5-btmbt"] Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.376885 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-combined-ca-bundle\") pod \"barbican-keystone-listener-7967fcdf5d-jqr46\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.377184 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c98c94a5-456b-4b83-ab36-2d56dce423cf-logs\") pod 
\"barbican-worker-7579bb5c6c-sb22k\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.378336 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.385684 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-config-data-custom\") pod \"barbican-worker-7579bb5c6c-sb22k\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.388384 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twtgw\" (UniqueName: \"kubernetes.io/projected/c98c94a5-456b-4b83-ab36-2d56dce423cf-kube-api-access-twtgw\") pod \"barbican-worker-7579bb5c6c-sb22k\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.396817 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.399017 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tt6v\" (UniqueName: \"kubernetes.io/projected/e1c0eb85-d144-4980-a7c4-bdcd5f515053-kube-api-access-7tt6v\") pod \"barbican-keystone-listener-7967fcdf5d-jqr46\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.421749 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-config-data\") pod \"barbican-keystone-listener-7967fcdf5d-jqr46\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.429694 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08712ad9-353f-4d69-aa69-87586a0b9ee3-config-data-custom\") pod \"barbican-keystone-listener-79df59b454-zg6ts\" (UID: \"08712ad9-353f-4d69-aa69-87586a0b9ee3\") " pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.429758 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bxkq\" (UniqueName: \"kubernetes.io/projected/08712ad9-353f-4d69-aa69-87586a0b9ee3-kube-api-access-5bxkq\") pod \"barbican-keystone-listener-79df59b454-zg6ts\" (UID: \"08712ad9-353f-4d69-aa69-87586a0b9ee3\") " pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.429818 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08712ad9-353f-4d69-aa69-87586a0b9ee3-logs\") pod 
\"barbican-keystone-listener-79df59b454-zg6ts\" (UID: \"08712ad9-353f-4d69-aa69-87586a0b9ee3\") " pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.429840 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08712ad9-353f-4d69-aa69-87586a0b9ee3-config-data\") pod \"barbican-keystone-listener-79df59b454-zg6ts\" (UID: \"08712ad9-353f-4d69-aa69-87586a0b9ee3\") " pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.429882 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.429906 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08712ad9-353f-4d69-aa69-87586a0b9ee3-combined-ca-bundle\") pod \"barbican-keystone-listener-79df59b454-zg6ts\" (UID: \"08712ad9-353f-4d69-aa69-87586a0b9ee3\") " pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.429955 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s77d\" (UniqueName: \"kubernetes.io/projected/2f456494-50c9-440c-b099-15b315cb246d-kube-api-access-7s77d\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.429988 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/32b26d97-7256-4841-819f-2a2ee7ff2e3b-config-data-custom\") pod \"barbican-worker-7b59c459f5-btmbt\" (UID: \"32b26d97-7256-4841-819f-2a2ee7ff2e3b\") " pod="openstack/barbican-worker-7b59c459f5-btmbt" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.430072 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5qfk\" (UniqueName: \"kubernetes.io/projected/32b26d97-7256-4841-819f-2a2ee7ff2e3b-kube-api-access-n5qfk\") pod \"barbican-worker-7b59c459f5-btmbt\" (UID: \"32b26d97-7256-4841-819f-2a2ee7ff2e3b\") " pod="openstack/barbican-worker-7b59c459f5-btmbt" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.430098 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-config\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.430148 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32b26d97-7256-4841-819f-2a2ee7ff2e3b-logs\") pod \"barbican-worker-7b59c459f5-btmbt\" (UID: \"32b26d97-7256-4841-819f-2a2ee7ff2e3b\") " pod="openstack/barbican-worker-7b59c459f5-btmbt" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.430179 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b26d97-7256-4841-819f-2a2ee7ff2e3b-combined-ca-bundle\") pod \"barbican-worker-7b59c459f5-btmbt\" (UID: \"32b26d97-7256-4841-819f-2a2ee7ff2e3b\") " pod="openstack/barbican-worker-7b59c459f5-btmbt" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.430216 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.430246 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b26d97-7256-4841-819f-2a2ee7ff2e3b-config-data\") pod \"barbican-worker-7b59c459f5-btmbt\" (UID: \"32b26d97-7256-4841-819f-2a2ee7ff2e3b\") " pod="openstack/barbican-worker-7b59c459f5-btmbt" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.430267 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.430316 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.431267 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.437305 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/08712ad9-353f-4d69-aa69-87586a0b9ee3-logs\") pod \"barbican-keystone-listener-79df59b454-zg6ts\" (UID: \"08712ad9-353f-4d69-aa69-87586a0b9ee3\") " pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.445242 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.446626 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.447503 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08712ad9-353f-4d69-aa69-87586a0b9ee3-config-data\") pod \"barbican-keystone-listener-79df59b454-zg6ts\" (UID: \"08712ad9-353f-4d69-aa69-87586a0b9ee3\") " pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.447630 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-config\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.447696 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.448498 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.456630 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08712ad9-353f-4d69-aa69-87586a0b9ee3-combined-ca-bundle\") pod \"barbican-keystone-listener-79df59b454-zg6ts\" (UID: \"08712ad9-353f-4d69-aa69-87586a0b9ee3\") " pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.478203 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08712ad9-353f-4d69-aa69-87586a0b9ee3-config-data-custom\") pod \"barbican-keystone-listener-79df59b454-zg6ts\" (UID: \"08712ad9-353f-4d69-aa69-87586a0b9ee3\") " pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.478715 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.504017 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5f66764f8d-z7959"] Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.505759 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.531879 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32b26d97-7256-4841-819f-2a2ee7ff2e3b-config-data-custom\") pod \"barbican-worker-7b59c459f5-btmbt\" (UID: \"32b26d97-7256-4841-819f-2a2ee7ff2e3b\") " pod="openstack/barbican-worker-7b59c459f5-btmbt" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.531933 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5qfk\" (UniqueName: \"kubernetes.io/projected/32b26d97-7256-4841-819f-2a2ee7ff2e3b-kube-api-access-n5qfk\") pod \"barbican-worker-7b59c459f5-btmbt\" (UID: \"32b26d97-7256-4841-819f-2a2ee7ff2e3b\") " pod="openstack/barbican-worker-7b59c459f5-btmbt" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.531969 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32b26d97-7256-4841-819f-2a2ee7ff2e3b-logs\") pod \"barbican-worker-7b59c459f5-btmbt\" (UID: \"32b26d97-7256-4841-819f-2a2ee7ff2e3b\") " pod="openstack/barbican-worker-7b59c459f5-btmbt" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.531988 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b26d97-7256-4841-819f-2a2ee7ff2e3b-combined-ca-bundle\") pod \"barbican-worker-7b59c459f5-btmbt\" (UID: \"32b26d97-7256-4841-819f-2a2ee7ff2e3b\") " pod="openstack/barbican-worker-7b59c459f5-btmbt" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.532024 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b26d97-7256-4841-819f-2a2ee7ff2e3b-config-data\") pod \"barbican-worker-7b59c459f5-btmbt\" (UID: \"32b26d97-7256-4841-819f-2a2ee7ff2e3b\") " 
pod="openstack/barbican-worker-7b59c459f5-btmbt" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.534203 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32b26d97-7256-4841-819f-2a2ee7ff2e3b-logs\") pod \"barbican-worker-7b59c459f5-btmbt\" (UID: \"32b26d97-7256-4841-819f-2a2ee7ff2e3b\") " pod="openstack/barbican-worker-7b59c459f5-btmbt" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.537714 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5f66764f8d-z7959"] Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.544128 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b26d97-7256-4841-819f-2a2ee7ff2e3b-config-data\") pod \"barbican-worker-7b59c459f5-btmbt\" (UID: \"32b26d97-7256-4841-819f-2a2ee7ff2e3b\") " pod="openstack/barbican-worker-7b59c459f5-btmbt" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.563231 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bxkq\" (UniqueName: \"kubernetes.io/projected/08712ad9-353f-4d69-aa69-87586a0b9ee3-kube-api-access-5bxkq\") pod \"barbican-keystone-listener-79df59b454-zg6ts\" (UID: \"08712ad9-353f-4d69-aa69-87586a0b9ee3\") " pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.563691 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32b26d97-7256-4841-819f-2a2ee7ff2e3b-config-data-custom\") pod \"barbican-worker-7b59c459f5-btmbt\" (UID: \"32b26d97-7256-4841-819f-2a2ee7ff2e3b\") " pod="openstack/barbican-worker-7b59c459f5-btmbt" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.573297 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5qfk\" (UniqueName: 
\"kubernetes.io/projected/32b26d97-7256-4841-819f-2a2ee7ff2e3b-kube-api-access-n5qfk\") pod \"barbican-worker-7b59c459f5-btmbt\" (UID: \"32b26d97-7256-4841-819f-2a2ee7ff2e3b\") " pod="openstack/barbican-worker-7b59c459f5-btmbt" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.582244 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.634851 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-scripts\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.636384 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-config-data\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.636535 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c2t6\" (UniqueName: \"kubernetes.io/projected/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-kube-api-access-9c2t6\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.636624 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-public-tls-certs\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " 
pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.636672 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-internal-tls-certs\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.636751 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-logs\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.636811 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-combined-ca-bundle\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.638037 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b26d97-7256-4841-819f-2a2ee7ff2e3b-combined-ca-bundle\") pod \"barbican-worker-7b59c459f5-btmbt\" (UID: \"32b26d97-7256-4841-819f-2a2ee7ff2e3b\") " pod="openstack/barbican-worker-7b59c459f5-btmbt" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.638265 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s77d\" (UniqueName: \"kubernetes.io/projected/2f456494-50c9-440c-b099-15b315cb246d-kube-api-access-7s77d\") pod \"dnsmasq-dns-75c8ddd69c-nbd46\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " 
pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.681477 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-57f5dc57cd-47npv"] Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.683012 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.687229 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.703855 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57f5dc57cd-47npv"] Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.709937 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.710120 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-82q9v" event={"ID":"d1684140-79c2-4f36-9755-7127141107ee","Type":"ContainerDied","Data":"7062e66d8277c94d137d5e13f22df99f1ee5f726ae527c793909f794994ea37f"} Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.710181 4755 scope.go:117] "RemoveContainer" containerID="78243512a1cf8d78e7a81e1b1b7902bad2f234d3b62d75a925ff890636b4fa09" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.746892 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-public-tls-certs\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.746931 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-internal-tls-certs\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.746969 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-config-data-custom\") pod \"barbican-api-57f5dc57cd-47npv\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.747002 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-config-data\") pod \"barbican-api-57f5dc57cd-47npv\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.747042 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-logs\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.747076 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-combined-ca-bundle\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.747101 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/16fedfbc-d1c9-4059-ace9-d15f9bc61053-logs\") pod \"barbican-api-57f5dc57cd-47npv\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.747173 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-scripts\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.747190 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-config-data\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.747223 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-combined-ca-bundle\") pod \"barbican-api-57f5dc57cd-47npv\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.747287 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv7qb\" (UniqueName: \"kubernetes.io/projected/16fedfbc-d1c9-4059-ace9-d15f9bc61053-kube-api-access-qv7qb\") pod \"barbican-api-57f5dc57cd-47npv\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.747309 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c2t6\" (UniqueName: 
\"kubernetes.io/projected/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-kube-api-access-9c2t6\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.751404 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-logs\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.774513 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-combined-ca-bundle\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.780643 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-public-tls-certs\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.780892 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-scripts\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.781227 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-internal-tls-certs\") pod \"placement-5f66764f8d-z7959\" (UID: 
\"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.792253 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c2t6\" (UniqueName: \"kubernetes.io/projected/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-kube-api-access-9c2t6\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.800802 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a-config-data\") pod \"placement-5f66764f8d-z7959\" (UID: \"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a\") " pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.808906 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.817668 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-82q9v"] Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.823054 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-82q9v"] Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.849562 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-combined-ca-bundle\") pod \"barbican-api-57f5dc57cd-47npv\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.849625 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv7qb\" (UniqueName: 
\"kubernetes.io/projected/16fedfbc-d1c9-4059-ace9-d15f9bc61053-kube-api-access-qv7qb\") pod \"barbican-api-57f5dc57cd-47npv\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.849687 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-config-data-custom\") pod \"barbican-api-57f5dc57cd-47npv\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.849703 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-config-data\") pod \"barbican-api-57f5dc57cd-47npv\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.849761 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fedfbc-d1c9-4059-ace9-d15f9bc61053-logs\") pod \"barbican-api-57f5dc57cd-47npv\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.850273 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fedfbc-d1c9-4059-ace9-d15f9bc61053-logs\") pod \"barbican-api-57f5dc57cd-47npv\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.877175 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-combined-ca-bundle\") pod 
\"barbican-api-57f5dc57cd-47npv\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.891543 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-config-data\") pod \"barbican-api-57f5dc57cd-47npv\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.908012 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv7qb\" (UniqueName: \"kubernetes.io/projected/16fedfbc-d1c9-4059-ace9-d15f9bc61053-kube-api-access-qv7qb\") pod \"barbican-api-57f5dc57cd-47npv\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.908094 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-config-data-custom\") pod \"barbican-api-57f5dc57cd-47npv\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.920420 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7b59c459f5-btmbt" Mar 17 00:45:02 crc kubenswrapper[4755]: I0317 00:45:02.972934 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.049879 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.508278 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-57d7fc6d98-smgzp"] Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.539226 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55ff8ddfc6-wbxxp"] Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.603887 4755 scope.go:117] "RemoveContainer" containerID="f35b6cab4b5eb56e0e5e5a52eb208f5ac9f82ca8071a9ae07422d623a4ab5712" Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.648637 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7579bb5c6c-sb22k"] Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.738406 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2" Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.775486 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-57d7fc6d98-smgzp" event={"ID":"b8c11156-3bb6-45fb-aea6-c00316f50ef4","Type":"ContainerStarted","Data":"4f1b6fda6076d98b769befd587cc1a24ab7e3c0db84b3d97106b8297d200f192"} Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.778700 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7967fcdf5d-jqr46"] Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.794591 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2" event={"ID":"2b8ead77-0763-41b7-8a6f-96f43f4d202b","Type":"ContainerDied","Data":"66a646c52385a5c5c170d1273f7b22c22bb2b2ca40145626b31b11139c5e6de2"} Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.794632 4755 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="66a646c52385a5c5c170d1273f7b22c22bb2b2ca40145626b31b11139c5e6de2" Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.794699 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2" Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.796383 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7579bb5c6c-sb22k" event={"ID":"c98c94a5-456b-4b83-ab36-2d56dce423cf","Type":"ContainerStarted","Data":"774008ef0b82fdf129c6f3da8a66853d50e286c9432efd67896032413e6caab0"} Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.796952 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-79df59b454-zg6ts"] Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.797450 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55ff8ddfc6-wbxxp" event={"ID":"820d6f40-e3d1-4675-868b-7432d4b65006","Type":"ContainerStarted","Data":"39d308770b5677998acf57d2d1aa89391d1f3815ef9ed058c2b52d43ef5eee3a"} Mar 17 00:45:03 crc kubenswrapper[4755]: W0317 00:45:03.823933 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08712ad9_353f_4d69_aa69_87586a0b9ee3.slice/crio-3387737cff815fd58b4b6bd677404be0ac4e4d49b1f27b6b759ee548de998710 WatchSource:0}: Error finding container 3387737cff815fd58b4b6bd677404be0ac4e4d49b1f27b6b759ee548de998710: Status 404 returned error can't find the container with id 3387737cff815fd58b4b6bd677404be0ac4e4d49b1f27b6b759ee548de998710 Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.884189 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b8ead77-0763-41b7-8a6f-96f43f4d202b-config-volume\") pod \"2b8ead77-0763-41b7-8a6f-96f43f4d202b\" (UID: 
\"2b8ead77-0763-41b7-8a6f-96f43f4d202b\") " Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.885398 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b8ead77-0763-41b7-8a6f-96f43f4d202b-secret-volume\") pod \"2b8ead77-0763-41b7-8a6f-96f43f4d202b\" (UID: \"2b8ead77-0763-41b7-8a6f-96f43f4d202b\") " Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.885460 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wptv9\" (UniqueName: \"kubernetes.io/projected/2b8ead77-0763-41b7-8a6f-96f43f4d202b-kube-api-access-wptv9\") pod \"2b8ead77-0763-41b7-8a6f-96f43f4d202b\" (UID: \"2b8ead77-0763-41b7-8a6f-96f43f4d202b\") " Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.885305 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b8ead77-0763-41b7-8a6f-96f43f4d202b-config-volume" (OuterVolumeSpecName: "config-volume") pod "2b8ead77-0763-41b7-8a6f-96f43f4d202b" (UID: "2b8ead77-0763-41b7-8a6f-96f43f4d202b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.921639 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b8ead77-0763-41b7-8a6f-96f43f4d202b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2b8ead77-0763-41b7-8a6f-96f43f4d202b" (UID: "2b8ead77-0763-41b7-8a6f-96f43f4d202b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.922220 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b8ead77-0763-41b7-8a6f-96f43f4d202b-kube-api-access-wptv9" (OuterVolumeSpecName: "kube-api-access-wptv9") pod "2b8ead77-0763-41b7-8a6f-96f43f4d202b" (UID: "2b8ead77-0763-41b7-8a6f-96f43f4d202b"). 
InnerVolumeSpecName "kube-api-access-wptv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.990995 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b8ead77-0763-41b7-8a6f-96f43f4d202b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.991055 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b8ead77-0763-41b7-8a6f-96f43f4d202b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:03 crc kubenswrapper[4755]: I0317 00:45:03.991065 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wptv9\" (UniqueName: \"kubernetes.io/projected/2b8ead77-0763-41b7-8a6f-96f43f4d202b-kube-api-access-wptv9\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.195899 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-nbd46"] Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.281183 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1684140-79c2-4f36-9755-7127141107ee" path="/var/lib/kubelet/pods/d1684140-79c2-4f36-9755-7127141107ee/volumes" Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.351993 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b59c459f5-btmbt"] Mar 17 00:45:04 crc kubenswrapper[4755]: W0317 00:45:04.423733 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32b26d97_7256_4841_819f_2a2ee7ff2e3b.slice/crio-c3013dda415dc15889035b71e498d349ca94dd08a09a77e9c5f4169123362083 WatchSource:0}: Error finding container c3013dda415dc15889035b71e498d349ca94dd08a09a77e9c5f4169123362083: Status 404 returned error can't find the container with id 
c3013dda415dc15889035b71e498d349ca94dd08a09a77e9c5f4169123362083 Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.475946 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57f5dc57cd-47npv"] Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.491491 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5f66764f8d-z7959"] Mar 17 00:45:04 crc kubenswrapper[4755]: W0317 00:45:04.498674 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabf5592f_34fd_41f6_bc9f_b4bdb8ceff4a.slice/crio-dab1d1db91cc503bcc3d20dd28d73beb032358f96e508277784ad88557a46378 WatchSource:0}: Error finding container dab1d1db91cc503bcc3d20dd28d73beb032358f96e508277784ad88557a46378: Status 404 returned error can't find the container with id dab1d1db91cc503bcc3d20dd28d73beb032358f96e508277784ad88557a46378 Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.840023 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57f5dc57cd-47npv" event={"ID":"16fedfbc-d1c9-4059-ace9-d15f9bc61053","Type":"ContainerStarted","Data":"da78940285286c83ff778b7bbcd9c9c3710f34ed475fe732b424607a3c30eb69"} Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.842190 4755 generic.go:334] "Generic (PLEG): container finished" podID="2f456494-50c9-440c-b099-15b315cb246d" containerID="030eeadfdbde2dcf97adc60de3d5decf4f9f6f7bc6c153a7382b08122bc054a7" exitCode=0 Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.842264 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" event={"ID":"2f456494-50c9-440c-b099-15b315cb246d","Type":"ContainerDied","Data":"030eeadfdbde2dcf97adc60de3d5decf4f9f6f7bc6c153a7382b08122bc054a7"} Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.842294 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" 
event={"ID":"2f456494-50c9-440c-b099-15b315cb246d","Type":"ContainerStarted","Data":"b8c74b1b74b7b3dc064a31687a8625e30bc7dd36c4bafc45901c137a9cefc4e0"} Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.863507 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55ff8ddfc6-wbxxp" event={"ID":"820d6f40-e3d1-4675-868b-7432d4b65006","Type":"ContainerStarted","Data":"aa54add94c1142e3128938b34e38c3077c498a18264f5c560848b329539f5f7c"} Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.863542 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55ff8ddfc6-wbxxp" event={"ID":"820d6f40-e3d1-4675-868b-7432d4b65006","Type":"ContainerStarted","Data":"e327396f236ac105997c2095af14ff7d7be015d0d51fdd91e9f57cffa85d2568"} Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.864110 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.864266 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.866416 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" event={"ID":"08712ad9-353f-4d69-aa69-87586a0b9ee3","Type":"ContainerStarted","Data":"3387737cff815fd58b4b6bd677404be0ac4e4d49b1f27b6b759ee548de998710"} Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.870038 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" event={"ID":"e1c0eb85-d144-4980-a7c4-bdcd5f515053","Type":"ContainerStarted","Data":"1248d1196fb887844c7b40f4d75617bf14ce73074192fcfd661a001007427461"} Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.877739 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b59c459f5-btmbt" 
event={"ID":"32b26d97-7256-4841-819f-2a2ee7ff2e3b","Type":"ContainerStarted","Data":"c3013dda415dc15889035b71e498d349ca94dd08a09a77e9c5f4169123362083"} Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.883801 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-57d7fc6d98-smgzp" event={"ID":"b8c11156-3bb6-45fb-aea6-c00316f50ef4","Type":"ContainerStarted","Data":"bb76289130aff68f76f3010795e23b3449f104702c808f24526af1391bffed5c"} Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.885000 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.909656 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f66764f8d-z7959" event={"ID":"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a","Type":"ContainerStarted","Data":"dab1d1db91cc503bcc3d20dd28d73beb032358f96e508277784ad88557a46378"} Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.917267 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-55ff8ddfc6-wbxxp" podStartSLOduration=3.917244914 podStartE2EDuration="3.917244914s" podCreationTimestamp="2026-03-17 00:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:04.898652505 +0000 UTC m=+1379.658104788" watchObservedRunningTime="2026-03-17 00:45:04.917244914 +0000 UTC m=+1379.676697197" Mar 17 00:45:04 crc kubenswrapper[4755]: I0317 00:45:04.962824 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-57d7fc6d98-smgzp" podStartSLOduration=3.962800653 podStartE2EDuration="3.962800653s" podCreationTimestamp="2026-03-17 00:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:04.941655017 +0000 UTC 
m=+1379.701107300" watchObservedRunningTime="2026-03-17 00:45:04.962800653 +0000 UTC m=+1379.722252936" Mar 17 00:45:05 crc kubenswrapper[4755]: I0317 00:45:05.940721 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-gdjsh" event={"ID":"2315f493-9035-4185-b615-e7eed6a246ea","Type":"ContainerStarted","Data":"f0631b380191fd12a56a5aafc1976c097929edd1d24bc1fda940c77bb0b04afe"} Mar 17 00:45:05 crc kubenswrapper[4755]: I0317 00:45:05.955220 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f66764f8d-z7959" event={"ID":"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a","Type":"ContainerStarted","Data":"dc53d677d2f058d6d209e1d9b977e811bb2502c8d394e1eaa3ee0450a66a10c6"} Mar 17 00:45:05 crc kubenswrapper[4755]: I0317 00:45:05.955284 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f66764f8d-z7959" event={"ID":"abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a","Type":"ContainerStarted","Data":"21e01adaa6466781935e1506265fa3194d4c91ef5fa3955a2578014c10fbfe51"} Mar 17 00:45:05 crc kubenswrapper[4755]: I0317 00:45:05.955612 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:05 crc kubenswrapper[4755]: I0317 00:45:05.955796 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:05 crc kubenswrapper[4755]: I0317 00:45:05.968006 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-gdjsh" podStartSLOduration=3.552427963 podStartE2EDuration="47.967978054s" podCreationTimestamp="2026-03-17 00:44:18 +0000 UTC" firstStartedPulling="2026-03-17 00:44:20.074145262 +0000 UTC m=+1334.833597545" lastFinishedPulling="2026-03-17 00:45:04.489695353 +0000 UTC m=+1379.249147636" observedRunningTime="2026-03-17 00:45:05.966899405 +0000 UTC m=+1380.726351678" watchObservedRunningTime="2026-03-17 00:45:05.967978054 +0000 UTC 
m=+1380.727430337" Mar 17 00:45:05 crc kubenswrapper[4755]: I0317 00:45:05.969997 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57f5dc57cd-47npv" event={"ID":"16fedfbc-d1c9-4059-ace9-d15f9bc61053","Type":"ContainerStarted","Data":"8ff8087f65dccb70325e05d6f0c0c9881bea2368a409bd5e68d5775e4185d365"} Mar 17 00:45:05 crc kubenswrapper[4755]: I0317 00:45:05.970039 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57f5dc57cd-47npv" event={"ID":"16fedfbc-d1c9-4059-ace9-d15f9bc61053","Type":"ContainerStarted","Data":"224be010f298023e764e3653530c8da6b766380a4164441c1250b0f7973c1a51"} Mar 17 00:45:05 crc kubenswrapper[4755]: I0317 00:45:05.970857 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:05 crc kubenswrapper[4755]: I0317 00:45:05.970890 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:05 crc kubenswrapper[4755]: I0317 00:45:05.990117 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" event={"ID":"2f456494-50c9-440c-b099-15b315cb246d","Type":"ContainerStarted","Data":"5244ab0c426eff1a2ca9572ab94353395075a57e018a6e331aea9d5f8d2629b2"} Mar 17 00:45:05 crc kubenswrapper[4755]: I0317 00:45:05.990673 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:05 crc kubenswrapper[4755]: I0317 00:45:05.994700 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5f66764f8d-z7959" podStartSLOduration=3.994681249 podStartE2EDuration="3.994681249s" podCreationTimestamp="2026-03-17 00:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:05.988495444 +0000 UTC m=+1380.747947727" 
watchObservedRunningTime="2026-03-17 00:45:05.994681249 +0000 UTC m=+1380.754133532" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.016405 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-57f5dc57cd-47npv" podStartSLOduration=4.016389101 podStartE2EDuration="4.016389101s" podCreationTimestamp="2026-03-17 00:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:06.014001696 +0000 UTC m=+1380.773453979" watchObservedRunningTime="2026-03-17 00:45:06.016389101 +0000 UTC m=+1380.775841384" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.039870 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" podStartSLOduration=4.039851559 podStartE2EDuration="4.039851559s" podCreationTimestamp="2026-03-17 00:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:06.035885463 +0000 UTC m=+1380.795337746" watchObservedRunningTime="2026-03-17 00:45:06.039851559 +0000 UTC m=+1380.799303842" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.102790 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-75cc5bb54-mqs8w"] Mar 17 00:45:06 crc kubenswrapper[4755]: E0317 00:45:06.103226 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8ead77-0763-41b7-8a6f-96f43f4d202b" containerName="collect-profiles" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.103241 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8ead77-0763-41b7-8a6f-96f43f4d202b" containerName="collect-profiles" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.103430 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b8ead77-0763-41b7-8a6f-96f43f4d202b" containerName="collect-profiles" Mar 17 
00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.104522 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.112763 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.112976 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.137238 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75cc5bb54-mqs8w"] Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.165197 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e975409f-81b6-4bcd-aec0-00f942eae3bd-logs\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.165274 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e975409f-81b6-4bcd-aec0-00f942eae3bd-config-data\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.165333 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q67f8\" (UniqueName: \"kubernetes.io/projected/e975409f-81b6-4bcd-aec0-00f942eae3bd-kube-api-access-q67f8\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.165353 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e975409f-81b6-4bcd-aec0-00f942eae3bd-internal-tls-certs\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.165410 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e975409f-81b6-4bcd-aec0-00f942eae3bd-combined-ca-bundle\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.165462 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e975409f-81b6-4bcd-aec0-00f942eae3bd-public-tls-certs\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.165486 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e975409f-81b6-4bcd-aec0-00f942eae3bd-config-data-custom\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.270554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e975409f-81b6-4bcd-aec0-00f942eae3bd-combined-ca-bundle\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 
00:45:06.270627 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e975409f-81b6-4bcd-aec0-00f942eae3bd-public-tls-certs\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.270659 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e975409f-81b6-4bcd-aec0-00f942eae3bd-config-data-custom\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.270692 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e975409f-81b6-4bcd-aec0-00f942eae3bd-logs\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.270736 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e975409f-81b6-4bcd-aec0-00f942eae3bd-config-data\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.270787 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e975409f-81b6-4bcd-aec0-00f942eae3bd-internal-tls-certs\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.270805 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q67f8\" (UniqueName: \"kubernetes.io/projected/e975409f-81b6-4bcd-aec0-00f942eae3bd-kube-api-access-q67f8\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.271782 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e975409f-81b6-4bcd-aec0-00f942eae3bd-logs\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.277090 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e975409f-81b6-4bcd-aec0-00f942eae3bd-config-data-custom\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.277404 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.278475 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e975409f-81b6-4bcd-aec0-00f942eae3bd-combined-ca-bundle\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.279705 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.286181 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e975409f-81b6-4bcd-aec0-00f942eae3bd-config-data\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.296579 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q67f8\" (UniqueName: \"kubernetes.io/projected/e975409f-81b6-4bcd-aec0-00f942eae3bd-kube-api-access-q67f8\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.298239 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e975409f-81b6-4bcd-aec0-00f942eae3bd-internal-tls-certs\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.309083 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e975409f-81b6-4bcd-aec0-00f942eae3bd-public-tls-certs\") pod \"barbican-api-75cc5bb54-mqs8w\" (UID: \"e975409f-81b6-4bcd-aec0-00f942eae3bd\") " pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:06 crc kubenswrapper[4755]: I0317 00:45:06.438655 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:07 crc kubenswrapper[4755]: I0317 00:45:07.002491 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-v7jsc" event={"ID":"404d7b5a-9c59-4c63-b3be-740554b83374","Type":"ContainerStarted","Data":"e5fe31b2f16c24183bd88dee6944e309af9190e1fbb6423f2b4e81797f8d3670"} Mar 17 00:45:07 crc kubenswrapper[4755]: I0317 00:45:07.023804 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-v7jsc" podStartSLOduration=4.684035418 podStartE2EDuration="49.02378702s" podCreationTimestamp="2026-03-17 00:44:18 +0000 UTC" firstStartedPulling="2026-03-17 00:44:20.077777108 +0000 UTC m=+1334.837229391" lastFinishedPulling="2026-03-17 00:45:04.41752871 +0000 UTC m=+1379.176980993" observedRunningTime="2026-03-17 00:45:07.019913807 +0000 UTC m=+1381.779366090" watchObservedRunningTime="2026-03-17 00:45:07.02378702 +0000 UTC m=+1381.783239293" Mar 17 00:45:07 crc kubenswrapper[4755]: I0317 00:45:07.942764 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75cc5bb54-mqs8w"] Mar 17 00:45:07 crc kubenswrapper[4755]: W0317 00:45:07.958637 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode975409f_81b6_4bcd_aec0_00f942eae3bd.slice/crio-7f9cd56b64ae9a1b5d8597844ea3f138467badd9abfa259f5714211055fca2dd WatchSource:0}: Error finding container 7f9cd56b64ae9a1b5d8597844ea3f138467badd9abfa259f5714211055fca2dd: Status 404 returned error can't find the container with id 7f9cd56b64ae9a1b5d8597844ea3f138467badd9abfa259f5714211055fca2dd Mar 17 00:45:08 crc kubenswrapper[4755]: I0317 00:45:08.049372 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b59c459f5-btmbt" 
event={"ID":"32b26d97-7256-4841-819f-2a2ee7ff2e3b","Type":"ContainerStarted","Data":"87dc697dd93d492bd24a06dfc87442f5003638c3593534073799df89fc5ea259"} Mar 17 00:45:08 crc kubenswrapper[4755]: I0317 00:45:08.059835 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7579bb5c6c-sb22k" event={"ID":"c98c94a5-456b-4b83-ab36-2d56dce423cf","Type":"ContainerStarted","Data":"4a9841768d1c6c113d14885c2406467446cfa67fad227c72bd7b8429c681a5a0"} Mar 17 00:45:08 crc kubenswrapper[4755]: I0317 00:45:08.059878 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7579bb5c6c-sb22k" event={"ID":"c98c94a5-456b-4b83-ab36-2d56dce423cf","Type":"ContainerStarted","Data":"a6a20c3ca1008f78e35d0fdc421742a892627f6b4b6963dcfbe9e6abbefbd8cf"} Mar 17 00:45:08 crc kubenswrapper[4755]: I0317 00:45:08.065499 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" event={"ID":"08712ad9-353f-4d69-aa69-87586a0b9ee3","Type":"ContainerStarted","Data":"d96bfba274799acedf04e4bccf5644d7e0b6f6b4ebc5286becd7d11ce82812c5"} Mar 17 00:45:08 crc kubenswrapper[4755]: I0317 00:45:08.067578 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" event={"ID":"e1c0eb85-d144-4980-a7c4-bdcd5f515053","Type":"ContainerStarted","Data":"3264ac7df8ac7f589b0f5ce9b34ff8e33cc1581c4a5ec895969c1227763c0a32"} Mar 17 00:45:08 crc kubenswrapper[4755]: I0317 00:45:08.075788 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7579bb5c6c-sb22k" podStartSLOduration=3.499674341 podStartE2EDuration="7.075771735s" podCreationTimestamp="2026-03-17 00:45:01 +0000 UTC" firstStartedPulling="2026-03-17 00:45:03.723575825 +0000 UTC m=+1378.483028108" lastFinishedPulling="2026-03-17 00:45:07.299673189 +0000 UTC m=+1382.059125502" observedRunningTime="2026-03-17 00:45:08.074244024 +0000 UTC m=+1382.833696307" 
watchObservedRunningTime="2026-03-17 00:45:08.075771735 +0000 UTC m=+1382.835224018" Mar 17 00:45:08 crc kubenswrapper[4755]: I0317 00:45:08.078121 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75cc5bb54-mqs8w" event={"ID":"e975409f-81b6-4bcd-aec0-00f942eae3bd","Type":"ContainerStarted","Data":"7f9cd56b64ae9a1b5d8597844ea3f138467badd9abfa259f5714211055fca2dd"} Mar 17 00:45:09 crc kubenswrapper[4755]: I0317 00:45:09.091599 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75cc5bb54-mqs8w" event={"ID":"e975409f-81b6-4bcd-aec0-00f942eae3bd","Type":"ContainerStarted","Data":"1a671b85ed6bf27cd0a08c637dd34b2950315075bae98e306c789b0a0e46620a"} Mar 17 00:45:09 crc kubenswrapper[4755]: I0317 00:45:09.091923 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75cc5bb54-mqs8w" event={"ID":"e975409f-81b6-4bcd-aec0-00f942eae3bd","Type":"ContainerStarted","Data":"81ace03b77dfb5825764ebe89d491b8880d7260768f68891ca0acb9f3479a801"} Mar 17 00:45:09 crc kubenswrapper[4755]: I0317 00:45:09.092158 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:09 crc kubenswrapper[4755]: I0317 00:45:09.092195 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:09 crc kubenswrapper[4755]: I0317 00:45:09.101293 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b59c459f5-btmbt" event={"ID":"32b26d97-7256-4841-819f-2a2ee7ff2e3b","Type":"ContainerStarted","Data":"99d629ee2fd29039e7295b16320b62c4e8f30c3e8baa5b27614da7a644f0cd92"} Mar 17 00:45:09 crc kubenswrapper[4755]: I0317 00:45:09.110300 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" 
event={"ID":"08712ad9-353f-4d69-aa69-87586a0b9ee3","Type":"ContainerStarted","Data":"8e4befee9bcc7b8740b49f8f8c866173e04b1fe5c8519164682971e408a02376"} Mar 17 00:45:09 crc kubenswrapper[4755]: I0317 00:45:09.116416 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-75cc5bb54-mqs8w" podStartSLOduration=3.116400314 podStartE2EDuration="3.116400314s" podCreationTimestamp="2026-03-17 00:45:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:09.114969536 +0000 UTC m=+1383.874421819" watchObservedRunningTime="2026-03-17 00:45:09.116400314 +0000 UTC m=+1383.875852597" Mar 17 00:45:09 crc kubenswrapper[4755]: I0317 00:45:09.126944 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" event={"ID":"e1c0eb85-d144-4980-a7c4-bdcd5f515053","Type":"ContainerStarted","Data":"b776fe73d3b89f4bf8c97dbf6f45d851c37770fd56afbb03d8cca66eaa9c98c7"} Mar 17 00:45:09 crc kubenswrapper[4755]: I0317 00:45:09.131297 4755 generic.go:334] "Generic (PLEG): container finished" podID="2315f493-9035-4185-b615-e7eed6a246ea" containerID="f0631b380191fd12a56a5aafc1976c097929edd1d24bc1fda940c77bb0b04afe" exitCode=0 Mar 17 00:45:09 crc kubenswrapper[4755]: I0317 00:45:09.132253 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-gdjsh" event={"ID":"2315f493-9035-4185-b615-e7eed6a246ea","Type":"ContainerDied","Data":"f0631b380191fd12a56a5aafc1976c097929edd1d24bc1fda940c77bb0b04afe"} Mar 17 00:45:09 crc kubenswrapper[4755]: I0317 00:45:09.156277 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-79df59b454-zg6ts" podStartSLOduration=3.673953519 podStartE2EDuration="7.156253641s" podCreationTimestamp="2026-03-17 00:45:02 +0000 UTC" firstStartedPulling="2026-03-17 00:45:03.865118825 +0000 UTC 
m=+1378.624571108" lastFinishedPulling="2026-03-17 00:45:07.347418947 +0000 UTC m=+1382.106871230" observedRunningTime="2026-03-17 00:45:09.144140327 +0000 UTC m=+1383.903592610" watchObservedRunningTime="2026-03-17 00:45:09.156253641 +0000 UTC m=+1383.915705924" Mar 17 00:45:09 crc kubenswrapper[4755]: I0317 00:45:09.178380 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7b59c459f5-btmbt" podStartSLOduration=4.314560866 podStartE2EDuration="7.178360323s" podCreationTimestamp="2026-03-17 00:45:02 +0000 UTC" firstStartedPulling="2026-03-17 00:45:04.478961206 +0000 UTC m=+1379.238413489" lastFinishedPulling="2026-03-17 00:45:07.342760673 +0000 UTC m=+1382.102212946" observedRunningTime="2026-03-17 00:45:09.163983539 +0000 UTC m=+1383.923435832" watchObservedRunningTime="2026-03-17 00:45:09.178360323 +0000 UTC m=+1383.937812596" Mar 17 00:45:09 crc kubenswrapper[4755]: I0317 00:45:09.197483 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7967fcdf5d-jqr46"] Mar 17 00:45:09 crc kubenswrapper[4755]: I0317 00:45:09.206620 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" podStartSLOduration=4.728891332 podStartE2EDuration="8.2066011s" podCreationTimestamp="2026-03-17 00:45:01 +0000 UTC" firstStartedPulling="2026-03-17 00:45:03.865448295 +0000 UTC m=+1378.624900578" lastFinishedPulling="2026-03-17 00:45:07.343158063 +0000 UTC m=+1382.102610346" observedRunningTime="2026-03-17 00:45:09.186203044 +0000 UTC m=+1383.945655327" watchObservedRunningTime="2026-03-17 00:45:09.2066011 +0000 UTC m=+1383.966053383" Mar 17 00:45:09 crc kubenswrapper[4755]: I0317 00:45:09.218266 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7579bb5c6c-sb22k"] Mar 17 00:45:10 crc kubenswrapper[4755]: I0317 00:45:10.147325 4755 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/barbican-worker-7579bb5c6c-sb22k" podUID="c98c94a5-456b-4b83-ab36-2d56dce423cf" containerName="barbican-worker-log" containerID="cri-o://a6a20c3ca1008f78e35d0fdc421742a892627f6b4b6963dcfbe9e6abbefbd8cf" gracePeriod=30 Mar 17 00:45:10 crc kubenswrapper[4755]: I0317 00:45:10.147420 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7579bb5c6c-sb22k" podUID="c98c94a5-456b-4b83-ab36-2d56dce423cf" containerName="barbican-worker" containerID="cri-o://4a9841768d1c6c113d14885c2406467446cfa67fad227c72bd7b8429c681a5a0" gracePeriod=30 Mar 17 00:45:11 crc kubenswrapper[4755]: I0317 00:45:11.160393 4755 generic.go:334] "Generic (PLEG): container finished" podID="404d7b5a-9c59-4c63-b3be-740554b83374" containerID="e5fe31b2f16c24183bd88dee6944e309af9190e1fbb6423f2b4e81797f8d3670" exitCode=0 Mar 17 00:45:11 crc kubenswrapper[4755]: I0317 00:45:11.160549 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-v7jsc" event={"ID":"404d7b5a-9c59-4c63-b3be-740554b83374","Type":"ContainerDied","Data":"e5fe31b2f16c24183bd88dee6944e309af9190e1fbb6423f2b4e81797f8d3670"} Mar 17 00:45:11 crc kubenswrapper[4755]: I0317 00:45:11.164612 4755 generic.go:334] "Generic (PLEG): container finished" podID="c98c94a5-456b-4b83-ab36-2d56dce423cf" containerID="4a9841768d1c6c113d14885c2406467446cfa67fad227c72bd7b8429c681a5a0" exitCode=0 Mar 17 00:45:11 crc kubenswrapper[4755]: I0317 00:45:11.164639 4755 generic.go:334] "Generic (PLEG): container finished" podID="c98c94a5-456b-4b83-ab36-2d56dce423cf" containerID="a6a20c3ca1008f78e35d0fdc421742a892627f6b4b6963dcfbe9e6abbefbd8cf" exitCode=143 Mar 17 00:45:11 crc kubenswrapper[4755]: I0317 00:45:11.164688 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7579bb5c6c-sb22k" 
event={"ID":"c98c94a5-456b-4b83-ab36-2d56dce423cf","Type":"ContainerDied","Data":"4a9841768d1c6c113d14885c2406467446cfa67fad227c72bd7b8429c681a5a0"} Mar 17 00:45:11 crc kubenswrapper[4755]: I0317 00:45:11.164739 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7579bb5c6c-sb22k" event={"ID":"c98c94a5-456b-4b83-ab36-2d56dce423cf","Type":"ContainerDied","Data":"a6a20c3ca1008f78e35d0fdc421742a892627f6b4b6963dcfbe9e6abbefbd8cf"} Mar 17 00:45:11 crc kubenswrapper[4755]: I0317 00:45:11.164808 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" podUID="e1c0eb85-d144-4980-a7c4-bdcd5f515053" containerName="barbican-keystone-listener-log" containerID="cri-o://3264ac7df8ac7f589b0f5ce9b34ff8e33cc1581c4a5ec895969c1227763c0a32" gracePeriod=30 Mar 17 00:45:11 crc kubenswrapper[4755]: I0317 00:45:11.164837 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" podUID="e1c0eb85-d144-4980-a7c4-bdcd5f515053" containerName="barbican-keystone-listener" containerID="cri-o://b776fe73d3b89f4bf8c97dbf6f45d851c37770fd56afbb03d8cca66eaa9c98c7" gracePeriod=30 Mar 17 00:45:12 crc kubenswrapper[4755]: I0317 00:45:12.175353 4755 generic.go:334] "Generic (PLEG): container finished" podID="e1c0eb85-d144-4980-a7c4-bdcd5f515053" containerID="b776fe73d3b89f4bf8c97dbf6f45d851c37770fd56afbb03d8cca66eaa9c98c7" exitCode=0 Mar 17 00:45:12 crc kubenswrapper[4755]: I0317 00:45:12.175410 4755 generic.go:334] "Generic (PLEG): container finished" podID="e1c0eb85-d144-4980-a7c4-bdcd5f515053" containerID="3264ac7df8ac7f589b0f5ce9b34ff8e33cc1581c4a5ec895969c1227763c0a32" exitCode=143 Mar 17 00:45:12 crc kubenswrapper[4755]: I0317 00:45:12.175632 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" 
event={"ID":"e1c0eb85-d144-4980-a7c4-bdcd5f515053","Type":"ContainerDied","Data":"b776fe73d3b89f4bf8c97dbf6f45d851c37770fd56afbb03d8cca66eaa9c98c7"} Mar 17 00:45:12 crc kubenswrapper[4755]: I0317 00:45:12.175665 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" event={"ID":"e1c0eb85-d144-4980-a7c4-bdcd5f515053","Type":"ContainerDied","Data":"3264ac7df8ac7f589b0f5ce9b34ff8e33cc1581c4a5ec895969c1227763c0a32"} Mar 17 00:45:12 crc kubenswrapper[4755]: I0317 00:45:12.810624 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:12 crc kubenswrapper[4755]: I0317 00:45:12.870505 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-ddx2p"] Mar 17 00:45:12 crc kubenswrapper[4755]: I0317 00:45:12.870723 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" podUID="ba263b61-80d7-4745-9bc1-e190a16d59b3" containerName="dnsmasq-dns" containerID="cri-o://54b5718f55fedebeb1eaeef4e559b4c027b418d4a774f246d88051896ccdb3ef" gracePeriod=10 Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.215300 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-gdjsh" event={"ID":"2315f493-9035-4185-b615-e7eed6a246ea","Type":"ContainerDied","Data":"06db9a6034f41e78d6a1d01023adc6401c1f6369d9b94be59a5e4b99ab9882b5"} Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.215666 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06db9a6034f41e78d6a1d01023adc6401c1f6369d9b94be59a5e4b99ab9882b5" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.256081 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-v7jsc" 
event={"ID":"404d7b5a-9c59-4c63-b3be-740554b83374","Type":"ContainerDied","Data":"268ee6dc89dc8b8cbaec776063d10eb60476b4cbc658a2b8e2c792c9facaa274"} Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.256124 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="268ee6dc89dc8b8cbaec776063d10eb60476b4cbc658a2b8e2c792c9facaa274" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.260837 4755 generic.go:334] "Generic (PLEG): container finished" podID="ba263b61-80d7-4745-9bc1-e190a16d59b3" containerID="54b5718f55fedebeb1eaeef4e559b4c027b418d4a774f246d88051896ccdb3ef" exitCode=0 Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.260869 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" event={"ID":"ba263b61-80d7-4745-9bc1-e190a16d59b3","Type":"ContainerDied","Data":"54b5718f55fedebeb1eaeef4e559b4c027b418d4a774f246d88051896ccdb3ef"} Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.288364 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-gdjsh" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.305282 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.452097 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2315f493-9035-4185-b615-e7eed6a246ea-combined-ca-bundle\") pod \"2315f493-9035-4185-b615-e7eed6a246ea\" (UID: \"2315f493-9035-4185-b615-e7eed6a246ea\") " Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.452165 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqx68\" (UniqueName: \"kubernetes.io/projected/404d7b5a-9c59-4c63-b3be-740554b83374-kube-api-access-vqx68\") pod \"404d7b5a-9c59-4c63-b3be-740554b83374\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.452252 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-scripts\") pod \"404d7b5a-9c59-4c63-b3be-740554b83374\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.452390 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-db-sync-config-data\") pod \"404d7b5a-9c59-4c63-b3be-740554b83374\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.452468 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-config-data\") pod \"404d7b5a-9c59-4c63-b3be-740554b83374\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.452536 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-w5t76\" (UniqueName: \"kubernetes.io/projected/2315f493-9035-4185-b615-e7eed6a246ea-kube-api-access-w5t76\") pod \"2315f493-9035-4185-b615-e7eed6a246ea\" (UID: \"2315f493-9035-4185-b615-e7eed6a246ea\") " Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.452556 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/404d7b5a-9c59-4c63-b3be-740554b83374-etc-machine-id\") pod \"404d7b5a-9c59-4c63-b3be-740554b83374\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.452618 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2315f493-9035-4185-b615-e7eed6a246ea-config-data\") pod \"2315f493-9035-4185-b615-e7eed6a246ea\" (UID: \"2315f493-9035-4185-b615-e7eed6a246ea\") " Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.452672 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-combined-ca-bundle\") pod \"404d7b5a-9c59-4c63-b3be-740554b83374\" (UID: \"404d7b5a-9c59-4c63-b3be-740554b83374\") " Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.453571 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/404d7b5a-9c59-4c63-b3be-740554b83374-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "404d7b5a-9c59-4c63-b3be-740554b83374" (UID: "404d7b5a-9c59-4c63-b3be-740554b83374"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.458601 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "404d7b5a-9c59-4c63-b3be-740554b83374" (UID: "404d7b5a-9c59-4c63-b3be-740554b83374"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.458700 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2315f493-9035-4185-b615-e7eed6a246ea-kube-api-access-w5t76" (OuterVolumeSpecName: "kube-api-access-w5t76") pod "2315f493-9035-4185-b615-e7eed6a246ea" (UID: "2315f493-9035-4185-b615-e7eed6a246ea"). InnerVolumeSpecName "kube-api-access-w5t76". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.460456 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-scripts" (OuterVolumeSpecName: "scripts") pod "404d7b5a-9c59-4c63-b3be-740554b83374" (UID: "404d7b5a-9c59-4c63-b3be-740554b83374"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.460864 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/404d7b5a-9c59-4c63-b3be-740554b83374-kube-api-access-vqx68" (OuterVolumeSpecName: "kube-api-access-vqx68") pod "404d7b5a-9c59-4c63-b3be-740554b83374" (UID: "404d7b5a-9c59-4c63-b3be-740554b83374"). InnerVolumeSpecName "kube-api-access-vqx68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.524003 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2315f493-9035-4185-b615-e7eed6a246ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2315f493-9035-4185-b615-e7eed6a246ea" (UID: "2315f493-9035-4185-b615-e7eed6a246ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.530750 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "404d7b5a-9c59-4c63-b3be-740554b83374" (UID: "404d7b5a-9c59-4c63-b3be-740554b83374"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.556836 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5t76\" (UniqueName: \"kubernetes.io/projected/2315f493-9035-4185-b615-e7eed6a246ea-kube-api-access-w5t76\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.557071 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/404d7b5a-9c59-4c63-b3be-740554b83374-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.557131 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.557189 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2315f493-9035-4185-b615-e7eed6a246ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.557245 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqx68\" (UniqueName: \"kubernetes.io/projected/404d7b5a-9c59-4c63-b3be-740554b83374-kube-api-access-vqx68\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.557299 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.557354 4755 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.572543 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-config-data" (OuterVolumeSpecName: "config-data") pod "404d7b5a-9c59-4c63-b3be-740554b83374" (UID: "404d7b5a-9c59-4c63-b3be-740554b83374"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.647683 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2315f493-9035-4185-b615-e7eed6a246ea-config-data" (OuterVolumeSpecName: "config-data") pod "2315f493-9035-4185-b615-e7eed6a246ea" (UID: "2315f493-9035-4185-b615-e7eed6a246ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.659025 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2315f493-9035-4185-b615-e7eed6a246ea-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:13 crc kubenswrapper[4755]: I0317 00:45:13.659059 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/404d7b5a-9c59-4c63-b3be-740554b83374-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.307084 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7579bb5c6c-sb22k" event={"ID":"c98c94a5-456b-4b83-ab36-2d56dce423cf","Type":"ContainerDied","Data":"774008ef0b82fdf129c6f3da8a66853d50e286c9432efd67896032413e6caab0"} Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.307374 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="774008ef0b82fdf129c6f3da8a66853d50e286c9432efd67896032413e6caab0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.310987 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-gdjsh" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.321018 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-v7jsc" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.321286 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" event={"ID":"e1c0eb85-d144-4980-a7c4-bdcd5f515053","Type":"ContainerDied","Data":"1248d1196fb887844c7b40f4d75617bf14ce73074192fcfd661a001007427461"} Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.321333 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1248d1196fb887844c7b40f4d75617bf14ce73074192fcfd661a001007427461" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.362900 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.463968 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.473875 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c98c94a5-456b-4b83-ab36-2d56dce423cf-logs\") pod \"c98c94a5-456b-4b83-ab36-2d56dce423cf\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.473966 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-config-data\") pod \"c98c94a5-456b-4b83-ab36-2d56dce423cf\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.474054 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-config-data-custom\") pod \"c98c94a5-456b-4b83-ab36-2d56dce423cf\" (UID: 
\"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.474084 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twtgw\" (UniqueName: \"kubernetes.io/projected/c98c94a5-456b-4b83-ab36-2d56dce423cf-kube-api-access-twtgw\") pod \"c98c94a5-456b-4b83-ab36-2d56dce423cf\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.474229 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-combined-ca-bundle\") pod \"c98c94a5-456b-4b83-ab36-2d56dce423cf\" (UID: \"c98c94a5-456b-4b83-ab36-2d56dce423cf\") " Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.474991 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c98c94a5-456b-4b83-ab36-2d56dce423cf-logs" (OuterVolumeSpecName: "logs") pod "c98c94a5-456b-4b83-ab36-2d56dce423cf" (UID: "c98c94a5-456b-4b83-ab36-2d56dce423cf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.494601 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c98c94a5-456b-4b83-ab36-2d56dce423cf" (UID: "c98c94a5-456b-4b83-ab36-2d56dce423cf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.509119 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98c94a5-456b-4b83-ab36-2d56dce423cf-kube-api-access-twtgw" (OuterVolumeSpecName: "kube-api-access-twtgw") pod "c98c94a5-456b-4b83-ab36-2d56dce423cf" (UID: "c98c94a5-456b-4b83-ab36-2d56dce423cf"). 
InnerVolumeSpecName "kube-api-access-twtgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.560253 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c98c94a5-456b-4b83-ab36-2d56dce423cf" (UID: "c98c94a5-456b-4b83-ab36-2d56dce423cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.581426 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-combined-ca-bundle\") pod \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.581548 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-config-data-custom\") pod \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.581758 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1c0eb85-d144-4980-a7c4-bdcd5f515053-logs\") pod \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.581787 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-config-data\") pod \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 
00:45:14.581833 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tt6v\" (UniqueName: \"kubernetes.io/projected/e1c0eb85-d144-4980-a7c4-bdcd5f515053-kube-api-access-7tt6v\") pod \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\" (UID: \"e1c0eb85-d144-4980-a7c4-bdcd5f515053\") " Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.582239 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.582258 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twtgw\" (UniqueName: \"kubernetes.io/projected/c98c94a5-456b-4b83-ab36-2d56dce423cf-kube-api-access-twtgw\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.582271 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.582280 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c98c94a5-456b-4b83-ab36-2d56dce423cf-logs\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.602394 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 00:45:14 crc kubenswrapper[4755]: E0317 00:45:14.606102 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98c94a5-456b-4b83-ab36-2d56dce423cf" containerName="barbican-worker-log" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.606137 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98c94a5-456b-4b83-ab36-2d56dce423cf" containerName="barbican-worker-log" Mar 17 00:45:14 crc kubenswrapper[4755]: E0317 
00:45:14.606166 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98c94a5-456b-4b83-ab36-2d56dce423cf" containerName="barbican-worker" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.606174 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98c94a5-456b-4b83-ab36-2d56dce423cf" containerName="barbican-worker" Mar 17 00:45:14 crc kubenswrapper[4755]: E0317 00:45:14.606200 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c0eb85-d144-4980-a7c4-bdcd5f515053" containerName="barbican-keystone-listener" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.606206 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c0eb85-d144-4980-a7c4-bdcd5f515053" containerName="barbican-keystone-listener" Mar 17 00:45:14 crc kubenswrapper[4755]: E0317 00:45:14.606224 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="404d7b5a-9c59-4c63-b3be-740554b83374" containerName="cinder-db-sync" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.606230 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="404d7b5a-9c59-4c63-b3be-740554b83374" containerName="cinder-db-sync" Mar 17 00:45:14 crc kubenswrapper[4755]: E0317 00:45:14.606237 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c0eb85-d144-4980-a7c4-bdcd5f515053" containerName="barbican-keystone-listener-log" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.606243 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c0eb85-d144-4980-a7c4-bdcd5f515053" containerName="barbican-keystone-listener-log" Mar 17 00:45:14 crc kubenswrapper[4755]: E0317 00:45:14.606255 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2315f493-9035-4185-b615-e7eed6a246ea" containerName="heat-db-sync" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.606260 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2315f493-9035-4185-b615-e7eed6a246ea" containerName="heat-db-sync" Mar 17 00:45:14 
crc kubenswrapper[4755]: I0317 00:45:14.606546 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98c94a5-456b-4b83-ab36-2d56dce423cf" containerName="barbican-worker-log" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.606561 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="404d7b5a-9c59-4c63-b3be-740554b83374" containerName="cinder-db-sync" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.606576 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2315f493-9035-4185-b615-e7eed6a246ea" containerName="heat-db-sync" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.606584 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c0eb85-d144-4980-a7c4-bdcd5f515053" containerName="barbican-keystone-listener-log" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.606593 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c0eb85-d144-4980-a7c4-bdcd5f515053" containerName="barbican-keystone-listener" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.606601 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98c94a5-456b-4b83-ab36-2d56dce423cf" containerName="barbican-worker" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.607644 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.611055 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1c0eb85-d144-4980-a7c4-bdcd5f515053-logs" (OuterVolumeSpecName: "logs") pod "e1c0eb85-d144-4980-a7c4-bdcd5f515053" (UID: "e1c0eb85-d144-4980-a7c4-bdcd5f515053"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.616666 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1c0eb85-d144-4980-a7c4-bdcd5f515053-kube-api-access-7tt6v" (OuterVolumeSpecName: "kube-api-access-7tt6v") pod "e1c0eb85-d144-4980-a7c4-bdcd5f515053" (UID: "e1c0eb85-d144-4980-a7c4-bdcd5f515053"). InnerVolumeSpecName "kube-api-access-7tt6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.617299 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e1c0eb85-d144-4980-a7c4-bdcd5f515053" (UID: "e1c0eb85-d144-4980-a7c4-bdcd5f515053"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.624594 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-config-data" (OuterVolumeSpecName: "config-data") pod "c98c94a5-456b-4b83-ab36-2d56dce423cf" (UID: "c98c94a5-456b-4b83-ab36-2d56dce423cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.630947 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nlshq" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.631126 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.631240 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.631373 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.631400 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.634254 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.689657 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1c0eb85-d144-4980-a7c4-bdcd5f515053" (UID: "e1c0eb85-d144-4980-a7c4-bdcd5f515053"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.690062 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tt6v\" (UniqueName: \"kubernetes.io/projected/e1c0eb85-d144-4980-a7c4-bdcd5f515053-kube-api-access-7tt6v\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.690085 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.690094 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.690103 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98c94a5-456b-4b83-ab36-2d56dce423cf-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.690115 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1c0eb85-d144-4980-a7c4-bdcd5f515053-logs\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.697735 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-78sk9"] Mar 17 00:45:14 crc kubenswrapper[4755]: E0317 00:45:14.698191 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba263b61-80d7-4745-9bc1-e190a16d59b3" containerName="init" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.698202 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba263b61-80d7-4745-9bc1-e190a16d59b3" containerName="init" Mar 17 00:45:14 crc kubenswrapper[4755]: E0317 00:45:14.698225 4755 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba263b61-80d7-4745-9bc1-e190a16d59b3" containerName="dnsmasq-dns" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.698232 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba263b61-80d7-4745-9bc1-e190a16d59b3" containerName="dnsmasq-dns" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.698408 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba263b61-80d7-4745-9bc1-e190a16d59b3" containerName="dnsmasq-dns" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.699540 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.771947 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-config-data" (OuterVolumeSpecName: "config-data") pod "e1c0eb85-d144-4980-a7c4-bdcd5f515053" (UID: "e1c0eb85-d144-4980-a7c4-bdcd5f515053"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.792083 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-dns-svc\") pod \"ba263b61-80d7-4745-9bc1-e190a16d59b3\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.792173 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-ovsdbserver-nb\") pod \"ba263b61-80d7-4745-9bc1-e190a16d59b3\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.792225 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-ovsdbserver-sb\") pod \"ba263b61-80d7-4745-9bc1-e190a16d59b3\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.792307 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-dns-swift-storage-0\") pod \"ba263b61-80d7-4745-9bc1-e190a16d59b3\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.792353 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-config\") pod \"ba263b61-80d7-4745-9bc1-e190a16d59b3\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.792481 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlfnh\" (UniqueName: 
\"kubernetes.io/projected/ba263b61-80d7-4745-9bc1-e190a16d59b3-kube-api-access-nlfnh\") pod \"ba263b61-80d7-4745-9bc1-e190a16d59b3\" (UID: \"ba263b61-80d7-4745-9bc1-e190a16d59b3\") " Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.793084 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-scripts\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.793163 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.793196 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c9qf\" (UniqueName: \"kubernetes.io/projected/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-kube-api-access-2c9qf\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.793226 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-config-data\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.793285 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.793312 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.793398 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c0eb85-d144-4980-a7c4-bdcd5f515053-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.813833 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-78sk9"] Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.821643 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba263b61-80d7-4745-9bc1-e190a16d59b3-kube-api-access-nlfnh" (OuterVolumeSpecName: "kube-api-access-nlfnh") pod "ba263b61-80d7-4745-9bc1-e190a16d59b3" (UID: "ba263b61-80d7-4745-9bc1-e190a16d59b3"). InnerVolumeSpecName "kube-api-access-nlfnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:14 crc kubenswrapper[4755]: E0317 00:45:14.825114 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="756a984f-fd52-4215-b64e-ecd7c9f2851e" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.861199 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba263b61-80d7-4745-9bc1-e190a16d59b3" (UID: "ba263b61-80d7-4745-9bc1-e190a16d59b3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.869647 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ba263b61-80d7-4745-9bc1-e190a16d59b3" (UID: "ba263b61-80d7-4745-9bc1-e190a16d59b3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.895029 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-scripts\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.895135 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtr6d\" (UniqueName: \"kubernetes.io/projected/f270f16c-6af5-41f6-872d-d46aebe04b6e-kube-api-access-dtr6d\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.895183 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.895211 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.895230 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c9qf\" (UniqueName: \"kubernetes.io/projected/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-kube-api-access-2c9qf\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" 
Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.895254 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-config-data\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.895270 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-config\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.895287 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.895326 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.895344 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-dns-svc\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:14 crc 
kubenswrapper[4755]: I0317 00:45:14.895366 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.895386 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.895494 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlfnh\" (UniqueName: \"kubernetes.io/projected/ba263b61-80d7-4745-9bc1-e190a16d59b3-kube-api-access-nlfnh\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.895504 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.895513 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.896164 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.899001 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-config-data\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.900083 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-scripts\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.902755 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.903612 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.917636 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ba263b61-80d7-4745-9bc1-e190a16d59b3" (UID: "ba263b61-80d7-4745-9bc1-e190a16d59b3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.917661 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c9qf\" (UniqueName: \"kubernetes.io/projected/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-kube-api-access-2c9qf\") pod \"cinder-scheduler-0\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.921737 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.923916 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ba263b61-80d7-4745-9bc1-e190a16d59b3" (UID: "ba263b61-80d7-4745-9bc1-e190a16d59b3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.927949 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.930697 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.931781 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.957216 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.967334 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-config" (OuterVolumeSpecName: "config") pod "ba263b61-80d7-4745-9bc1-e190a16d59b3" (UID: "ba263b61-80d7-4745-9bc1-e190a16d59b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.997252 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-config\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.997515 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.998854 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.998961 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-dns-svc\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " 
pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.999124 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtr6d\" (UniqueName: \"kubernetes.io/projected/f270f16c-6af5-41f6-872d-d46aebe04b6e-kube-api-access-dtr6d\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.999242 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.999344 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:14 crc kubenswrapper[4755]: I0317 00:45:14.999401 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.000074 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba263b61-80d7-4745-9bc1-e190a16d59b3-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:14.998638 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-config\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " 
pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.000036 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-dns-svc\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:14.998793 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.000203 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.000404 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.014331 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtr6d\" (UniqueName: \"kubernetes.io/projected/f270f16c-6af5-41f6-872d-d46aebe04b6e-kube-api-access-dtr6d\") pod \"dnsmasq-dns-5784cf869f-78sk9\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:15 crc 
kubenswrapper[4755]: I0317 00:45:15.068910 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.076532 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.104730 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-scripts\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.104778 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9f638d5-e4d3-4678-b597-cc6a84235d22-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.104828 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9f638d5-e4d3-4678-b597-cc6a84235d22-logs\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.104866 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b897b\" (UniqueName: \"kubernetes.io/projected/f9f638d5-e4d3-4678-b597-cc6a84235d22-kube-api-access-b897b\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.104911 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.104965 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-config-data-custom\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.104995 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-config-data\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.209556 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-scripts\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.209829 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9f638d5-e4d3-4678-b597-cc6a84235d22-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.209877 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9f638d5-e4d3-4678-b597-cc6a84235d22-logs\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" 
Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.209901 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b897b\" (UniqueName: \"kubernetes.io/projected/f9f638d5-e4d3-4678-b597-cc6a84235d22-kube-api-access-b897b\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.209943 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.210001 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-config-data-custom\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.210036 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-config-data\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.212088 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9f638d5-e4d3-4678-b597-cc6a84235d22-logs\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.216587 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/f9f638d5-e4d3-4678-b597-cc6a84235d22-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.222053 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.227054 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-config-data\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.237024 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b897b\" (UniqueName: \"kubernetes.io/projected/f9f638d5-e4d3-4678-b597-cc6a84235d22-kube-api-access-b897b\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.244884 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.245121 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-scripts\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.245215 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-config-data-custom\") pod \"cinder-api-0\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.260814 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.353594 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="756a984f-fd52-4215-b64e-ecd7c9f2851e" containerName="ceilometer-notification-agent" containerID="cri-o://b1a359514561a04c523bd8235649c07830a6fbfb9f33145acd2bfbc107475ac3" gracePeriod=30 Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.354201 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"756a984f-fd52-4215-b64e-ecd7c9f2851e","Type":"ContainerStarted","Data":"8fd5fc52ff4e24005c2e3c91a8640d2deddff1d00bec3e5e6635c5cc48cc11eb"} Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.354244 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.354356 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="756a984f-fd52-4215-b64e-ecd7c9f2851e" containerName="proxy-httpd" containerID="cri-o://8fd5fc52ff4e24005c2e3c91a8640d2deddff1d00bec3e5e6635c5cc48cc11eb" gracePeriod=30 Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.354527 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="756a984f-fd52-4215-b64e-ecd7c9f2851e" containerName="sg-core" containerID="cri-o://3a0a7c6aacdf09177864ed3651a016dfe98266ce57c6f5071a272a3c087592ba" gracePeriod=30 Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.382844 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7579bb5c6c-sb22k" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.383810 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" event={"ID":"ba263b61-80d7-4745-9bc1-e190a16d59b3","Type":"ContainerDied","Data":"85cbd4bcb247893eb0af3732d16187e313a00fe300aa6bc054ea07d48613eae8"} Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.383896 4755 scope.go:117] "RemoveContainer" containerID="54b5718f55fedebeb1eaeef4e559b4c027b418d4a774f246d88051896ccdb3ef" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.384109 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-ddx2p" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.385381 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7967fcdf5d-jqr46" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.477916 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-ddx2p"] Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.488487 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-ddx2p"] Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.496073 4755 scope.go:117] "RemoveContainer" containerID="5794438b53ba7b71d207bd1f05fe59c48d176d53292bf71cd10cfaf5854f70a7" Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.497508 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7967fcdf5d-jqr46"] Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.510288 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-7967fcdf5d-jqr46"] Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.521539 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7579bb5c6c-sb22k"] Mar 17 
00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.531174 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7579bb5c6c-sb22k"] Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.763826 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 00:45:15 crc kubenswrapper[4755]: W0317 00:45:15.769940 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod759de4a2_5c31_4f46_9fdd_21d8ed1a4a9d.slice/crio-1d9ba7363234a556f5adfeed68e979b92e35a8f3572199c82f6cdb135de2f580 WatchSource:0}: Error finding container 1d9ba7363234a556f5adfeed68e979b92e35a8f3572199c82f6cdb135de2f580: Status 404 returned error can't find the container with id 1d9ba7363234a556f5adfeed68e979b92e35a8f3572199c82f6cdb135de2f580 Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.795885 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-78sk9"] Mar 17 00:45:15 crc kubenswrapper[4755]: W0317 00:45:15.799789 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf270f16c_6af5_41f6_872d_d46aebe04b6e.slice/crio-918015ebbf5f8456448a603fa5f3bce63d12dd27195187983490b17882bffc93 WatchSource:0}: Error finding container 918015ebbf5f8456448a603fa5f3bce63d12dd27195187983490b17882bffc93: Status 404 returned error can't find the container with id 918015ebbf5f8456448a603fa5f3bce63d12dd27195187983490b17882bffc93 Mar 17 00:45:15 crc kubenswrapper[4755]: I0317 00:45:15.932644 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 17 00:45:16 crc kubenswrapper[4755]: I0317 00:45:16.262667 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba263b61-80d7-4745-9bc1-e190a16d59b3" path="/var/lib/kubelet/pods/ba263b61-80d7-4745-9bc1-e190a16d59b3/volumes" Mar 17 00:45:16 crc kubenswrapper[4755]: 
I0317 00:45:16.263400 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c98c94a5-456b-4b83-ab36-2d56dce423cf" path="/var/lib/kubelet/pods/c98c94a5-456b-4b83-ab36-2d56dce423cf/volumes" Mar 17 00:45:16 crc kubenswrapper[4755]: I0317 00:45:16.263997 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1c0eb85-d144-4980-a7c4-bdcd5f515053" path="/var/lib/kubelet/pods/e1c0eb85-d144-4980-a7c4-bdcd5f515053/volumes" Mar 17 00:45:16 crc kubenswrapper[4755]: I0317 00:45:16.420241 4755 generic.go:334] "Generic (PLEG): container finished" podID="756a984f-fd52-4215-b64e-ecd7c9f2851e" containerID="8fd5fc52ff4e24005c2e3c91a8640d2deddff1d00bec3e5e6635c5cc48cc11eb" exitCode=0 Mar 17 00:45:16 crc kubenswrapper[4755]: I0317 00:45:16.420586 4755 generic.go:334] "Generic (PLEG): container finished" podID="756a984f-fd52-4215-b64e-ecd7c9f2851e" containerID="3a0a7c6aacdf09177864ed3651a016dfe98266ce57c6f5071a272a3c087592ba" exitCode=2 Mar 17 00:45:16 crc kubenswrapper[4755]: I0317 00:45:16.420298 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"756a984f-fd52-4215-b64e-ecd7c9f2851e","Type":"ContainerDied","Data":"8fd5fc52ff4e24005c2e3c91a8640d2deddff1d00bec3e5e6635c5cc48cc11eb"} Mar 17 00:45:16 crc kubenswrapper[4755]: I0317 00:45:16.420661 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"756a984f-fd52-4215-b64e-ecd7c9f2851e","Type":"ContainerDied","Data":"3a0a7c6aacdf09177864ed3651a016dfe98266ce57c6f5071a272a3c087592ba"} Mar 17 00:45:16 crc kubenswrapper[4755]: I0317 00:45:16.431274 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f9f638d5-e4d3-4678-b597-cc6a84235d22","Type":"ContainerStarted","Data":"227a22fefd34bb5c38393147ae0973d567655a6a60917735363dc9095a087a51"} Mar 17 00:45:16 crc kubenswrapper[4755]: I0317 00:45:16.444403 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5784cf869f-78sk9" event={"ID":"f270f16c-6af5-41f6-872d-d46aebe04b6e","Type":"ContainerStarted","Data":"8c3d05915cb3297973724d78022ac4a3f5a664a2aa08273ed0be14ccdf79d710"} Mar 17 00:45:16 crc kubenswrapper[4755]: I0317 00:45:16.444488 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-78sk9" event={"ID":"f270f16c-6af5-41f6-872d-d46aebe04b6e","Type":"ContainerStarted","Data":"918015ebbf5f8456448a603fa5f3bce63d12dd27195187983490b17882bffc93"} Mar 17 00:45:16 crc kubenswrapper[4755]: I0317 00:45:16.456824 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d","Type":"ContainerStarted","Data":"1d9ba7363234a556f5adfeed68e979b92e35a8f3572199c82f6cdb135de2f580"} Mar 17 00:45:17 crc kubenswrapper[4755]: I0317 00:45:17.044959 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 17 00:45:17 crc kubenswrapper[4755]: I0317 00:45:17.470431 4755 generic.go:334] "Generic (PLEG): container finished" podID="f270f16c-6af5-41f6-872d-d46aebe04b6e" containerID="8c3d05915cb3297973724d78022ac4a3f5a664a2aa08273ed0be14ccdf79d710" exitCode=0 Mar 17 00:45:17 crc kubenswrapper[4755]: I0317 00:45:17.470468 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-78sk9" event={"ID":"f270f16c-6af5-41f6-872d-d46aebe04b6e","Type":"ContainerDied","Data":"8c3d05915cb3297973724d78022ac4a3f5a664a2aa08273ed0be14ccdf79d710"} Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.036532 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.111026 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-combined-ca-bundle\") pod \"756a984f-fd52-4215-b64e-ecd7c9f2851e\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.111088 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/756a984f-fd52-4215-b64e-ecd7c9f2851e-run-httpd\") pod \"756a984f-fd52-4215-b64e-ecd7c9f2851e\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.111197 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr9wr\" (UniqueName: \"kubernetes.io/projected/756a984f-fd52-4215-b64e-ecd7c9f2851e-kube-api-access-hr9wr\") pod \"756a984f-fd52-4215-b64e-ecd7c9f2851e\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.111216 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-config-data\") pod \"756a984f-fd52-4215-b64e-ecd7c9f2851e\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.111266 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-scripts\") pod \"756a984f-fd52-4215-b64e-ecd7c9f2851e\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.111315 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/756a984f-fd52-4215-b64e-ecd7c9f2851e-log-httpd\") pod \"756a984f-fd52-4215-b64e-ecd7c9f2851e\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.111363 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-sg-core-conf-yaml\") pod \"756a984f-fd52-4215-b64e-ecd7c9f2851e\" (UID: \"756a984f-fd52-4215-b64e-ecd7c9f2851e\") " Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.119006 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/756a984f-fd52-4215-b64e-ecd7c9f2851e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "756a984f-fd52-4215-b64e-ecd7c9f2851e" (UID: "756a984f-fd52-4215-b64e-ecd7c9f2851e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.119266 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/756a984f-fd52-4215-b64e-ecd7c9f2851e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "756a984f-fd52-4215-b64e-ecd7c9f2851e" (UID: "756a984f-fd52-4215-b64e-ecd7c9f2851e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.119978 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-scripts" (OuterVolumeSpecName: "scripts") pod "756a984f-fd52-4215-b64e-ecd7c9f2851e" (UID: "756a984f-fd52-4215-b64e-ecd7c9f2851e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.140832 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/756a984f-fd52-4215-b64e-ecd7c9f2851e-kube-api-access-hr9wr" (OuterVolumeSpecName: "kube-api-access-hr9wr") pod "756a984f-fd52-4215-b64e-ecd7c9f2851e" (UID: "756a984f-fd52-4215-b64e-ecd7c9f2851e"). InnerVolumeSpecName "kube-api-access-hr9wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.179496 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "756a984f-fd52-4215-b64e-ecd7c9f2851e" (UID: "756a984f-fd52-4215-b64e-ecd7c9f2851e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.191574 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "756a984f-fd52-4215-b64e-ecd7c9f2851e" (UID: "756a984f-fd52-4215-b64e-ecd7c9f2851e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.213453 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/756a984f-fd52-4215-b64e-ecd7c9f2851e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.213486 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.213498 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.213506 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/756a984f-fd52-4215-b64e-ecd7c9f2851e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.213516 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr9wr\" (UniqueName: \"kubernetes.io/projected/756a984f-fd52-4215-b64e-ecd7c9f2851e-kube-api-access-hr9wr\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.213525 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.215186 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.235988 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-config-data" (OuterVolumeSpecName: "config-data") pod "756a984f-fd52-4215-b64e-ecd7c9f2851e" (UID: "756a984f-fd52-4215-b64e-ecd7c9f2851e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.316639 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/756a984f-fd52-4215-b64e-ecd7c9f2851e-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.483926 4755 generic.go:334] "Generic (PLEG): container finished" podID="756a984f-fd52-4215-b64e-ecd7c9f2851e" containerID="b1a359514561a04c523bd8235649c07830a6fbfb9f33145acd2bfbc107475ac3" exitCode=0 Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.484209 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"756a984f-fd52-4215-b64e-ecd7c9f2851e","Type":"ContainerDied","Data":"b1a359514561a04c523bd8235649c07830a6fbfb9f33145acd2bfbc107475ac3"} Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.484237 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"756a984f-fd52-4215-b64e-ecd7c9f2851e","Type":"ContainerDied","Data":"841268a98b5c8a03ce02bf4f62f9ed61f1d9d6b616eb7360d12d76e9017b0a91"} Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.484252 4755 scope.go:117] "RemoveContainer" containerID="8fd5fc52ff4e24005c2e3c91a8640d2deddff1d00bec3e5e6635c5cc48cc11eb" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.484374 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.486951 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f9f638d5-e4d3-4678-b597-cc6a84235d22","Type":"ContainerStarted","Data":"89089a92aee5b16244b5d778ae3b08a05884cda0f40fa0d8a6cb63aba65f19bf"} Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.486980 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f9f638d5-e4d3-4678-b597-cc6a84235d22","Type":"ContainerStarted","Data":"4d37426d1a8768fd4036e257238c7d76cf4461d24be494ae9d42a129aeef3e5a"} Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.487082 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f9f638d5-e4d3-4678-b597-cc6a84235d22" containerName="cinder-api-log" containerID="cri-o://4d37426d1a8768fd4036e257238c7d76cf4461d24be494ae9d42a129aeef3e5a" gracePeriod=30 Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.487151 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.487183 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f9f638d5-e4d3-4678-b597-cc6a84235d22" containerName="cinder-api" containerID="cri-o://89089a92aee5b16244b5d778ae3b08a05884cda0f40fa0d8a6cb63aba65f19bf" gracePeriod=30 Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.490139 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-78sk9" event={"ID":"f270f16c-6af5-41f6-872d-d46aebe04b6e","Type":"ContainerStarted","Data":"74f0fc9e6a3128782b59a3edd43d8e081bcc3e5e9b9505e3efc8d69d114df5a5"} Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.490882 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 
00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.494599 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d","Type":"ContainerStarted","Data":"fbf06f7dd428ca825fc188b7dd6aca2f4cbf042316cdd9c0fab6379f091c93f2"} Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.494641 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d","Type":"ContainerStarted","Data":"61aa9ccdf61e46cb0ffac78854f55c8625488d15a6954bd44f55dc4d5e36b178"} Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.520089 4755 scope.go:117] "RemoveContainer" containerID="3a0a7c6aacdf09177864ed3651a016dfe98266ce57c6f5071a272a3c087592ba" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.529493 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.580714694 podStartE2EDuration="4.529474713s" podCreationTimestamp="2026-03-17 00:45:14 +0000 UTC" firstStartedPulling="2026-03-17 00:45:15.773837222 +0000 UTC m=+1390.533289495" lastFinishedPulling="2026-03-17 00:45:16.722597231 +0000 UTC m=+1391.482049514" observedRunningTime="2026-03-17 00:45:18.528090195 +0000 UTC m=+1393.287542498" watchObservedRunningTime="2026-03-17 00:45:18.529474713 +0000 UTC m=+1393.288927006" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.550143 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.550111095 podStartE2EDuration="4.550111095s" podCreationTimestamp="2026-03-17 00:45:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:18.514239774 +0000 UTC m=+1393.273692047" watchObservedRunningTime="2026-03-17 00:45:18.550111095 +0000 UTC m=+1393.309563378" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 
00:45:18.554052 4755 scope.go:117] "RemoveContainer" containerID="b1a359514561a04c523bd8235649c07830a6fbfb9f33145acd2bfbc107475ac3" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.564141 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-78sk9" podStartSLOduration=4.564119341 podStartE2EDuration="4.564119341s" podCreationTimestamp="2026-03-17 00:45:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:18.554753769 +0000 UTC m=+1393.314206052" watchObservedRunningTime="2026-03-17 00:45:18.564119341 +0000 UTC m=+1393.323571624" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.631148 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.641165 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.649005 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:45:18 crc kubenswrapper[4755]: E0317 00:45:18.649560 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756a984f-fd52-4215-b64e-ecd7c9f2851e" containerName="sg-core" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.649577 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="756a984f-fd52-4215-b64e-ecd7c9f2851e" containerName="sg-core" Mar 17 00:45:18 crc kubenswrapper[4755]: E0317 00:45:18.649590 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756a984f-fd52-4215-b64e-ecd7c9f2851e" containerName="ceilometer-notification-agent" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.649596 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="756a984f-fd52-4215-b64e-ecd7c9f2851e" containerName="ceilometer-notification-agent" Mar 17 00:45:18 crc kubenswrapper[4755]: E0317 
00:45:18.649606 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756a984f-fd52-4215-b64e-ecd7c9f2851e" containerName="proxy-httpd" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.649612 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="756a984f-fd52-4215-b64e-ecd7c9f2851e" containerName="proxy-httpd" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.649809 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="756a984f-fd52-4215-b64e-ecd7c9f2851e" containerName="sg-core" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.649831 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="756a984f-fd52-4215-b64e-ecd7c9f2851e" containerName="proxy-httpd" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.649848 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="756a984f-fd52-4215-b64e-ecd7c9f2851e" containerName="ceilometer-notification-agent" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.651626 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.654932 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.655114 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.671070 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.677607 4755 scope.go:117] "RemoveContainer" containerID="8fd5fc52ff4e24005c2e3c91a8640d2deddff1d00bec3e5e6635c5cc48cc11eb" Mar 17 00:45:18 crc kubenswrapper[4755]: E0317 00:45:18.696557 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fd5fc52ff4e24005c2e3c91a8640d2deddff1d00bec3e5e6635c5cc48cc11eb\": container with ID starting with 8fd5fc52ff4e24005c2e3c91a8640d2deddff1d00bec3e5e6635c5cc48cc11eb not found: ID does not exist" containerID="8fd5fc52ff4e24005c2e3c91a8640d2deddff1d00bec3e5e6635c5cc48cc11eb" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.696606 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fd5fc52ff4e24005c2e3c91a8640d2deddff1d00bec3e5e6635c5cc48cc11eb"} err="failed to get container status \"8fd5fc52ff4e24005c2e3c91a8640d2deddff1d00bec3e5e6635c5cc48cc11eb\": rpc error: code = NotFound desc = could not find container \"8fd5fc52ff4e24005c2e3c91a8640d2deddff1d00bec3e5e6635c5cc48cc11eb\": container with ID starting with 8fd5fc52ff4e24005c2e3c91a8640d2deddff1d00bec3e5e6635c5cc48cc11eb not found: ID does not exist" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.696636 4755 scope.go:117] "RemoveContainer" containerID="3a0a7c6aacdf09177864ed3651a016dfe98266ce57c6f5071a272a3c087592ba" Mar 17 00:45:18 crc kubenswrapper[4755]: E0317 
00:45:18.703659 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a0a7c6aacdf09177864ed3651a016dfe98266ce57c6f5071a272a3c087592ba\": container with ID starting with 3a0a7c6aacdf09177864ed3651a016dfe98266ce57c6f5071a272a3c087592ba not found: ID does not exist" containerID="3a0a7c6aacdf09177864ed3651a016dfe98266ce57c6f5071a272a3c087592ba" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.703705 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a0a7c6aacdf09177864ed3651a016dfe98266ce57c6f5071a272a3c087592ba"} err="failed to get container status \"3a0a7c6aacdf09177864ed3651a016dfe98266ce57c6f5071a272a3c087592ba\": rpc error: code = NotFound desc = could not find container \"3a0a7c6aacdf09177864ed3651a016dfe98266ce57c6f5071a272a3c087592ba\": container with ID starting with 3a0a7c6aacdf09177864ed3651a016dfe98266ce57c6f5071a272a3c087592ba not found: ID does not exist" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.703730 4755 scope.go:117] "RemoveContainer" containerID="b1a359514561a04c523bd8235649c07830a6fbfb9f33145acd2bfbc107475ac3" Mar 17 00:45:18 crc kubenswrapper[4755]: E0317 00:45:18.709886 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1a359514561a04c523bd8235649c07830a6fbfb9f33145acd2bfbc107475ac3\": container with ID starting with b1a359514561a04c523bd8235649c07830a6fbfb9f33145acd2bfbc107475ac3 not found: ID does not exist" containerID="b1a359514561a04c523bd8235649c07830a6fbfb9f33145acd2bfbc107475ac3" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.709922 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1a359514561a04c523bd8235649c07830a6fbfb9f33145acd2bfbc107475ac3"} err="failed to get container status \"b1a359514561a04c523bd8235649c07830a6fbfb9f33145acd2bfbc107475ac3\": rpc 
error: code = NotFound desc = could not find container \"b1a359514561a04c523bd8235649c07830a6fbfb9f33145acd2bfbc107475ac3\": container with ID starting with b1a359514561a04c523bd8235649c07830a6fbfb9f33145acd2bfbc107475ac3 not found: ID does not exist" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.752618 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75cc5bb54-mqs8w" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.841841 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-config-data\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.842362 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-scripts\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.842491 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.842635 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b9fe8c0-56e7-487a-b68a-905789266b31-log-httpd\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.842708 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.842790 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b9fe8c0-56e7-487a-b68a-905789266b31-run-httpd\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.842876 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm9vm\" (UniqueName: \"kubernetes.io/projected/9b9fe8c0-56e7-487a-b68a-905789266b31-kube-api-access-gm9vm\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.845407 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-57f5dc57cd-47npv"] Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.845696 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-57f5dc57cd-47npv" podUID="16fedfbc-d1c9-4059-ace9-d15f9bc61053" containerName="barbican-api-log" containerID="cri-o://224be010f298023e764e3653530c8da6b766380a4164441c1250b0f7973c1a51" gracePeriod=30 Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.846170 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-57f5dc57cd-47npv" podUID="16fedfbc-d1c9-4059-ace9-d15f9bc61053" containerName="barbican-api" containerID="cri-o://8ff8087f65dccb70325e05d6f0c0c9881bea2368a409bd5e68d5775e4185d365" gracePeriod=30 Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.955530 
4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b9fe8c0-56e7-487a-b68a-905789266b31-log-httpd\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.955572 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.955594 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b9fe8c0-56e7-487a-b68a-905789266b31-run-httpd\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.955623 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm9vm\" (UniqueName: \"kubernetes.io/projected/9b9fe8c0-56e7-487a-b68a-905789266b31-kube-api-access-gm9vm\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.955706 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-config-data\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.955731 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-scripts\") pod \"ceilometer-0\" (UID: 
\"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.955755 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.956758 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b9fe8c0-56e7-487a-b68a-905789266b31-log-httpd\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.961621 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b9fe8c0-56e7-487a-b68a-905789266b31-run-httpd\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.980848 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:18 crc kubenswrapper[4755]: I0317 00:45:18.983271 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:18.999127 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-scripts\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.035493 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-config-data\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.066161 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm9vm\" (UniqueName: \"kubernetes.io/projected/9b9fe8c0-56e7-487a-b68a-905789266b31-kube-api-access-gm9vm\") pod \"ceilometer-0\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " pod="openstack/ceilometer-0" Mar 17 00:45:19 crc kubenswrapper[4755]: E0317 00:45:19.322700 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16fedfbc_d1c9_4059_ace9_d15f9bc61053.slice/crio-224be010f298023e764e3653530c8da6b766380a4164441c1250b0f7973c1a51.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16fedfbc_d1c9_4059_ace9_d15f9bc61053.slice/crio-conmon-224be010f298023e764e3653530c8da6b766380a4164441c1250b0f7973c1a51.scope\": RecentStats: unable to find data in memory cache]" Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.326029 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.538204 4755 generic.go:334] "Generic (PLEG): container finished" podID="16fedfbc-d1c9-4059-ace9-d15f9bc61053" containerID="224be010f298023e764e3653530c8da6b766380a4164441c1250b0f7973c1a51" exitCode=143 Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.538492 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57f5dc57cd-47npv" event={"ID":"16fedfbc-d1c9-4059-ace9-d15f9bc61053","Type":"ContainerDied","Data":"224be010f298023e764e3653530c8da6b766380a4164441c1250b0f7973c1a51"} Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.561175 4755 generic.go:334] "Generic (PLEG): container finished" podID="f9f638d5-e4d3-4678-b597-cc6a84235d22" containerID="4d37426d1a8768fd4036e257238c7d76cf4461d24be494ae9d42a129aeef3e5a" exitCode=143 Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.561606 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f9f638d5-e4d3-4678-b597-cc6a84235d22","Type":"ContainerDied","Data":"4d37426d1a8768fd4036e257238c7d76cf4461d24be494ae9d42a129aeef3e5a"} Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.590183 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fhw68"] Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.592971 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fhw68" Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.602241 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fhw68"] Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.780687 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvzb6\" (UniqueName: \"kubernetes.io/projected/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-kube-api-access-pvzb6\") pod \"redhat-operators-fhw68\" (UID: \"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef\") " pod="openshift-marketplace/redhat-operators-fhw68" Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.780868 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-catalog-content\") pod \"redhat-operators-fhw68\" (UID: \"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef\") " pod="openshift-marketplace/redhat-operators-fhw68" Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.780954 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-utilities\") pod \"redhat-operators-fhw68\" (UID: \"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef\") " pod="openshift-marketplace/redhat-operators-fhw68" Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.865041 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.882567 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-utilities\") pod \"redhat-operators-fhw68\" (UID: \"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef\") " 
pod="openshift-marketplace/redhat-operators-fhw68" Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.882727 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvzb6\" (UniqueName: \"kubernetes.io/projected/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-kube-api-access-pvzb6\") pod \"redhat-operators-fhw68\" (UID: \"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef\") " pod="openshift-marketplace/redhat-operators-fhw68" Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.882794 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-catalog-content\") pod \"redhat-operators-fhw68\" (UID: \"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef\") " pod="openshift-marketplace/redhat-operators-fhw68" Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.883259 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-catalog-content\") pod \"redhat-operators-fhw68\" (UID: \"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef\") " pod="openshift-marketplace/redhat-operators-fhw68" Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.883258 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-utilities\") pod \"redhat-operators-fhw68\" (UID: \"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef\") " pod="openshift-marketplace/redhat-operators-fhw68" Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.906085 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvzb6\" (UniqueName: \"kubernetes.io/projected/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-kube-api-access-pvzb6\") pod \"redhat-operators-fhw68\" (UID: \"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef\") " pod="openshift-marketplace/redhat-operators-fhw68" Mar 
17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.922263 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fhw68" Mar 17 00:45:19 crc kubenswrapper[4755]: I0317 00:45:19.958942 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 17 00:45:20 crc kubenswrapper[4755]: I0317 00:45:20.263882 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="756a984f-fd52-4215-b64e-ecd7c9f2851e" path="/var/lib/kubelet/pods/756a984f-fd52-4215-b64e-ecd7c9f2851e/volumes" Mar 17 00:45:20 crc kubenswrapper[4755]: I0317 00:45:20.558944 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fhw68"] Mar 17 00:45:20 crc kubenswrapper[4755]: W0317 00:45:20.561449 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a97ba97_5bce_4644_bc0c_6ab3733dc9ef.slice/crio-8e0027a9d55a61cd50ed0bcb816fb0826d1a6a26c663c424738fe92396f171c8 WatchSource:0}: Error finding container 8e0027a9d55a61cd50ed0bcb816fb0826d1a6a26c663c424738fe92396f171c8: Status 404 returned error can't find the container with id 8e0027a9d55a61cd50ed0bcb816fb0826d1a6a26c663c424738fe92396f171c8 Mar 17 00:45:20 crc kubenswrapper[4755]: I0317 00:45:20.592099 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b9fe8c0-56e7-487a-b68a-905789266b31","Type":"ContainerStarted","Data":"1ff0fddec67974ade0f0926dee9d96c60d4a676f48ff988dc3a880ae87e21e3c"} Mar 17 00:45:20 crc kubenswrapper[4755]: I0317 00:45:20.593988 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhw68" event={"ID":"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef","Type":"ContainerStarted","Data":"8e0027a9d55a61cd50ed0bcb816fb0826d1a6a26c663c424738fe92396f171c8"} Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 
00:45:21.169096 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-df477f8d4-2cfqn" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.412618 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c9c7f6769-wqdz8"] Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.412838 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c9c7f6769-wqdz8" podUID="9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" containerName="neutron-api" containerID="cri-o://cc0b7779412c1b389c33e1b1cb2bf9ce8d07899ce52e202204ccb9bc3f30d08f" gracePeriod=30 Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.412956 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c9c7f6769-wqdz8" podUID="9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" containerName="neutron-httpd" containerID="cri-o://cea18063de57a4df658846bca1217e0e32d576007cf94e1c19149387f4f61063" gracePeriod=30 Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.461362 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-54c4999fb9-bx48f"] Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.471245 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.491650 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54c4999fb9-bx48f"] Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.524966 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6c9c7f6769-wqdz8" podUID="9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.182:9696/\": read tcp 10.217.0.2:42412->10.217.0.182:9696: read: connection reset by peer" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.611168 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b9fe8c0-56e7-487a-b68a-905789266b31","Type":"ContainerStarted","Data":"c89ae1d05b84421fc636ee9a8f47ebb5072db9c7d5af89e80dcd24cbb26e9023"} Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.613526 4755 generic.go:334] "Generic (PLEG): container finished" podID="7a97ba97-5bce-4644-bc0c-6ab3733dc9ef" containerID="a7cb5e45955d7800a1a551b3beea2c91312c324260ad45df01de3162b70bba94" exitCode=0 Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.613565 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhw68" event={"ID":"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef","Type":"ContainerDied","Data":"a7cb5e45955d7800a1a551b3beea2c91312c324260ad45df01de3162b70bba94"} Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.636726 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-combined-ca-bundle\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.636782 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-httpd-config\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.636899 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-public-tls-certs\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.636935 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-internal-tls-certs\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.636957 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-config\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.637000 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-ovndb-tls-certs\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.637031 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhjtg\" (UniqueName: \"kubernetes.io/projected/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-kube-api-access-rhjtg\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.738526 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-combined-ca-bundle\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.738577 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-httpd-config\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.738681 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-public-tls-certs\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.738718 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-internal-tls-certs\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.738742 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-config\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.738790 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-ovndb-tls-certs\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.738822 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhjtg\" (UniqueName: \"kubernetes.io/projected/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-kube-api-access-rhjtg\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.744457 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-config\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.745456 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-ovndb-tls-certs\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.745758 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-combined-ca-bundle\") pod 
\"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.746293 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-public-tls-certs\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.747068 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-internal-tls-certs\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.747696 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-httpd-config\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.760109 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhjtg\" (UniqueName: \"kubernetes.io/projected/096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3-kube-api-access-rhjtg\") pod \"neutron-54c4999fb9-bx48f\" (UID: \"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3\") " pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:21 crc kubenswrapper[4755]: I0317 00:45:21.817689 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.454538 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54c4999fb9-bx48f"] Mar 17 00:45:22 crc kubenswrapper[4755]: W0317 00:45:22.463644 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod096dd9e7_1e02_4e3f_9bd3_f5842cb4bea3.slice/crio-2525a553312004c6994b429925b6f255d9a83d09cab02cedcb4a5180e81f5251 WatchSource:0}: Error finding container 2525a553312004c6994b429925b6f255d9a83d09cab02cedcb4a5180e81f5251: Status 404 returned error can't find the container with id 2525a553312004c6994b429925b6f255d9a83d09cab02cedcb4a5180e81f5251 Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.699754 4755 generic.go:334] "Generic (PLEG): container finished" podID="9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" containerID="cea18063de57a4df658846bca1217e0e32d576007cf94e1c19149387f4f61063" exitCode=0 Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.699968 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c9c7f6769-wqdz8" event={"ID":"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c","Type":"ContainerDied","Data":"cea18063de57a4df658846bca1217e0e32d576007cf94e1c19149387f4f61063"} Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.721227 4755 generic.go:334] "Generic (PLEG): container finished" podID="16fedfbc-d1c9-4059-ace9-d15f9bc61053" containerID="8ff8087f65dccb70325e05d6f0c0c9881bea2368a409bd5e68d5775e4185d365" exitCode=0 Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.721310 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57f5dc57cd-47npv" event={"ID":"16fedfbc-d1c9-4059-ace9-d15f9bc61053","Type":"ContainerDied","Data":"8ff8087f65dccb70325e05d6f0c0c9881bea2368a409bd5e68d5775e4185d365"} Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.737221 4755 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-54c4999fb9-bx48f" event={"ID":"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3","Type":"ContainerStarted","Data":"2525a553312004c6994b429925b6f255d9a83d09cab02cedcb4a5180e81f5251"} Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.773908 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b9fe8c0-56e7-487a-b68a-905789266b31","Type":"ContainerStarted","Data":"f09df42c3e9158b4805e5413425d2afb2f2a186ef2a6ef9c9ebe75fccd1b4ff4"} Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.773953 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b9fe8c0-56e7-487a-b68a-905789266b31","Type":"ContainerStarted","Data":"0de817ec17d73d7db883c458fc819fb37e4e6d4057fd5e9e184e22c259668d91"} Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.774046 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.793089 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-config-data-custom\") pod \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.793159 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-config-data\") pod \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.793222 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fedfbc-d1c9-4059-ace9-d15f9bc61053-logs\") pod \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\" (UID: 
\"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.793258 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv7qb\" (UniqueName: \"kubernetes.io/projected/16fedfbc-d1c9-4059-ace9-d15f9bc61053-kube-api-access-qv7qb\") pod \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.793334 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-combined-ca-bundle\") pod \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\" (UID: \"16fedfbc-d1c9-4059-ace9-d15f9bc61053\") " Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.795701 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16fedfbc-d1c9-4059-ace9-d15f9bc61053-logs" (OuterVolumeSpecName: "logs") pod "16fedfbc-d1c9-4059-ace9-d15f9bc61053" (UID: "16fedfbc-d1c9-4059-ace9-d15f9bc61053"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.804909 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "16fedfbc-d1c9-4059-ace9-d15f9bc61053" (UID: "16fedfbc-d1c9-4059-ace9-d15f9bc61053"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.819699 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16fedfbc-d1c9-4059-ace9-d15f9bc61053-kube-api-access-qv7qb" (OuterVolumeSpecName: "kube-api-access-qv7qb") pod "16fedfbc-d1c9-4059-ace9-d15f9bc61053" (UID: "16fedfbc-d1c9-4059-ace9-d15f9bc61053"). 
InnerVolumeSpecName "kube-api-access-qv7qb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.863374 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16fedfbc-d1c9-4059-ace9-d15f9bc61053" (UID: "16fedfbc-d1c9-4059-ace9-d15f9bc61053"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.899058 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.899090 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fedfbc-d1c9-4059-ace9-d15f9bc61053-logs\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.899101 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv7qb\" (UniqueName: \"kubernetes.io/projected/16fedfbc-d1c9-4059-ace9-d15f9bc61053-kube-api-access-qv7qb\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.899110 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:22 crc kubenswrapper[4755]: I0317 00:45:22.902429 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-config-data" (OuterVolumeSpecName: "config-data") pod "16fedfbc-d1c9-4059-ace9-d15f9bc61053" (UID: "16fedfbc-d1c9-4059-ace9-d15f9bc61053"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:23 crc kubenswrapper[4755]: I0317 00:45:23.000917 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fedfbc-d1c9-4059-ace9-d15f9bc61053-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:23 crc kubenswrapper[4755]: I0317 00:45:23.213173 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6c9c7f6769-wqdz8" podUID="9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.182:9696/\": dial tcp 10.217.0.182:9696: connect: connection refused" Mar 17 00:45:23 crc kubenswrapper[4755]: I0317 00:45:23.788651 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54c4999fb9-bx48f" event={"ID":"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3","Type":"ContainerStarted","Data":"b5e351a60b8e018971bdeb843b518e2a65e3a54b3421be531c1dec4cbe946686"} Mar 17 00:45:23 crc kubenswrapper[4755]: I0317 00:45:23.788698 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54c4999fb9-bx48f" event={"ID":"096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3","Type":"ContainerStarted","Data":"4646ef55bc24aa36e4fea7a6f5bf24c29f28ae90ed15a30e0853f3d7ec33bcf6"} Mar 17 00:45:23 crc kubenswrapper[4755]: I0317 00:45:23.790344 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:23 crc kubenswrapper[4755]: I0317 00:45:23.792246 4755 generic.go:334] "Generic (PLEG): container finished" podID="7a97ba97-5bce-4644-bc0c-6ab3733dc9ef" containerID="14e05107095f77ccf7ef298a739ef1104659da91cbe7a5a53359ebb90023e3bc" exitCode=0 Mar 17 00:45:23 crc kubenswrapper[4755]: I0317 00:45:23.792294 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhw68" 
event={"ID":"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef","Type":"ContainerDied","Data":"14e05107095f77ccf7ef298a739ef1104659da91cbe7a5a53359ebb90023e3bc"} Mar 17 00:45:23 crc kubenswrapper[4755]: I0317 00:45:23.794343 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 00:45:23 crc kubenswrapper[4755]: I0317 00:45:23.796378 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57f5dc57cd-47npv" event={"ID":"16fedfbc-d1c9-4059-ace9-d15f9bc61053","Type":"ContainerDied","Data":"da78940285286c83ff778b7bbcd9c9c3710f34ed475fe732b424607a3c30eb69"} Mar 17 00:45:23 crc kubenswrapper[4755]: I0317 00:45:23.796428 4755 scope.go:117] "RemoveContainer" containerID="8ff8087f65dccb70325e05d6f0c0c9881bea2368a409bd5e68d5775e4185d365" Mar 17 00:45:23 crc kubenswrapper[4755]: I0317 00:45:23.796498 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57f5dc57cd-47npv" Mar 17 00:45:23 crc kubenswrapper[4755]: I0317 00:45:23.814777 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-54c4999fb9-bx48f" podStartSLOduration=2.814760342 podStartE2EDuration="2.814760342s" podCreationTimestamp="2026-03-17 00:45:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:23.81285627 +0000 UTC m=+1398.572308573" watchObservedRunningTime="2026-03-17 00:45:23.814760342 +0000 UTC m=+1398.574212635" Mar 17 00:45:23 crc kubenswrapper[4755]: I0317 00:45:23.844881 4755 scope.go:117] "RemoveContainer" containerID="224be010f298023e764e3653530c8da6b766380a4164441c1250b0f7973c1a51" Mar 17 00:45:23 crc kubenswrapper[4755]: I0317 00:45:23.891500 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-57f5dc57cd-47npv"] Mar 17 00:45:23 crc kubenswrapper[4755]: I0317 00:45:23.905627 4755 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/barbican-api-57f5dc57cd-47npv"] Mar 17 00:45:24 crc kubenswrapper[4755]: I0317 00:45:24.273799 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16fedfbc-d1c9-4059-ace9-d15f9bc61053" path="/var/lib/kubelet/pods/16fedfbc-d1c9-4059-ace9-d15f9bc61053/volumes" Mar 17 00:45:24 crc kubenswrapper[4755]: I0317 00:45:24.810152 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b9fe8c0-56e7-487a-b68a-905789266b31","Type":"ContainerStarted","Data":"996ccff017a57ca1a61f01c25b387fb7c748546bd9b1f09ee27bbd67f891059d"} Mar 17 00:45:24 crc kubenswrapper[4755]: I0317 00:45:24.811633 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 00:45:24 crc kubenswrapper[4755]: I0317 00:45:24.813309 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhw68" event={"ID":"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef","Type":"ContainerStarted","Data":"9f19c5669b155ffc068ac6db0febef29feabf0857efbe196c2ff8c44f258a229"} Mar 17 00:45:24 crc kubenswrapper[4755]: I0317 00:45:24.845140 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.268496617 podStartE2EDuration="6.845125467s" podCreationTimestamp="2026-03-17 00:45:18 +0000 UTC" firstStartedPulling="2026-03-17 00:45:19.867048995 +0000 UTC m=+1394.626501278" lastFinishedPulling="2026-03-17 00:45:24.443677805 +0000 UTC m=+1399.203130128" observedRunningTime="2026-03-17 00:45:24.838776846 +0000 UTC m=+1399.598229139" watchObservedRunningTime="2026-03-17 00:45:24.845125467 +0000 UTC m=+1399.604577750" Mar 17 00:45:24 crc kubenswrapper[4755]: I0317 00:45:24.882534 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fhw68" podStartSLOduration=3.211775482 podStartE2EDuration="5.882519448s" podCreationTimestamp="2026-03-17 
00:45:19 +0000 UTC" firstStartedPulling="2026-03-17 00:45:21.615155503 +0000 UTC m=+1396.374607786" lastFinishedPulling="2026-03-17 00:45:24.285899459 +0000 UTC m=+1399.045351752" observedRunningTime="2026-03-17 00:45:24.87849748 +0000 UTC m=+1399.637949763" watchObservedRunningTime="2026-03-17 00:45:24.882519448 +0000 UTC m=+1399.641971731" Mar 17 00:45:25 crc kubenswrapper[4755]: I0317 00:45:25.071683 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:25 crc kubenswrapper[4755]: I0317 00:45:25.136929 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-nbd46"] Mar 17 00:45:25 crc kubenswrapper[4755]: I0317 00:45:25.137141 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" podUID="2f456494-50c9-440c-b099-15b315cb246d" containerName="dnsmasq-dns" containerID="cri-o://5244ab0c426eff1a2ca9572ab94353395075a57e018a6e331aea9d5f8d2629b2" gracePeriod=10 Mar 17 00:45:25 crc kubenswrapper[4755]: I0317 00:45:25.231023 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 17 00:45:25 crc kubenswrapper[4755]: I0317 00:45:25.322218 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 00:45:25 crc kubenswrapper[4755]: I0317 00:45:25.827633 4755 generic.go:334] "Generic (PLEG): container finished" podID="2f456494-50c9-440c-b099-15b315cb246d" containerID="5244ab0c426eff1a2ca9572ab94353395075a57e018a6e331aea9d5f8d2629b2" exitCode=0 Mar 17 00:45:25 crc kubenswrapper[4755]: I0317 00:45:25.827805 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" event={"ID":"2f456494-50c9-440c-b099-15b315cb246d","Type":"ContainerDied","Data":"5244ab0c426eff1a2ca9572ab94353395075a57e018a6e331aea9d5f8d2629b2"} Mar 17 00:45:25 crc kubenswrapper[4755]: I0317 
00:45:25.827892 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d" containerName="cinder-scheduler" containerID="cri-o://61aa9ccdf61e46cb0ffac78854f55c8625488d15a6954bd44f55dc4d5e36b178" gracePeriod=30 Mar 17 00:45:25 crc kubenswrapper[4755]: I0317 00:45:25.828035 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d" containerName="probe" containerID="cri-o://fbf06f7dd428ca825fc188b7dd6aca2f4cbf042316cdd9c0fab6379f091c93f2" gracePeriod=30 Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.261359 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.284972 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-ovsdbserver-nb\") pod \"2f456494-50c9-440c-b099-15b315cb246d\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.285325 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s77d\" (UniqueName: \"kubernetes.io/projected/2f456494-50c9-440c-b099-15b315cb246d-kube-api-access-7s77d\") pod \"2f456494-50c9-440c-b099-15b315cb246d\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.285520 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-config\") pod \"2f456494-50c9-440c-b099-15b315cb246d\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.285687 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-dns-svc\") pod \"2f456494-50c9-440c-b099-15b315cb246d\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.285822 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-dns-swift-storage-0\") pod \"2f456494-50c9-440c-b099-15b315cb246d\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.285985 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-ovsdbserver-sb\") pod \"2f456494-50c9-440c-b099-15b315cb246d\" (UID: \"2f456494-50c9-440c-b099-15b315cb246d\") " Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.334678 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f456494-50c9-440c-b099-15b315cb246d-kube-api-access-7s77d" (OuterVolumeSpecName: "kube-api-access-7s77d") pod "2f456494-50c9-440c-b099-15b315cb246d" (UID: "2f456494-50c9-440c-b099-15b315cb246d"). InnerVolumeSpecName "kube-api-access-7s77d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.388968 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s77d\" (UniqueName: \"kubernetes.io/projected/2f456494-50c9-440c-b099-15b315cb246d-kube-api-access-7s77d\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.452485 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-config" (OuterVolumeSpecName: "config") pod "2f456494-50c9-440c-b099-15b315cb246d" (UID: "2f456494-50c9-440c-b099-15b315cb246d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.475651 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2f456494-50c9-440c-b099-15b315cb246d" (UID: "2f456494-50c9-440c-b099-15b315cb246d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.483008 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f456494-50c9-440c-b099-15b315cb246d" (UID: "2f456494-50c9-440c-b099-15b315cb246d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.491762 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.491808 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.491819 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.499895 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2f456494-50c9-440c-b099-15b315cb246d" (UID: "2f456494-50c9-440c-b099-15b315cb246d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.503887 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2f456494-50c9-440c-b099-15b315cb246d" (UID: "2f456494-50c9-440c-b099-15b315cb246d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.593725 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.594086 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f456494-50c9-440c-b099-15b315cb246d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.839633 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" event={"ID":"2f456494-50c9-440c-b099-15b315cb246d","Type":"ContainerDied","Data":"b8c74b1b74b7b3dc064a31687a8625e30bc7dd36c4bafc45901c137a9cefc4e0"} Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.839688 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-nbd46" Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.839694 4755 scope.go:117] "RemoveContainer" containerID="5244ab0c426eff1a2ca9572ab94353395075a57e018a6e331aea9d5f8d2629b2" Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.869273 4755 scope.go:117] "RemoveContainer" containerID="030eeadfdbde2dcf97adc60de3d5decf4f9f6f7bc6c153a7382b08122bc054a7" Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.871508 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-nbd46"] Mar 17 00:45:26 crc kubenswrapper[4755]: I0317 00:45:26.880062 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-nbd46"] Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.444600 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.515358 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-combined-ca-bundle\") pod \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.515481 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-ovndb-tls-certs\") pod \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.516192 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgnfg\" (UniqueName: \"kubernetes.io/projected/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-kube-api-access-rgnfg\") pod \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.516334 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-internal-tls-certs\") pod \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.516526 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-config\") pod \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.516581 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-public-tls-certs\") pod \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.516602 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-httpd-config\") pod \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\" (UID: \"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c\") " Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.522863 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" (UID: "9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.524074 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-kube-api-access-rgnfg" (OuterVolumeSpecName: "kube-api-access-rgnfg") pod "9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" (UID: "9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c"). InnerVolumeSpecName "kube-api-access-rgnfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.572274 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" (UID: "9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.578963 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" (UID: "9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.585960 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" (UID: "9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.587238 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-config" (OuterVolumeSpecName: "config") pod "9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" (UID: "9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.604299 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" (UID: "9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.619913 4755 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.619944 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgnfg\" (UniqueName: \"kubernetes.io/projected/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-kube-api-access-rgnfg\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.619956 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.619964 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.619975 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.619982 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.619989 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.851867 4755 
generic.go:334] "Generic (PLEG): container finished" podID="9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" containerID="cc0b7779412c1b389c33e1b1cb2bf9ce8d07899ce52e202204ccb9bc3f30d08f" exitCode=0 Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.851925 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c9c7f6769-wqdz8" event={"ID":"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c","Type":"ContainerDied","Data":"cc0b7779412c1b389c33e1b1cb2bf9ce8d07899ce52e202204ccb9bc3f30d08f"} Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.851947 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c9c7f6769-wqdz8" event={"ID":"9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c","Type":"ContainerDied","Data":"b3934471cfc14bbc60a6409f5bd09aa29948d6ce09f22b23377314bc5d42246f"} Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.851964 4755 scope.go:117] "RemoveContainer" containerID="cea18063de57a4df658846bca1217e0e32d576007cf94e1c19149387f4f61063" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.852031 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c9c7f6769-wqdz8" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.855297 4755 generic.go:334] "Generic (PLEG): container finished" podID="759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d" containerID="fbf06f7dd428ca825fc188b7dd6aca2f4cbf042316cdd9c0fab6379f091c93f2" exitCode=0 Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.855327 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d","Type":"ContainerDied","Data":"fbf06f7dd428ca825fc188b7dd6aca2f4cbf042316cdd9c0fab6379f091c93f2"} Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.885190 4755 scope.go:117] "RemoveContainer" containerID="cc0b7779412c1b389c33e1b1cb2bf9ce8d07899ce52e202204ccb9bc3f30d08f" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.913026 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c9c7f6769-wqdz8"] Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.920570 4755 scope.go:117] "RemoveContainer" containerID="cea18063de57a4df658846bca1217e0e32d576007cf94e1c19149387f4f61063" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.920693 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 17 00:45:27 crc kubenswrapper[4755]: E0317 00:45:27.921486 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cea18063de57a4df658846bca1217e0e32d576007cf94e1c19149387f4f61063\": container with ID starting with cea18063de57a4df658846bca1217e0e32d576007cf94e1c19149387f4f61063 not found: ID does not exist" containerID="cea18063de57a4df658846bca1217e0e32d576007cf94e1c19149387f4f61063" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.921518 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cea18063de57a4df658846bca1217e0e32d576007cf94e1c19149387f4f61063"} err="failed to get container status \"cea18063de57a4df658846bca1217e0e32d576007cf94e1c19149387f4f61063\": rpc error: code = NotFound desc = could not find container \"cea18063de57a4df658846bca1217e0e32d576007cf94e1c19149387f4f61063\": container with ID starting with cea18063de57a4df658846bca1217e0e32d576007cf94e1c19149387f4f61063 not found: ID does not exist" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.921538 4755 scope.go:117] "RemoveContainer" containerID="cc0b7779412c1b389c33e1b1cb2bf9ce8d07899ce52e202204ccb9bc3f30d08f" Mar 17 00:45:27 crc kubenswrapper[4755]: E0317 00:45:27.921969 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc0b7779412c1b389c33e1b1cb2bf9ce8d07899ce52e202204ccb9bc3f30d08f\": container with ID starting with cc0b7779412c1b389c33e1b1cb2bf9ce8d07899ce52e202204ccb9bc3f30d08f not found: ID does not exist" containerID="cc0b7779412c1b389c33e1b1cb2bf9ce8d07899ce52e202204ccb9bc3f30d08f" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.921986 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc0b7779412c1b389c33e1b1cb2bf9ce8d07899ce52e202204ccb9bc3f30d08f"} err="failed to get container status \"cc0b7779412c1b389c33e1b1cb2bf9ce8d07899ce52e202204ccb9bc3f30d08f\": rpc error: code = NotFound desc = could not find container \"cc0b7779412c1b389c33e1b1cb2bf9ce8d07899ce52e202204ccb9bc3f30d08f\": container with ID starting with cc0b7779412c1b389c33e1b1cb2bf9ce8d07899ce52e202204ccb9bc3f30d08f not found: ID does not exist" Mar 17 00:45:27 crc kubenswrapper[4755]: I0317 00:45:27.936377 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6c9c7f6769-wqdz8"] Mar 17 00:45:28 crc kubenswrapper[4755]: I0317 00:45:28.258509 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2f456494-50c9-440c-b099-15b315cb246d" path="/var/lib/kubelet/pods/2f456494-50c9-440c-b099-15b315cb246d/volumes" Mar 17 00:45:28 crc kubenswrapper[4755]: I0317 00:45:28.259244 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" path="/var/lib/kubelet/pods/9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c/volumes" Mar 17 00:45:28 crc kubenswrapper[4755]: I0317 00:45:28.665703 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:45:28 crc kubenswrapper[4755]: I0317 00:45:28.665790 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:45:29 crc kubenswrapper[4755]: I0317 00:45:29.887127 4755 generic.go:334] "Generic (PLEG): container finished" podID="759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d" containerID="61aa9ccdf61e46cb0ffac78854f55c8625488d15a6954bd44f55dc4d5e36b178" exitCode=0 Mar 17 00:45:29 crc kubenswrapper[4755]: I0317 00:45:29.887196 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d","Type":"ContainerDied","Data":"61aa9ccdf61e46cb0ffac78854f55c8625488d15a6954bd44f55dc4d5e36b178"} Mar 17 00:45:29 crc kubenswrapper[4755]: I0317 00:45:29.923680 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fhw68" Mar 17 00:45:29 crc kubenswrapper[4755]: I0317 00:45:29.923726 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-fhw68" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.276659 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.371428 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c9qf\" (UniqueName: \"kubernetes.io/projected/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-kube-api-access-2c9qf\") pod \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.371498 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-etc-machine-id\") pod \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.371586 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-combined-ca-bundle\") pod \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.371623 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-scripts\") pod \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.371721 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-config-data-custom\") pod \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\" (UID: 
\"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.371781 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-config-data\") pod \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\" (UID: \"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d\") " Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.377587 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d" (UID: "759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.384901 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-scripts" (OuterVolumeSpecName: "scripts") pod "759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d" (UID: "759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.384987 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d" (UID: "759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.385134 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-kube-api-access-2c9qf" (OuterVolumeSpecName: "kube-api-access-2c9qf") pod "759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d" (UID: "759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d"). InnerVolumeSpecName "kube-api-access-2c9qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.423807 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d" (UID: "759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.474722 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c9qf\" (UniqueName: \"kubernetes.io/projected/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-kube-api-access-2c9qf\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.474750 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.474762 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.474771 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.474780 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.503617 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-config-data" (OuterVolumeSpecName: "config-data") pod "759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d" (UID: "759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.576723 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.897190 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d","Type":"ContainerDied","Data":"1d9ba7363234a556f5adfeed68e979b92e35a8f3572199c82f6cdb135de2f580"} Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.897306 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.897794 4755 scope.go:117] "RemoveContainer" containerID="fbf06f7dd428ca825fc188b7dd6aca2f4cbf042316cdd9c0fab6379f091c93f2" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.921552 4755 scope.go:117] "RemoveContainer" containerID="61aa9ccdf61e46cb0ffac78854f55c8625488d15a6954bd44f55dc4d5e36b178" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.938298 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.947085 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.978097 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 00:45:30 crc kubenswrapper[4755]: E0317 00:45:30.978590 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" containerName="neutron-httpd" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.978607 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" containerName="neutron-httpd" Mar 17 00:45:30 crc kubenswrapper[4755]: E0317 00:45:30.978624 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d" containerName="cinder-scheduler" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.978630 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d" containerName="cinder-scheduler" Mar 17 00:45:30 crc kubenswrapper[4755]: E0317 00:45:30.978644 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f456494-50c9-440c-b099-15b315cb246d" containerName="init" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.978650 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2f456494-50c9-440c-b099-15b315cb246d" containerName="init" Mar 17 00:45:30 crc kubenswrapper[4755]: E0317 00:45:30.978661 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fedfbc-d1c9-4059-ace9-d15f9bc61053" containerName="barbican-api-log" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.978667 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fedfbc-d1c9-4059-ace9-d15f9bc61053" containerName="barbican-api-log" Mar 17 00:45:30 crc kubenswrapper[4755]: E0317 00:45:30.978684 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fedfbc-d1c9-4059-ace9-d15f9bc61053" containerName="barbican-api" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.978690 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fedfbc-d1c9-4059-ace9-d15f9bc61053" containerName="barbican-api" Mar 17 00:45:30 crc kubenswrapper[4755]: E0317 00:45:30.978701 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d" containerName="probe" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.978707 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d" containerName="probe" Mar 17 00:45:30 crc kubenswrapper[4755]: E0317 00:45:30.978722 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f456494-50c9-440c-b099-15b315cb246d" containerName="dnsmasq-dns" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.978728 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f456494-50c9-440c-b099-15b315cb246d" containerName="dnsmasq-dns" Mar 17 00:45:30 crc kubenswrapper[4755]: E0317 00:45:30.978744 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" containerName="neutron-api" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.978751 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" 
containerName="neutron-api" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.978926 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fedfbc-d1c9-4059-ace9-d15f9bc61053" containerName="barbican-api" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.978939 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" containerName="neutron-api" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.978947 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d" containerName="cinder-scheduler" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.978961 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fedfbc-d1c9-4059-ace9-d15f9bc61053" containerName="barbican-api-log" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.978972 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f456494-50c9-440c-b099-15b315cb246d" containerName="dnsmasq-dns" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.978987 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d" containerName="probe" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.978997 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9929a6e1-b83a-4fc4-b904-9a3acc9d2c5c" containerName="neutron-httpd" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.980052 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.982270 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.984469 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5201f678-3b17-4d85-b341-2f789377dbaa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.984507 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201f678-3b17-4d85-b341-2f789377dbaa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.984694 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vmjf\" (UniqueName: \"kubernetes.io/projected/5201f678-3b17-4d85-b341-2f789377dbaa-kube-api-access-2vmjf\") pod \"cinder-scheduler-0\" (UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.984732 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5201f678-3b17-4d85-b341-2f789377dbaa-config-data\") pod \"cinder-scheduler-0\" (UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.984761 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/5201f678-3b17-4d85-b341-2f789377dbaa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.984859 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5201f678-3b17-4d85-b341-2f789377dbaa-scripts\") pod \"cinder-scheduler-0\" (UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:30 crc kubenswrapper[4755]: I0317 00:45:30.994706 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 00:45:31 crc kubenswrapper[4755]: I0317 00:45:31.001604 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fhw68" podUID="7a97ba97-5bce-4644-bc0c-6ab3733dc9ef" containerName="registry-server" probeResult="failure" output=< Mar 17 00:45:31 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 00:45:31 crc kubenswrapper[4755]: > Mar 17 00:45:31 crc kubenswrapper[4755]: I0317 00:45:31.086482 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vmjf\" (UniqueName: \"kubernetes.io/projected/5201f678-3b17-4d85-b341-2f789377dbaa-kube-api-access-2vmjf\") pod \"cinder-scheduler-0\" (UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:31 crc kubenswrapper[4755]: I0317 00:45:31.086528 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5201f678-3b17-4d85-b341-2f789377dbaa-config-data\") pod \"cinder-scheduler-0\" (UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:31 crc kubenswrapper[4755]: I0317 00:45:31.086557 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5201f678-3b17-4d85-b341-2f789377dbaa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:31 crc kubenswrapper[4755]: I0317 00:45:31.086583 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5201f678-3b17-4d85-b341-2f789377dbaa-scripts\") pod \"cinder-scheduler-0\" (UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:31 crc kubenswrapper[4755]: I0317 00:45:31.086661 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5201f678-3b17-4d85-b341-2f789377dbaa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:31 crc kubenswrapper[4755]: I0317 00:45:31.086688 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201f678-3b17-4d85-b341-2f789377dbaa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:31 crc kubenswrapper[4755]: I0317 00:45:31.087297 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5201f678-3b17-4d85-b341-2f789377dbaa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:31 crc kubenswrapper[4755]: I0317 00:45:31.091103 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5201f678-3b17-4d85-b341-2f789377dbaa-scripts\") pod \"cinder-scheduler-0\" 
(UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:31 crc kubenswrapper[4755]: I0317 00:45:31.091734 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201f678-3b17-4d85-b341-2f789377dbaa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:31 crc kubenswrapper[4755]: I0317 00:45:31.093189 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5201f678-3b17-4d85-b341-2f789377dbaa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:31 crc kubenswrapper[4755]: I0317 00:45:31.102581 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5201f678-3b17-4d85-b341-2f789377dbaa-config-data\") pod \"cinder-scheduler-0\" (UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:31 crc kubenswrapper[4755]: I0317 00:45:31.111805 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vmjf\" (UniqueName: \"kubernetes.io/projected/5201f678-3b17-4d85-b341-2f789377dbaa-kube-api-access-2vmjf\") pod \"cinder-scheduler-0\" (UID: \"5201f678-3b17-4d85-b341-2f789377dbaa\") " pod="openstack/cinder-scheduler-0" Mar 17 00:45:31 crc kubenswrapper[4755]: I0317 00:45:31.294804 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 00:45:31 crc kubenswrapper[4755]: I0317 00:45:31.859813 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 00:45:31 crc kubenswrapper[4755]: I0317 00:45:31.917295 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5201f678-3b17-4d85-b341-2f789377dbaa","Type":"ContainerStarted","Data":"8803c9070e386368ef955149ad078f4e79893fa93b060a23c51383c565b57b3f"} Mar 17 00:45:32 crc kubenswrapper[4755]: I0317 00:45:32.259740 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d" path="/var/lib/kubelet/pods/759de4a2-5c31-4f46-9fdd-21d8ed1a4a9d/volumes" Mar 17 00:45:32 crc kubenswrapper[4755]: I0317 00:45:32.952713 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5201f678-3b17-4d85-b341-2f789377dbaa","Type":"ContainerStarted","Data":"d99e0587dcea39a79ba8f197b393a15db380190b84f2f95f7461924eff60b795"} Mar 17 00:45:33 crc kubenswrapper[4755]: I0317 00:45:33.954664 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:33 crc kubenswrapper[4755]: I0317 00:45:33.955453 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:33 crc kubenswrapper[4755]: I0317 00:45:33.963177 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5201f678-3b17-4d85-b341-2f789377dbaa","Type":"ContainerStarted","Data":"c01b25a543df05d425bbc9ac5aa8c0bfedf17db5f62768edfb77571ba8af6cc5"} Mar 17 00:45:33 crc kubenswrapper[4755]: I0317 00:45:33.994126 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.994108621 podStartE2EDuration="3.994108621s" 
podCreationTimestamp="2026-03-17 00:45:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:33.98923752 +0000 UTC m=+1408.748689803" watchObservedRunningTime="2026-03-17 00:45:33.994108621 +0000 UTC m=+1408.753560904" Mar 17 00:45:34 crc kubenswrapper[4755]: I0317 00:45:34.193607 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:34 crc kubenswrapper[4755]: I0317 00:45:34.390874 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-57d7fc6d98-smgzp" Mar 17 00:45:34 crc kubenswrapper[4755]: I0317 00:45:34.531512 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5f66764f8d-z7959" Mar 17 00:45:34 crc kubenswrapper[4755]: I0317 00:45:34.591214 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-55ff8ddfc6-wbxxp"] Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.185836 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.187498 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.190178 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.190208 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.191103 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-r8bh7" Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.199195 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.285781 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007a5062-42e0-47ac-9523-a4d486614f70-combined-ca-bundle\") pod \"openstackclient\" (UID: \"007a5062-42e0-47ac-9523-a4d486614f70\") " pod="openstack/openstackclient" Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.285975 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/007a5062-42e0-47ac-9523-a4d486614f70-openstack-config\") pod \"openstackclient\" (UID: \"007a5062-42e0-47ac-9523-a4d486614f70\") " pod="openstack/openstackclient" Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.286054 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr9mj\" (UniqueName: \"kubernetes.io/projected/007a5062-42e0-47ac-9523-a4d486614f70-kube-api-access-lr9mj\") pod \"openstackclient\" (UID: \"007a5062-42e0-47ac-9523-a4d486614f70\") " pod="openstack/openstackclient" Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.286077 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/007a5062-42e0-47ac-9523-a4d486614f70-openstack-config-secret\") pod \"openstackclient\" (UID: \"007a5062-42e0-47ac-9523-a4d486614f70\") " pod="openstack/openstackclient" Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.388397 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007a5062-42e0-47ac-9523-a4d486614f70-combined-ca-bundle\") pod \"openstackclient\" (UID: \"007a5062-42e0-47ac-9523-a4d486614f70\") " pod="openstack/openstackclient" Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.388536 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/007a5062-42e0-47ac-9523-a4d486614f70-openstack-config\") pod \"openstackclient\" (UID: \"007a5062-42e0-47ac-9523-a4d486614f70\") " pod="openstack/openstackclient" Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.388597 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr9mj\" (UniqueName: \"kubernetes.io/projected/007a5062-42e0-47ac-9523-a4d486614f70-kube-api-access-lr9mj\") pod \"openstackclient\" (UID: \"007a5062-42e0-47ac-9523-a4d486614f70\") " pod="openstack/openstackclient" Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.388624 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/007a5062-42e0-47ac-9523-a4d486614f70-openstack-config-secret\") pod \"openstackclient\" (UID: \"007a5062-42e0-47ac-9523-a4d486614f70\") " pod="openstack/openstackclient" Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.390798 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/007a5062-42e0-47ac-9523-a4d486614f70-openstack-config\") pod \"openstackclient\" (UID: \"007a5062-42e0-47ac-9523-a4d486614f70\") " pod="openstack/openstackclient" Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.395334 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/007a5062-42e0-47ac-9523-a4d486614f70-openstack-config-secret\") pod \"openstackclient\" (UID: \"007a5062-42e0-47ac-9523-a4d486614f70\") " pod="openstack/openstackclient" Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.395510 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007a5062-42e0-47ac-9523-a4d486614f70-combined-ca-bundle\") pod \"openstackclient\" (UID: \"007a5062-42e0-47ac-9523-a4d486614f70\") " pod="openstack/openstackclient" Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.405780 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr9mj\" (UniqueName: \"kubernetes.io/projected/007a5062-42e0-47ac-9523-a4d486614f70-kube-api-access-lr9mj\") pod \"openstackclient\" (UID: \"007a5062-42e0-47ac-9523-a4d486614f70\") " pod="openstack/openstackclient" Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.514204 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.981410 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-55ff8ddfc6-wbxxp" podUID="820d6f40-e3d1-4675-868b-7432d4b65006" containerName="placement-log" containerID="cri-o://e327396f236ac105997c2095af14ff7d7be015d0d51fdd91e9f57cffa85d2568" gracePeriod=30 Mar 17 00:45:35 crc kubenswrapper[4755]: I0317 00:45:35.981831 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-55ff8ddfc6-wbxxp" podUID="820d6f40-e3d1-4675-868b-7432d4b65006" containerName="placement-api" containerID="cri-o://aa54add94c1142e3128938b34e38c3077c498a18264f5c560848b329539f5f7c" gracePeriod=30 Mar 17 00:45:36 crc kubenswrapper[4755]: I0317 00:45:36.007301 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 17 00:45:36 crc kubenswrapper[4755]: W0317 00:45:36.015569 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod007a5062_42e0_47ac_9523_a4d486614f70.slice/crio-1a9d986eb53e90c032d36922b514ab46984d9c3f1fe7c1562685bde6d7f84609 WatchSource:0}: Error finding container 1a9d986eb53e90c032d36922b514ab46984d9c3f1fe7c1562685bde6d7f84609: Status 404 returned error can't find the container with id 1a9d986eb53e90c032d36922b514ab46984d9c3f1fe7c1562685bde6d7f84609 Mar 17 00:45:36 crc kubenswrapper[4755]: I0317 00:45:36.295886 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 17 00:45:36 crc kubenswrapper[4755]: I0317 00:45:36.995609 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"007a5062-42e0-47ac-9523-a4d486614f70","Type":"ContainerStarted","Data":"1a9d986eb53e90c032d36922b514ab46984d9c3f1fe7c1562685bde6d7f84609"} Mar 17 00:45:36 crc kubenswrapper[4755]: I0317 
00:45:36.998579 4755 generic.go:334] "Generic (PLEG): container finished" podID="820d6f40-e3d1-4675-868b-7432d4b65006" containerID="e327396f236ac105997c2095af14ff7d7be015d0d51fdd91e9f57cffa85d2568" exitCode=143 Mar 17 00:45:36 crc kubenswrapper[4755]: I0317 00:45:36.998633 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55ff8ddfc6-wbxxp" event={"ID":"820d6f40-e3d1-4675-868b-7432d4b65006","Type":"ContainerDied","Data":"e327396f236ac105997c2095af14ff7d7be015d0d51fdd91e9f57cffa85d2568"} Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.739413 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5db85cdfc7-s8bqf"] Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.749000 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5db85cdfc7-s8bqf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.752020 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-srx5n" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.752062 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.754608 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.766780 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5db85cdfc7-s8bqf"] Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.866508 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-combined-ca-bundle\") pod \"heat-engine-5db85cdfc7-s8bqf\" (UID: \"da85eed8-63ff-4f04-aefa-13d60b8606f8\") " pod="openstack/heat-engine-5db85cdfc7-s8bqf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 
00:45:38.866599 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-config-data\") pod \"heat-engine-5db85cdfc7-s8bqf\" (UID: \"da85eed8-63ff-4f04-aefa-13d60b8606f8\") " pod="openstack/heat-engine-5db85cdfc7-s8bqf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.866896 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-config-data-custom\") pod \"heat-engine-5db85cdfc7-s8bqf\" (UID: \"da85eed8-63ff-4f04-aefa-13d60b8606f8\") " pod="openstack/heat-engine-5db85cdfc7-s8bqf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.866999 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl9m2\" (UniqueName: \"kubernetes.io/projected/da85eed8-63ff-4f04-aefa-13d60b8606f8-kube-api-access-tl9m2\") pod \"heat-engine-5db85cdfc7-s8bqf\" (UID: \"da85eed8-63ff-4f04-aefa-13d60b8606f8\") " pod="openstack/heat-engine-5db85cdfc7-s8bqf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.868244 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-xbllf"] Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.909250 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-xbllf"] Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.909374 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.934234 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5f5487f5bb-dv7gk"] Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.937187 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5f5487f5bb-dv7gk" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.938835 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.952610 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5699988bfc-7hbpj"] Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.955381 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5699988bfc-7hbpj" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.957065 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.966182 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5f5487f5bb-dv7gk"] Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.968831 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-config-data-custom\") pod \"heat-api-5f5487f5bb-dv7gk\" (UID: \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\") " pod="openstack/heat-api-5f5487f5bb-dv7gk" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.968866 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-ovsdbserver-sb\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.968895 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-combined-ca-bundle\") pod 
\"heat-engine-5db85cdfc7-s8bqf\" (UID: \"da85eed8-63ff-4f04-aefa-13d60b8606f8\") " pod="openstack/heat-engine-5db85cdfc7-s8bqf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.968919 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-dns-svc\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.968957 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v42z\" (UniqueName: \"kubernetes.io/projected/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-kube-api-access-5v42z\") pod \"heat-api-5f5487f5bb-dv7gk\" (UID: \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\") " pod="openstack/heat-api-5f5487f5bb-dv7gk" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.968989 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-config-data\") pod \"heat-engine-5db85cdfc7-s8bqf\" (UID: \"da85eed8-63ff-4f04-aefa-13d60b8606f8\") " pod="openstack/heat-engine-5db85cdfc7-s8bqf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.969106 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v88lv\" (UniqueName: \"kubernetes.io/projected/d5f0edaa-22b4-4862-b0f1-c6dfef316566-kube-api-access-v88lv\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.969181 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-ovsdbserver-nb\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.969201 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-config\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.969324 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-config-data-custom\") pod \"heat-engine-5db85cdfc7-s8bqf\" (UID: \"da85eed8-63ff-4f04-aefa-13d60b8606f8\") " pod="openstack/heat-engine-5db85cdfc7-s8bqf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.969355 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-config-data\") pod \"heat-api-5f5487f5bb-dv7gk\" (UID: \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\") " pod="openstack/heat-api-5f5487f5bb-dv7gk" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.969384 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-combined-ca-bundle\") pod \"heat-api-5f5487f5bb-dv7gk\" (UID: \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\") " pod="openstack/heat-api-5f5487f5bb-dv7gk" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.969398 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-dns-swift-storage-0\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.969420 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl9m2\" (UniqueName: \"kubernetes.io/projected/da85eed8-63ff-4f04-aefa-13d60b8606f8-kube-api-access-tl9m2\") pod \"heat-engine-5db85cdfc7-s8bqf\" (UID: \"da85eed8-63ff-4f04-aefa-13d60b8606f8\") " pod="openstack/heat-engine-5db85cdfc7-s8bqf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.981975 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-config-data\") pod \"heat-engine-5db85cdfc7-s8bqf\" (UID: \"da85eed8-63ff-4f04-aefa-13d60b8606f8\") " pod="openstack/heat-engine-5db85cdfc7-s8bqf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.984319 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-combined-ca-bundle\") pod \"heat-engine-5db85cdfc7-s8bqf\" (UID: \"da85eed8-63ff-4f04-aefa-13d60b8606f8\") " pod="openstack/heat-engine-5db85cdfc7-s8bqf" Mar 17 00:45:38 crc kubenswrapper[4755]: I0317 00:45:38.993137 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-config-data-custom\") pod \"heat-engine-5db85cdfc7-s8bqf\" (UID: \"da85eed8-63ff-4f04-aefa-13d60b8606f8\") " pod="openstack/heat-engine-5db85cdfc7-s8bqf" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.003311 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl9m2\" (UniqueName: 
\"kubernetes.io/projected/da85eed8-63ff-4f04-aefa-13d60b8606f8-kube-api-access-tl9m2\") pod \"heat-engine-5db85cdfc7-s8bqf\" (UID: \"da85eed8-63ff-4f04-aefa-13d60b8606f8\") " pod="openstack/heat-engine-5db85cdfc7-s8bqf" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.020215 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5699988bfc-7hbpj"] Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.070778 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-ovsdbserver-nb\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.070816 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-config\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.070864 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-config-data\") pod \"heat-api-5f5487f5bb-dv7gk\" (UID: \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\") " pod="openstack/heat-api-5f5487f5bb-dv7gk" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.070884 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-config-data-custom\") pod \"heat-cfnapi-5699988bfc-7hbpj\" (UID: \"be10b2f6-c553-40d0-968d-e484111525bc\") " pod="openstack/heat-cfnapi-5699988bfc-7hbpj" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.070906 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-config-data\") pod \"heat-cfnapi-5699988bfc-7hbpj\" (UID: \"be10b2f6-c553-40d0-968d-e484111525bc\") " pod="openstack/heat-cfnapi-5699988bfc-7hbpj" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.070924 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-combined-ca-bundle\") pod \"heat-cfnapi-5699988bfc-7hbpj\" (UID: \"be10b2f6-c553-40d0-968d-e484111525bc\") " pod="openstack/heat-cfnapi-5699988bfc-7hbpj" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.070946 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-combined-ca-bundle\") pod \"heat-api-5f5487f5bb-dv7gk\" (UID: \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\") " pod="openstack/heat-api-5f5487f5bb-dv7gk" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.070960 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-dns-swift-storage-0\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.071008 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-config-data-custom\") pod \"heat-api-5f5487f5bb-dv7gk\" (UID: \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\") " pod="openstack/heat-api-5f5487f5bb-dv7gk" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.071024 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-ovsdbserver-sb\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.071050 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-dns-svc\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.071084 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v42z\" (UniqueName: \"kubernetes.io/projected/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-kube-api-access-5v42z\") pod \"heat-api-5f5487f5bb-dv7gk\" (UID: \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\") " pod="openstack/heat-api-5f5487f5bb-dv7gk" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.071121 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtcxg\" (UniqueName: \"kubernetes.io/projected/be10b2f6-c553-40d0-968d-e484111525bc-kube-api-access-xtcxg\") pod \"heat-cfnapi-5699988bfc-7hbpj\" (UID: \"be10b2f6-c553-40d0-968d-e484111525bc\") " pod="openstack/heat-cfnapi-5699988bfc-7hbpj" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.071153 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v88lv\" (UniqueName: \"kubernetes.io/projected/d5f0edaa-22b4-4862-b0f1-c6dfef316566-kube-api-access-v88lv\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.072096 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-ovsdbserver-nb\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.072589 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-config\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.073899 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-dns-swift-storage-0\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.074745 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-dns-svc\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.075519 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-ovsdbserver-sb\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.078132 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5db85cdfc7-s8bqf" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.083389 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-config-data-custom\") pod \"heat-api-5f5487f5bb-dv7gk\" (UID: \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\") " pod="openstack/heat-api-5f5487f5bb-dv7gk" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.093792 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-combined-ca-bundle\") pod \"heat-api-5f5487f5bb-dv7gk\" (UID: \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\") " pod="openstack/heat-api-5f5487f5bb-dv7gk" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.105062 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-config-data\") pod \"heat-api-5f5487f5bb-dv7gk\" (UID: \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\") " pod="openstack/heat-api-5f5487f5bb-dv7gk" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.125131 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v42z\" (UniqueName: \"kubernetes.io/projected/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-kube-api-access-5v42z\") pod \"heat-api-5f5487f5bb-dv7gk\" (UID: \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\") " pod="openstack/heat-api-5f5487f5bb-dv7gk" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.140397 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v88lv\" (UniqueName: \"kubernetes.io/projected/d5f0edaa-22b4-4862-b0f1-c6dfef316566-kube-api-access-v88lv\") pod \"dnsmasq-dns-f6bc4c6c9-xbllf\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:39 crc 
kubenswrapper[4755]: I0317 00:45:39.173796 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtcxg\" (UniqueName: \"kubernetes.io/projected/be10b2f6-c553-40d0-968d-e484111525bc-kube-api-access-xtcxg\") pod \"heat-cfnapi-5699988bfc-7hbpj\" (UID: \"be10b2f6-c553-40d0-968d-e484111525bc\") " pod="openstack/heat-cfnapi-5699988bfc-7hbpj" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.173904 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-config-data-custom\") pod \"heat-cfnapi-5699988bfc-7hbpj\" (UID: \"be10b2f6-c553-40d0-968d-e484111525bc\") " pod="openstack/heat-cfnapi-5699988bfc-7hbpj" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.173930 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-config-data\") pod \"heat-cfnapi-5699988bfc-7hbpj\" (UID: \"be10b2f6-c553-40d0-968d-e484111525bc\") " pod="openstack/heat-cfnapi-5699988bfc-7hbpj" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.173947 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-combined-ca-bundle\") pod \"heat-cfnapi-5699988bfc-7hbpj\" (UID: \"be10b2f6-c553-40d0-968d-e484111525bc\") " pod="openstack/heat-cfnapi-5699988bfc-7hbpj" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.179150 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-config-data\") pod \"heat-cfnapi-5699988bfc-7hbpj\" (UID: \"be10b2f6-c553-40d0-968d-e484111525bc\") " pod="openstack/heat-cfnapi-5699988bfc-7hbpj" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.198146 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-config-data-custom\") pod \"heat-cfnapi-5699988bfc-7hbpj\" (UID: \"be10b2f6-c553-40d0-968d-e484111525bc\") " pod="openstack/heat-cfnapi-5699988bfc-7hbpj" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.223296 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-combined-ca-bundle\") pod \"heat-cfnapi-5699988bfc-7hbpj\" (UID: \"be10b2f6-c553-40d0-968d-e484111525bc\") " pod="openstack/heat-cfnapi-5699988bfc-7hbpj" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.225336 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtcxg\" (UniqueName: \"kubernetes.io/projected/be10b2f6-c553-40d0-968d-e484111525bc-kube-api-access-xtcxg\") pod \"heat-cfnapi-5699988bfc-7hbpj\" (UID: \"be10b2f6-c553-40d0-968d-e484111525bc\") " pod="openstack/heat-cfnapi-5699988bfc-7hbpj" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.264327 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f5487f5bb-dv7gk" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.266134 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.318601 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5699988bfc-7hbpj" Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.853791 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5db85cdfc7-s8bqf"] Mar 17 00:45:39 crc kubenswrapper[4755]: I0317 00:45:39.975246 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.002517 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fhw68" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.070270 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5db85cdfc7-s8bqf" event={"ID":"da85eed8-63ff-4f04-aefa-13d60b8606f8","Type":"ContainerStarted","Data":"c1dc00644b048b6411c18bf7a24c75cace3c54618859771b3282fb6dfcbdabc2"} Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.074307 4755 generic.go:334] "Generic (PLEG): container finished" podID="820d6f40-e3d1-4675-868b-7432d4b65006" containerID="aa54add94c1142e3128938b34e38c3077c498a18264f5c560848b329539f5f7c" exitCode=0 Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.074357 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55ff8ddfc6-wbxxp" event={"ID":"820d6f40-e3d1-4675-868b-7432d4b65006","Type":"ContainerDied","Data":"aa54add94c1142e3128938b34e38c3077c498a18264f5c560848b329539f5f7c"} Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.074383 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55ff8ddfc6-wbxxp" event={"ID":"820d6f40-e3d1-4675-868b-7432d4b65006","Type":"ContainerDied","Data":"39d308770b5677998acf57d2d1aa89391d1f3815ef9ed058c2b52d43ef5eee3a"} Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.074399 4755 scope.go:117] "RemoveContainer" containerID="aa54add94c1142e3128938b34e38c3077c498a18264f5c560848b329539f5f7c" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.074744 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55ff8ddfc6-wbxxp" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.104826 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fhw68" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.109814 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-combined-ca-bundle\") pod \"820d6f40-e3d1-4675-868b-7432d4b65006\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.109898 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-config-data\") pod \"820d6f40-e3d1-4675-868b-7432d4b65006\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.110013 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820d6f40-e3d1-4675-868b-7432d4b65006-logs\") pod \"820d6f40-e3d1-4675-868b-7432d4b65006\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.110076 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9dlj\" (UniqueName: \"kubernetes.io/projected/820d6f40-e3d1-4675-868b-7432d4b65006-kube-api-access-l9dlj\") pod \"820d6f40-e3d1-4675-868b-7432d4b65006\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.110138 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-internal-tls-certs\") pod \"820d6f40-e3d1-4675-868b-7432d4b65006\" (UID: 
\"820d6f40-e3d1-4675-868b-7432d4b65006\") " Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.110162 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-public-tls-certs\") pod \"820d6f40-e3d1-4675-868b-7432d4b65006\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.110214 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-scripts\") pod \"820d6f40-e3d1-4675-868b-7432d4b65006\" (UID: \"820d6f40-e3d1-4675-868b-7432d4b65006\") " Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.112681 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/820d6f40-e3d1-4675-868b-7432d4b65006-logs" (OuterVolumeSpecName: "logs") pod "820d6f40-e3d1-4675-868b-7432d4b65006" (UID: "820d6f40-e3d1-4675-868b-7432d4b65006"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.120321 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-scripts" (OuterVolumeSpecName: "scripts") pod "820d6f40-e3d1-4675-868b-7432d4b65006" (UID: "820d6f40-e3d1-4675-868b-7432d4b65006"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.122250 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/820d6f40-e3d1-4675-868b-7432d4b65006-kube-api-access-l9dlj" (OuterVolumeSpecName: "kube-api-access-l9dlj") pod "820d6f40-e3d1-4675-868b-7432d4b65006" (UID: "820d6f40-e3d1-4675-868b-7432d4b65006"). InnerVolumeSpecName "kube-api-access-l9dlj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.148107 4755 scope.go:117] "RemoveContainer" containerID="e327396f236ac105997c2095af14ff7d7be015d0d51fdd91e9f57cffa85d2568" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.149515 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5f5487f5bb-dv7gk"] Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.158937 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-xbllf"] Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.192974 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5699988bfc-7hbpj"] Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.218334 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-config-data" (OuterVolumeSpecName: "config-data") pod "820d6f40-e3d1-4675-868b-7432d4b65006" (UID: "820d6f40-e3d1-4675-868b-7432d4b65006"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.221666 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9dlj\" (UniqueName: \"kubernetes.io/projected/820d6f40-e3d1-4675-868b-7432d4b65006-kube-api-access-l9dlj\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.221888 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.221996 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.222117 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820d6f40-e3d1-4675-868b-7432d4b65006-logs\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.253450 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "820d6f40-e3d1-4675-868b-7432d4b65006" (UID: "820d6f40-e3d1-4675-868b-7432d4b65006"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.323718 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.327251 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fhw68"] Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.352772 4755 scope.go:117] "RemoveContainer" containerID="aa54add94c1142e3128938b34e38c3077c498a18264f5c560848b329539f5f7c" Mar 17 00:45:40 crc kubenswrapper[4755]: E0317 00:45:40.354923 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa54add94c1142e3128938b34e38c3077c498a18264f5c560848b329539f5f7c\": container with ID starting with aa54add94c1142e3128938b34e38c3077c498a18264f5c560848b329539f5f7c not found: ID does not exist" containerID="aa54add94c1142e3128938b34e38c3077c498a18264f5c560848b329539f5f7c" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.354957 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa54add94c1142e3128938b34e38c3077c498a18264f5c560848b329539f5f7c"} err="failed to get container status \"aa54add94c1142e3128938b34e38c3077c498a18264f5c560848b329539f5f7c\": rpc error: code = NotFound desc = could not find container \"aa54add94c1142e3128938b34e38c3077c498a18264f5c560848b329539f5f7c\": container with ID starting with aa54add94c1142e3128938b34e38c3077c498a18264f5c560848b329539f5f7c not found: ID does not exist" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.354978 4755 scope.go:117] "RemoveContainer" containerID="e327396f236ac105997c2095af14ff7d7be015d0d51fdd91e9f57cffa85d2568" Mar 17 00:45:40 crc kubenswrapper[4755]: E0317 00:45:40.355304 4755 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e327396f236ac105997c2095af14ff7d7be015d0d51fdd91e9f57cffa85d2568\": container with ID starting with e327396f236ac105997c2095af14ff7d7be015d0d51fdd91e9f57cffa85d2568 not found: ID does not exist" containerID="e327396f236ac105997c2095af14ff7d7be015d0d51fdd91e9f57cffa85d2568" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.355326 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e327396f236ac105997c2095af14ff7d7be015d0d51fdd91e9f57cffa85d2568"} err="failed to get container status \"e327396f236ac105997c2095af14ff7d7be015d0d51fdd91e9f57cffa85d2568\": rpc error: code = NotFound desc = could not find container \"e327396f236ac105997c2095af14ff7d7be015d0d51fdd91e9f57cffa85d2568\": container with ID starting with e327396f236ac105997c2095af14ff7d7be015d0d51fdd91e9f57cffa85d2568 not found: ID does not exist" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.385144 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "820d6f40-e3d1-4675-868b-7432d4b65006" (UID: "820d6f40-e3d1-4675-868b-7432d4b65006"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.402134 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "820d6f40-e3d1-4675-868b-7432d4b65006" (UID: "820d6f40-e3d1-4675-868b-7432d4b65006"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.426535 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.427142 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/820d6f40-e3d1-4675-868b-7432d4b65006-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.722546 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-55ff8ddfc6-wbxxp"] Mar 17 00:45:40 crc kubenswrapper[4755]: I0317 00:45:40.736706 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-55ff8ddfc6-wbxxp"] Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.090530 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5db85cdfc7-s8bqf" event={"ID":"da85eed8-63ff-4f04-aefa-13d60b8606f8","Type":"ContainerStarted","Data":"6c857196a128f5af16c439a2ae363c2d11e9cc036dd164bc1724821a0c2bcc38"} Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.091673 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5db85cdfc7-s8bqf" Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.095395 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5699988bfc-7hbpj" event={"ID":"be10b2f6-c553-40d0-968d-e484111525bc","Type":"ContainerStarted","Data":"9e8e480f84d190f254ec33ca42af899088c440c8896d36a55bedf6b0a2aa3f56"} Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.097923 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f5487f5bb-dv7gk" 
event={"ID":"9033d600-fe8b-42b1-ac25-2f683c0b0f5e","Type":"ContainerStarted","Data":"a43f9f5a9b2afa2a4c9ea950a6f3c5fb670210d881954d5cf5db3e4d99d5da93"} Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.099213 4755 generic.go:334] "Generic (PLEG): container finished" podID="d5f0edaa-22b4-4862-b0f1-c6dfef316566" containerID="1e92d1fbb3bda1b51b3545486ad4f30c2727965793f6fa5dcc038007fa168346" exitCode=0 Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.099368 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fhw68" podUID="7a97ba97-5bce-4644-bc0c-6ab3733dc9ef" containerName="registry-server" containerID="cri-o://9f19c5669b155ffc068ac6db0febef29feabf0857efbe196c2ff8c44f258a229" gracePeriod=2 Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.099367 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" event={"ID":"d5f0edaa-22b4-4862-b0f1-c6dfef316566","Type":"ContainerDied","Data":"1e92d1fbb3bda1b51b3545486ad4f30c2727965793f6fa5dcc038007fa168346"} Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.099636 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" event={"ID":"d5f0edaa-22b4-4862-b0f1-c6dfef316566","Type":"ContainerStarted","Data":"4b6701facf5b2f39da611c94899fda873e75eba69937bee35011d3614eaa123f"} Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.136243 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5db85cdfc7-s8bqf" podStartSLOduration=3.136222449 podStartE2EDuration="3.136222449s" podCreationTimestamp="2026-03-17 00:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:41.114768954 +0000 UTC m=+1415.874221247" watchObservedRunningTime="2026-03-17 00:45:41.136222449 +0000 UTC m=+1415.895674732" Mar 17 00:45:41 crc 
kubenswrapper[4755]: I0317 00:45:41.572797 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.594522 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fhw68" Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.764149 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-catalog-content\") pod \"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef\" (UID: \"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef\") " Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.764238 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-utilities\") pod \"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef\" (UID: \"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef\") " Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.764334 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvzb6\" (UniqueName: \"kubernetes.io/projected/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-kube-api-access-pvzb6\") pod \"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef\" (UID: \"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef\") " Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.765421 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-utilities" (OuterVolumeSpecName: "utilities") pod "7a97ba97-5bce-4644-bc0c-6ab3733dc9ef" (UID: "7a97ba97-5bce-4644-bc0c-6ab3733dc9ef"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.774600 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-kube-api-access-pvzb6" (OuterVolumeSpecName: "kube-api-access-pvzb6") pod "7a97ba97-5bce-4644-bc0c-6ab3733dc9ef" (UID: "7a97ba97-5bce-4644-bc0c-6ab3733dc9ef"). InnerVolumeSpecName "kube-api-access-pvzb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.868685 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.868721 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvzb6\" (UniqueName: \"kubernetes.io/projected/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-kube-api-access-pvzb6\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.920370 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a97ba97-5bce-4644-bc0c-6ab3733dc9ef" (UID: "7a97ba97-5bce-4644-bc0c-6ab3733dc9ef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:45:41 crc kubenswrapper[4755]: I0317 00:45:41.973752 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.053471 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5b9b5bb667-6pk7q"] Mar 17 00:45:42 crc kubenswrapper[4755]: E0317 00:45:42.053870 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a97ba97-5bce-4644-bc0c-6ab3733dc9ef" containerName="extract-content" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.053887 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a97ba97-5bce-4644-bc0c-6ab3733dc9ef" containerName="extract-content" Mar 17 00:45:42 crc kubenswrapper[4755]: E0317 00:45:42.053907 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820d6f40-e3d1-4675-868b-7432d4b65006" containerName="placement-api" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.053914 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="820d6f40-e3d1-4675-868b-7432d4b65006" containerName="placement-api" Mar 17 00:45:42 crc kubenswrapper[4755]: E0317 00:45:42.053922 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a97ba97-5bce-4644-bc0c-6ab3733dc9ef" containerName="extract-utilities" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.053928 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a97ba97-5bce-4644-bc0c-6ab3733dc9ef" containerName="extract-utilities" Mar 17 00:45:42 crc kubenswrapper[4755]: E0317 00:45:42.053945 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820d6f40-e3d1-4675-868b-7432d4b65006" containerName="placement-log" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.053950 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="820d6f40-e3d1-4675-868b-7432d4b65006" containerName="placement-log" Mar 17 00:45:42 crc kubenswrapper[4755]: E0317 00:45:42.053962 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a97ba97-5bce-4644-bc0c-6ab3733dc9ef" containerName="registry-server" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.053968 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a97ba97-5bce-4644-bc0c-6ab3733dc9ef" containerName="registry-server" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.054149 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="820d6f40-e3d1-4675-868b-7432d4b65006" containerName="placement-api" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.054170 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="820d6f40-e3d1-4675-868b-7432d4b65006" containerName="placement-log" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.054183 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a97ba97-5bce-4644-bc0c-6ab3733dc9ef" containerName="registry-server" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.055188 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.096380 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.097293 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfa93106-8e0c-4e7d-93cf-33d06c85d883-public-tls-certs\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.097316 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cfa93106-8e0c-4e7d-93cf-33d06c85d883-etc-swift\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.097349 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfa93106-8e0c-4e7d-93cf-33d06c85d883-log-httpd\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.097377 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfa93106-8e0c-4e7d-93cf-33d06c85d883-internal-tls-certs\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.097422 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfa93106-8e0c-4e7d-93cf-33d06c85d883-config-data\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.097792 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfa93106-8e0c-4e7d-93cf-33d06c85d883-run-httpd\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.097814 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfa93106-8e0c-4e7d-93cf-33d06c85d883-combined-ca-bundle\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.097844 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn6fn\" (UniqueName: \"kubernetes.io/projected/cfa93106-8e0c-4e7d-93cf-33d06c85d883-kube-api-access-qn6fn\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.099797 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.099805 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.121210 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5b9b5bb667-6pk7q"] Mar 17 
00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.156014 4755 generic.go:334] "Generic (PLEG): container finished" podID="7a97ba97-5bce-4644-bc0c-6ab3733dc9ef" containerID="9f19c5669b155ffc068ac6db0febef29feabf0857efbe196c2ff8c44f258a229" exitCode=0 Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.156087 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhw68" event={"ID":"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef","Type":"ContainerDied","Data":"9f19c5669b155ffc068ac6db0febef29feabf0857efbe196c2ff8c44f258a229"} Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.156116 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhw68" event={"ID":"7a97ba97-5bce-4644-bc0c-6ab3733dc9ef","Type":"ContainerDied","Data":"8e0027a9d55a61cd50ed0bcb816fb0826d1a6a26c663c424738fe92396f171c8"} Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.156134 4755 scope.go:117] "RemoveContainer" containerID="9f19c5669b155ffc068ac6db0febef29feabf0857efbe196c2ff8c44f258a229" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.156243 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fhw68" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.175254 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" event={"ID":"d5f0edaa-22b4-4862-b0f1-c6dfef316566","Type":"ContainerStarted","Data":"8fe1d5ee36c2a06385b80d9bf9651844f6e9b89d7d1e7b2a4ca14e109d480076"} Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.175609 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.200725 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfa93106-8e0c-4e7d-93cf-33d06c85d883-run-httpd\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.200793 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfa93106-8e0c-4e7d-93cf-33d06c85d883-combined-ca-bundle\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.203652 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfa93106-8e0c-4e7d-93cf-33d06c85d883-run-httpd\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.200892 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn6fn\" (UniqueName: \"kubernetes.io/projected/cfa93106-8e0c-4e7d-93cf-33d06c85d883-kube-api-access-qn6fn\") pod 
\"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.204542 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfa93106-8e0c-4e7d-93cf-33d06c85d883-public-tls-certs\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.204599 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cfa93106-8e0c-4e7d-93cf-33d06c85d883-etc-swift\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.204687 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfa93106-8e0c-4e7d-93cf-33d06c85d883-log-httpd\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.205003 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfa93106-8e0c-4e7d-93cf-33d06c85d883-internal-tls-certs\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.205245 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfa93106-8e0c-4e7d-93cf-33d06c85d883-config-data\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: 
\"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.207345 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfa93106-8e0c-4e7d-93cf-33d06c85d883-combined-ca-bundle\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.215235 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" podStartSLOduration=4.215213476 podStartE2EDuration="4.215213476s" podCreationTimestamp="2026-03-17 00:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:42.195837787 +0000 UTC m=+1416.955290080" watchObservedRunningTime="2026-03-17 00:45:42.215213476 +0000 UTC m=+1416.974665759" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.229514 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfa93106-8e0c-4e7d-93cf-33d06c85d883-config-data\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.229775 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfa93106-8e0c-4e7d-93cf-33d06c85d883-log-httpd\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.231554 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cfa93106-8e0c-4e7d-93cf-33d06c85d883-public-tls-certs\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.232705 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cfa93106-8e0c-4e7d-93cf-33d06c85d883-etc-swift\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.236599 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn6fn\" (UniqueName: \"kubernetes.io/projected/cfa93106-8e0c-4e7d-93cf-33d06c85d883-kube-api-access-qn6fn\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.239570 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfa93106-8e0c-4e7d-93cf-33d06c85d883-internal-tls-certs\") pod \"swift-proxy-5b9b5bb667-6pk7q\" (UID: \"cfa93106-8e0c-4e7d-93cf-33d06c85d883\") " pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.287194 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="820d6f40-e3d1-4675-868b-7432d4b65006" path="/var/lib/kubelet/pods/820d6f40-e3d1-4675-868b-7432d4b65006/volumes" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.287920 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fhw68"] Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.287949 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fhw68"] Mar 17 00:45:42 crc kubenswrapper[4755]: 
I0317 00:45:42.336399 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.336900 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="proxy-httpd" containerID="cri-o://996ccff017a57ca1a61f01c25b387fb7c748546bd9b1f09ee27bbd67f891059d" gracePeriod=30 Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.337253 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="ceilometer-notification-agent" containerID="cri-o://f09df42c3e9158b4805e5413425d2afb2f2a186ef2a6ef9c9ebe75fccd1b4ff4" gracePeriod=30 Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.337288 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="sg-core" containerID="cri-o://0de817ec17d73d7db883c458fc819fb37e4e6d4057fd5e9e184e22c259668d91" gracePeriod=30 Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.336828 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="ceilometer-central-agent" containerID="cri-o://c89ae1d05b84421fc636ee9a8f47ebb5072db9c7d5af89e80dcd24cbb26e9023" gracePeriod=30 Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.349417 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.197:3000/\": EOF" Mar 17 00:45:42 crc kubenswrapper[4755]: I0317 00:45:42.406775 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:45:43 crc kubenswrapper[4755]: I0317 00:45:43.190779 4755 generic.go:334] "Generic (PLEG): container finished" podID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerID="996ccff017a57ca1a61f01c25b387fb7c748546bd9b1f09ee27bbd67f891059d" exitCode=0 Mar 17 00:45:43 crc kubenswrapper[4755]: I0317 00:45:43.191092 4755 generic.go:334] "Generic (PLEG): container finished" podID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerID="0de817ec17d73d7db883c458fc819fb37e4e6d4057fd5e9e184e22c259668d91" exitCode=2 Mar 17 00:45:43 crc kubenswrapper[4755]: I0317 00:45:43.191104 4755 generic.go:334] "Generic (PLEG): container finished" podID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerID="c89ae1d05b84421fc636ee9a8f47ebb5072db9c7d5af89e80dcd24cbb26e9023" exitCode=0 Mar 17 00:45:43 crc kubenswrapper[4755]: I0317 00:45:43.190854 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b9fe8c0-56e7-487a-b68a-905789266b31","Type":"ContainerDied","Data":"996ccff017a57ca1a61f01c25b387fb7c748546bd9b1f09ee27bbd67f891059d"} Mar 17 00:45:43 crc kubenswrapper[4755]: I0317 00:45:43.191227 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b9fe8c0-56e7-487a-b68a-905789266b31","Type":"ContainerDied","Data":"0de817ec17d73d7db883c458fc819fb37e4e6d4057fd5e9e184e22c259668d91"} Mar 17 00:45:43 crc kubenswrapper[4755]: I0317 00:45:43.191247 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b9fe8c0-56e7-487a-b68a-905789266b31","Type":"ContainerDied","Data":"c89ae1d05b84421fc636ee9a8f47ebb5072db9c7d5af89e80dcd24cbb26e9023"} Mar 17 00:45:44 crc kubenswrapper[4755]: I0317 00:45:44.203457 4755 generic.go:334] "Generic (PLEG): container finished" podID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerID="f09df42c3e9158b4805e5413425d2afb2f2a186ef2a6ef9c9ebe75fccd1b4ff4" exitCode=0 Mar 17 00:45:44 
crc kubenswrapper[4755]: I0317 00:45:44.203594 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b9fe8c0-56e7-487a-b68a-905789266b31","Type":"ContainerDied","Data":"f09df42c3e9158b4805e5413425d2afb2f2a186ef2a6ef9c9ebe75fccd1b4ff4"} Mar 17 00:45:44 crc kubenswrapper[4755]: I0317 00:45:44.265162 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a97ba97-5bce-4644-bc0c-6ab3733dc9ef" path="/var/lib/kubelet/pods/7a97ba97-5bce-4644-bc0c-6ab3733dc9ef/volumes" Mar 17 00:45:44 crc kubenswrapper[4755]: I0317 00:45:44.350549 4755 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod2315f493-9035-4185-b615-e7eed6a246ea"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod2315f493-9035-4185-b615-e7eed6a246ea] : Timed out while waiting for systemd to remove kubepods-besteffort-pod2315f493_9035_4185_b615_e7eed6a246ea.slice" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.463490 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6dc587f546-7lvlg"] Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.465039 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.475226 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6dc587f546-7lvlg"] Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.485210 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-644dcb55b6-q7jd4"] Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.486648 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.501027 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7cf59db8d9-lkrks"] Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.502357 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.524205 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7cf59db8d9-lkrks"] Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.552529 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-644dcb55b6-q7jd4"] Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.596678 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-config-data\") pod \"heat-api-6dc587f546-7lvlg\" (UID: \"e062269a-a4d6-43b8-b065-6d1694b386f8\") " pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.596746 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfzk5\" (UniqueName: \"kubernetes.io/projected/e062269a-a4d6-43b8-b065-6d1694b386f8-kube-api-access-tfzk5\") pod \"heat-api-6dc587f546-7lvlg\" (UID: \"e062269a-a4d6-43b8-b065-6d1694b386f8\") " pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.596799 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-config-data-custom\") pod \"heat-engine-644dcb55b6-q7jd4\" (UID: \"180ff5f7-b121-458f-b938-d06977e1f610\") " pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 00:45:45 crc 
kubenswrapper[4755]: I0317 00:45:45.596853 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-config-data\") pod \"heat-cfnapi-7cf59db8d9-lkrks\" (UID: \"6b7b092f-96bf-466c-a3ef-1867c502bb21\") " pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.597016 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-combined-ca-bundle\") pod \"heat-engine-644dcb55b6-q7jd4\" (UID: \"180ff5f7-b121-458f-b938-d06977e1f610\") " pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.597166 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-config-data\") pod \"heat-engine-644dcb55b6-q7jd4\" (UID: \"180ff5f7-b121-458f-b938-d06977e1f610\") " pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.597214 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz5hr\" (UniqueName: \"kubernetes.io/projected/180ff5f7-b121-458f-b938-d06977e1f610-kube-api-access-gz5hr\") pod \"heat-engine-644dcb55b6-q7jd4\" (UID: \"180ff5f7-b121-458f-b938-d06977e1f610\") " pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.597297 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-combined-ca-bundle\") pod \"heat-cfnapi-7cf59db8d9-lkrks\" (UID: \"6b7b092f-96bf-466c-a3ef-1867c502bb21\") " 
pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.597323 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-combined-ca-bundle\") pod \"heat-api-6dc587f546-7lvlg\" (UID: \"e062269a-a4d6-43b8-b065-6d1694b386f8\") " pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.597375 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddskj\" (UniqueName: \"kubernetes.io/projected/6b7b092f-96bf-466c-a3ef-1867c502bb21-kube-api-access-ddskj\") pod \"heat-cfnapi-7cf59db8d9-lkrks\" (UID: \"6b7b092f-96bf-466c-a3ef-1867c502bb21\") " pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.597430 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-config-data-custom\") pod \"heat-api-6dc587f546-7lvlg\" (UID: \"e062269a-a4d6-43b8-b065-6d1694b386f8\") " pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.597494 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-config-data-custom\") pod \"heat-cfnapi-7cf59db8d9-lkrks\" (UID: \"6b7b092f-96bf-466c-a3ef-1867c502bb21\") " pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.699347 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-combined-ca-bundle\") pod \"heat-cfnapi-7cf59db8d9-lkrks\" (UID: 
\"6b7b092f-96bf-466c-a3ef-1867c502bb21\") " pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.699396 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-combined-ca-bundle\") pod \"heat-api-6dc587f546-7lvlg\" (UID: \"e062269a-a4d6-43b8-b065-6d1694b386f8\") " pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.699425 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddskj\" (UniqueName: \"kubernetes.io/projected/6b7b092f-96bf-466c-a3ef-1867c502bb21-kube-api-access-ddskj\") pod \"heat-cfnapi-7cf59db8d9-lkrks\" (UID: \"6b7b092f-96bf-466c-a3ef-1867c502bb21\") " pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.699473 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-config-data-custom\") pod \"heat-api-6dc587f546-7lvlg\" (UID: \"e062269a-a4d6-43b8-b065-6d1694b386f8\") " pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.699505 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-config-data-custom\") pod \"heat-cfnapi-7cf59db8d9-lkrks\" (UID: \"6b7b092f-96bf-466c-a3ef-1867c502bb21\") " pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.699547 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-config-data\") pod \"heat-api-6dc587f546-7lvlg\" (UID: \"e062269a-a4d6-43b8-b065-6d1694b386f8\") " 
pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.699591 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfzk5\" (UniqueName: \"kubernetes.io/projected/e062269a-a4d6-43b8-b065-6d1694b386f8-kube-api-access-tfzk5\") pod \"heat-api-6dc587f546-7lvlg\" (UID: \"e062269a-a4d6-43b8-b065-6d1694b386f8\") " pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.699613 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-config-data-custom\") pod \"heat-engine-644dcb55b6-q7jd4\" (UID: \"180ff5f7-b121-458f-b938-d06977e1f610\") " pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.699650 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-config-data\") pod \"heat-cfnapi-7cf59db8d9-lkrks\" (UID: \"6b7b092f-96bf-466c-a3ef-1867c502bb21\") " pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.699689 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-combined-ca-bundle\") pod \"heat-engine-644dcb55b6-q7jd4\" (UID: \"180ff5f7-b121-458f-b938-d06977e1f610\") " pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.699740 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-config-data\") pod \"heat-engine-644dcb55b6-q7jd4\" (UID: \"180ff5f7-b121-458f-b938-d06977e1f610\") " pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 
00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.699760 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz5hr\" (UniqueName: \"kubernetes.io/projected/180ff5f7-b121-458f-b938-d06977e1f610-kube-api-access-gz5hr\") pod \"heat-engine-644dcb55b6-q7jd4\" (UID: \"180ff5f7-b121-458f-b938-d06977e1f610\") " pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.708086 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-config-data-custom\") pod \"heat-api-6dc587f546-7lvlg\" (UID: \"e062269a-a4d6-43b8-b065-6d1694b386f8\") " pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.708955 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-config-data-custom\") pod \"heat-engine-644dcb55b6-q7jd4\" (UID: \"180ff5f7-b121-458f-b938-d06977e1f610\") " pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.709117 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-config-data\") pod \"heat-cfnapi-7cf59db8d9-lkrks\" (UID: \"6b7b092f-96bf-466c-a3ef-1867c502bb21\") " pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.709318 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-config-data\") pod \"heat-engine-644dcb55b6-q7jd4\" (UID: \"180ff5f7-b121-458f-b938-d06977e1f610\") " pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.711028 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-combined-ca-bundle\") pod \"heat-cfnapi-7cf59db8d9-lkrks\" (UID: \"6b7b092f-96bf-466c-a3ef-1867c502bb21\") " pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.711800 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-config-data\") pod \"heat-api-6dc587f546-7lvlg\" (UID: \"e062269a-a4d6-43b8-b065-6d1694b386f8\") " pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.717773 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-combined-ca-bundle\") pod \"heat-engine-644dcb55b6-q7jd4\" (UID: \"180ff5f7-b121-458f-b938-d06977e1f610\") " pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.718755 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-combined-ca-bundle\") pod \"heat-api-6dc587f546-7lvlg\" (UID: \"e062269a-a4d6-43b8-b065-6d1694b386f8\") " pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.720882 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-config-data-custom\") pod \"heat-cfnapi-7cf59db8d9-lkrks\" (UID: \"6b7b092f-96bf-466c-a3ef-1867c502bb21\") " pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.726920 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz5hr\" 
(UniqueName: \"kubernetes.io/projected/180ff5f7-b121-458f-b938-d06977e1f610-kube-api-access-gz5hr\") pod \"heat-engine-644dcb55b6-q7jd4\" (UID: \"180ff5f7-b121-458f-b938-d06977e1f610\") " pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.727056 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfzk5\" (UniqueName: \"kubernetes.io/projected/e062269a-a4d6-43b8-b065-6d1694b386f8-kube-api-access-tfzk5\") pod \"heat-api-6dc587f546-7lvlg\" (UID: \"e062269a-a4d6-43b8-b065-6d1694b386f8\") " pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.731955 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddskj\" (UniqueName: \"kubernetes.io/projected/6b7b092f-96bf-466c-a3ef-1867c502bb21-kube-api-access-ddskj\") pod \"heat-cfnapi-7cf59db8d9-lkrks\" (UID: \"6b7b092f-96bf-466c-a3ef-1867c502bb21\") " pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.785907 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.801176 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 00:45:45 crc kubenswrapper[4755]: I0317 00:45:45.828489 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.155883 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-hrkzh"] Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.157303 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hrkzh" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.171166 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hrkzh"] Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.209287 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3a41a76-23dc-404d-b304-19ad4221ce3d-operator-scripts\") pod \"nova-api-db-create-hrkzh\" (UID: \"c3a41a76-23dc-404d-b304-19ad4221ce3d\") " pod="openstack/nova-api-db-create-hrkzh" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.209494 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv48d\" (UniqueName: \"kubernetes.io/projected/c3a41a76-23dc-404d-b304-19ad4221ce3d-kube-api-access-fv48d\") pod \"nova-api-db-create-hrkzh\" (UID: \"c3a41a76-23dc-404d-b304-19ad4221ce3d\") " pod="openstack/nova-api-db-create-hrkzh" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.269470 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-h9477"] Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.270841 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-h9477" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.292904 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-h9477"] Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.311003 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv48d\" (UniqueName: \"kubernetes.io/projected/c3a41a76-23dc-404d-b304-19ad4221ce3d-kube-api-access-fv48d\") pod \"nova-api-db-create-hrkzh\" (UID: \"c3a41a76-23dc-404d-b304-19ad4221ce3d\") " pod="openstack/nova-api-db-create-hrkzh" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.311067 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3a41a76-23dc-404d-b304-19ad4221ce3d-operator-scripts\") pod \"nova-api-db-create-hrkzh\" (UID: \"c3a41a76-23dc-404d-b304-19ad4221ce3d\") " pod="openstack/nova-api-db-create-hrkzh" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.311147 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c5ef244-d2dc-4cc0-bc15-8f542eb8a586-operator-scripts\") pod \"nova-cell0-db-create-h9477\" (UID: \"2c5ef244-d2dc-4cc0-bc15-8f542eb8a586\") " pod="openstack/nova-cell0-db-create-h9477" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.313139 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hhrj\" (UniqueName: \"kubernetes.io/projected/2c5ef244-d2dc-4cc0-bc15-8f542eb8a586-kube-api-access-9hhrj\") pod \"nova-cell0-db-create-h9477\" (UID: \"2c5ef244-d2dc-4cc0-bc15-8f542eb8a586\") " pod="openstack/nova-cell0-db-create-h9477" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.314056 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/c3a41a76-23dc-404d-b304-19ad4221ce3d-operator-scripts\") pod \"nova-api-db-create-hrkzh\" (UID: \"c3a41a76-23dc-404d-b304-19ad4221ce3d\") " pod="openstack/nova-api-db-create-hrkzh" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.330180 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv48d\" (UniqueName: \"kubernetes.io/projected/c3a41a76-23dc-404d-b304-19ad4221ce3d-kube-api-access-fv48d\") pod \"nova-api-db-create-hrkzh\" (UID: \"c3a41a76-23dc-404d-b304-19ad4221ce3d\") " pod="openstack/nova-api-db-create-hrkzh" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.370328 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-8bgbn"] Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.371857 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8bgbn" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.399716 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8bgbn"] Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.408422 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-571c-account-create-update-vrrc6"] Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.409998 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-571c-account-create-update-vrrc6" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.413011 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.414582 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c5ef244-d2dc-4cc0-bc15-8f542eb8a586-operator-scripts\") pod \"nova-cell0-db-create-h9477\" (UID: \"2c5ef244-d2dc-4cc0-bc15-8f542eb8a586\") " pod="openstack/nova-cell0-db-create-h9477" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.415230 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c5ef244-d2dc-4cc0-bc15-8f542eb8a586-operator-scripts\") pod \"nova-cell0-db-create-h9477\" (UID: \"2c5ef244-d2dc-4cc0-bc15-8f542eb8a586\") " pod="openstack/nova-cell0-db-create-h9477" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.415395 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9lff\" (UniqueName: \"kubernetes.io/projected/3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6-kube-api-access-c9lff\") pod \"nova-cell1-db-create-8bgbn\" (UID: \"3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6\") " pod="openstack/nova-cell1-db-create-8bgbn" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.415542 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hhrj\" (UniqueName: \"kubernetes.io/projected/2c5ef244-d2dc-4cc0-bc15-8f542eb8a586-kube-api-access-9hhrj\") pod \"nova-cell0-db-create-h9477\" (UID: \"2c5ef244-d2dc-4cc0-bc15-8f542eb8a586\") " pod="openstack/nova-cell0-db-create-h9477" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.415628 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6-operator-scripts\") pod \"nova-cell1-db-create-8bgbn\" (UID: \"3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6\") " pod="openstack/nova-cell1-db-create-8bgbn" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.441275 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-571c-account-create-update-vrrc6"] Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.450960 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hhrj\" (UniqueName: \"kubernetes.io/projected/2c5ef244-d2dc-4cc0-bc15-8f542eb8a586-kube-api-access-9hhrj\") pod \"nova-cell0-db-create-h9477\" (UID: \"2c5ef244-d2dc-4cc0-bc15-8f542eb8a586\") " pod="openstack/nova-cell0-db-create-h9477" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.482303 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hrkzh" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.516940 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6-operator-scripts\") pod \"nova-cell1-db-create-8bgbn\" (UID: \"3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6\") " pod="openstack/nova-cell1-db-create-8bgbn" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.517210 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk25f\" (UniqueName: \"kubernetes.io/projected/3811ee16-75c3-4ca3-9648-d3c9d5f8b028-kube-api-access-qk25f\") pod \"nova-api-571c-account-create-update-vrrc6\" (UID: \"3811ee16-75c3-4ca3-9648-d3c9d5f8b028\") " pod="openstack/nova-api-571c-account-create-update-vrrc6" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.517327 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9lff\" 
(UniqueName: \"kubernetes.io/projected/3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6-kube-api-access-c9lff\") pod \"nova-cell1-db-create-8bgbn\" (UID: \"3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6\") " pod="openstack/nova-cell1-db-create-8bgbn" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.517377 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3811ee16-75c3-4ca3-9648-d3c9d5f8b028-operator-scripts\") pod \"nova-api-571c-account-create-update-vrrc6\" (UID: \"3811ee16-75c3-4ca3-9648-d3c9d5f8b028\") " pod="openstack/nova-api-571c-account-create-update-vrrc6" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.518357 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6-operator-scripts\") pod \"nova-cell1-db-create-8bgbn\" (UID: \"3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6\") " pod="openstack/nova-cell1-db-create-8bgbn" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.541207 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9lff\" (UniqueName: \"kubernetes.io/projected/3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6-kube-api-access-c9lff\") pod \"nova-cell1-db-create-8bgbn\" (UID: \"3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6\") " pod="openstack/nova-cell1-db-create-8bgbn" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.575520 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-3bde-account-create-update-4xt5d"] Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.576737 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3bde-account-create-update-4xt5d" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.578945 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.593739 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3bde-account-create-update-4xt5d"] Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.594168 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-h9477" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.618702 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f802ad4b-3ce3-451c-a5a3-99ca8a050644-operator-scripts\") pod \"nova-cell0-3bde-account-create-update-4xt5d\" (UID: \"f802ad4b-3ce3-451c-a5a3-99ca8a050644\") " pod="openstack/nova-cell0-3bde-account-create-update-4xt5d" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.618812 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6m4z\" (UniqueName: \"kubernetes.io/projected/f802ad4b-3ce3-451c-a5a3-99ca8a050644-kube-api-access-k6m4z\") pod \"nova-cell0-3bde-account-create-update-4xt5d\" (UID: \"f802ad4b-3ce3-451c-a5a3-99ca8a050644\") " pod="openstack/nova-cell0-3bde-account-create-update-4xt5d" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.618861 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3811ee16-75c3-4ca3-9648-d3c9d5f8b028-operator-scripts\") pod \"nova-api-571c-account-create-update-vrrc6\" (UID: \"3811ee16-75c3-4ca3-9648-d3c9d5f8b028\") " pod="openstack/nova-api-571c-account-create-update-vrrc6" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.618986 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk25f\" (UniqueName: \"kubernetes.io/projected/3811ee16-75c3-4ca3-9648-d3c9d5f8b028-kube-api-access-qk25f\") pod \"nova-api-571c-account-create-update-vrrc6\" (UID: \"3811ee16-75c3-4ca3-9648-d3c9d5f8b028\") " pod="openstack/nova-api-571c-account-create-update-vrrc6" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.619835 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3811ee16-75c3-4ca3-9648-d3c9d5f8b028-operator-scripts\") pod \"nova-api-571c-account-create-update-vrrc6\" (UID: \"3811ee16-75c3-4ca3-9648-d3c9d5f8b028\") " pod="openstack/nova-api-571c-account-create-update-vrrc6" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.641031 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk25f\" (UniqueName: \"kubernetes.io/projected/3811ee16-75c3-4ca3-9648-d3c9d5f8b028-kube-api-access-qk25f\") pod \"nova-api-571c-account-create-update-vrrc6\" (UID: \"3811ee16-75c3-4ca3-9648-d3c9d5f8b028\") " pod="openstack/nova-api-571c-account-create-update-vrrc6" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.704721 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8bgbn" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.721032 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f802ad4b-3ce3-451c-a5a3-99ca8a050644-operator-scripts\") pod \"nova-cell0-3bde-account-create-update-4xt5d\" (UID: \"f802ad4b-3ce3-451c-a5a3-99ca8a050644\") " pod="openstack/nova-cell0-3bde-account-create-update-4xt5d" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.721116 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6m4z\" (UniqueName: \"kubernetes.io/projected/f802ad4b-3ce3-451c-a5a3-99ca8a050644-kube-api-access-k6m4z\") pod \"nova-cell0-3bde-account-create-update-4xt5d\" (UID: \"f802ad4b-3ce3-451c-a5a3-99ca8a050644\") " pod="openstack/nova-cell0-3bde-account-create-update-4xt5d" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.722099 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f802ad4b-3ce3-451c-a5a3-99ca8a050644-operator-scripts\") pod \"nova-cell0-3bde-account-create-update-4xt5d\" (UID: \"f802ad4b-3ce3-451c-a5a3-99ca8a050644\") " pod="openstack/nova-cell0-3bde-account-create-update-4xt5d" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.724500 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-571c-account-create-update-vrrc6" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.735773 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6m4z\" (UniqueName: \"kubernetes.io/projected/f802ad4b-3ce3-451c-a5a3-99ca8a050644-kube-api-access-k6m4z\") pod \"nova-cell0-3bde-account-create-update-4xt5d\" (UID: \"f802ad4b-3ce3-451c-a5a3-99ca8a050644\") " pod="openstack/nova-cell0-3bde-account-create-update-4xt5d" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.764582 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-16ab-account-create-update-jd78k"] Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.765971 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-16ab-account-create-update-jd78k" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.769762 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.796783 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-16ab-account-create-update-jd78k"] Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.822520 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svxkb\" (UniqueName: \"kubernetes.io/projected/2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e-kube-api-access-svxkb\") pod \"nova-cell1-16ab-account-create-update-jd78k\" (UID: \"2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e\") " pod="openstack/nova-cell1-16ab-account-create-update-jd78k" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.822592 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e-operator-scripts\") pod 
\"nova-cell1-16ab-account-create-update-jd78k\" (UID: \"2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e\") " pod="openstack/nova-cell1-16ab-account-create-update-jd78k" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.915936 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3bde-account-create-update-4xt5d" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.925648 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svxkb\" (UniqueName: \"kubernetes.io/projected/2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e-kube-api-access-svxkb\") pod \"nova-cell1-16ab-account-create-update-jd78k\" (UID: \"2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e\") " pod="openstack/nova-cell1-16ab-account-create-update-jd78k" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.925713 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e-operator-scripts\") pod \"nova-cell1-16ab-account-create-update-jd78k\" (UID: \"2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e\") " pod="openstack/nova-cell1-16ab-account-create-update-jd78k" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.926510 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e-operator-scripts\") pod \"nova-cell1-16ab-account-create-update-jd78k\" (UID: \"2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e\") " pod="openstack/nova-cell1-16ab-account-create-update-jd78k" Mar 17 00:45:46 crc kubenswrapper[4755]: I0317 00:45:46.942470 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svxkb\" (UniqueName: \"kubernetes.io/projected/2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e-kube-api-access-svxkb\") pod \"nova-cell1-16ab-account-create-update-jd78k\" (UID: \"2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e\") " 
pod="openstack/nova-cell1-16ab-account-create-update-jd78k" Mar 17 00:45:47 crc kubenswrapper[4755]: I0317 00:45:47.153771 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-16ab-account-create-update-jd78k" Mar 17 00:45:47 crc kubenswrapper[4755]: I0317 00:45:47.921233 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5f5487f5bb-dv7gk"] Mar 17 00:45:47 crc kubenswrapper[4755]: I0317 00:45:47.961321 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7f4c59689b-88k7q"] Mar 17 00:45:47 crc kubenswrapper[4755]: I0317 00:45:47.962727 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:47 crc kubenswrapper[4755]: I0317 00:45:47.965422 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Mar 17 00:45:47 crc kubenswrapper[4755]: I0317 00:45:47.985556 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.011407 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7f4c59689b-88k7q"] Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.020259 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5699988bfc-7hbpj"] Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.093154 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-958555d6d-xlr8p"] Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.094632 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.097049 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.103881 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.113630 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-958555d6d-xlr8p"] Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.159986 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-internal-tls-certs\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.160042 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-config-data\") pod \"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.160073 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-public-tls-certs\") pod \"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.160094 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-config-data-custom\") pod \"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.160125 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-combined-ca-bundle\") pod \"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.160169 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-combined-ca-bundle\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.160194 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrxpk\" (UniqueName: \"kubernetes.io/projected/cbba9375-8a42-4c43-9b62-0b2df2e89af1-kube-api-access-zrxpk\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.160214 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-public-tls-certs\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.160253 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-config-data\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.160269 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-internal-tls-certs\") pod \"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.160285 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-config-data-custom\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.160313 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdzxt\" (UniqueName: \"kubernetes.io/projected/3b12002d-6940-4bb5-83d0-86bc6add52f8-kube-api-access-tdzxt\") pod \"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.264553 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-config-data\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.264590 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-internal-tls-certs\") pod \"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.264612 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-config-data-custom\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.264660 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdzxt\" (UniqueName: \"kubernetes.io/projected/3b12002d-6940-4bb5-83d0-86bc6add52f8-kube-api-access-tdzxt\") pod \"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.264722 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-internal-tls-certs\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.264766 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-config-data\") pod \"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.264793 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-public-tls-certs\") pod \"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.264813 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-config-data-custom\") pod \"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.264849 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-combined-ca-bundle\") pod \"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.264894 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-combined-ca-bundle\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.264919 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrxpk\" (UniqueName: \"kubernetes.io/projected/cbba9375-8a42-4c43-9b62-0b2df2e89af1-kube-api-access-zrxpk\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.264939 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-public-tls-certs\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.270885 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-combined-ca-bundle\") pod \"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.272122 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-combined-ca-bundle\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.272362 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-internal-tls-certs\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.272430 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-config-data\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.273941 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-public-tls-certs\") pod 
\"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.278367 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-public-tls-certs\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.278580 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-internal-tls-certs\") pod \"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.278643 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-config-data\") pod \"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.282593 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-config-data-custom\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.283516 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdzxt\" (UniqueName: \"kubernetes.io/projected/3b12002d-6940-4bb5-83d0-86bc6add52f8-kube-api-access-tdzxt\") pod \"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " 
pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.283672 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrxpk\" (UniqueName: \"kubernetes.io/projected/cbba9375-8a42-4c43-9b62-0b2df2e89af1-kube-api-access-zrxpk\") pod \"heat-cfnapi-958555d6d-xlr8p\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.297274 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-config-data-custom\") pod \"heat-api-7f4c59689b-88k7q\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.306719 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:48 crc kubenswrapper[4755]: I0317 00:45:48.421880 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:45:49 crc kubenswrapper[4755]: I0317 00:45:49.259566 4755 generic.go:334] "Generic (PLEG): container finished" podID="f9f638d5-e4d3-4678-b597-cc6a84235d22" containerID="89089a92aee5b16244b5d778ae3b08a05884cda0f40fa0d8a6cb63aba65f19bf" exitCode=137 Mar 17 00:45:49 crc kubenswrapper[4755]: I0317 00:45:49.259887 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f9f638d5-e4d3-4678-b597-cc6a84235d22","Type":"ContainerDied","Data":"89089a92aee5b16244b5d778ae3b08a05884cda0f40fa0d8a6cb63aba65f19bf"} Mar 17 00:45:49 crc kubenswrapper[4755]: I0317 00:45:49.269669 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:45:49 crc kubenswrapper[4755]: I0317 00:45:49.327130 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.197:3000/\": dial tcp 10.217.0.197:3000: connect: connection refused" Mar 17 00:45:49 crc kubenswrapper[4755]: I0317 00:45:49.386322 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-78sk9"] Mar 17 00:45:49 crc kubenswrapper[4755]: I0317 00:45:49.386565 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-78sk9" podUID="f270f16c-6af5-41f6-872d-d46aebe04b6e" containerName="dnsmasq-dns" containerID="cri-o://74f0fc9e6a3128782b59a3edd43d8e081bcc3e5e9b9505e3efc8d69d114df5a5" gracePeriod=10 Mar 17 00:45:50 crc kubenswrapper[4755]: I0317 00:45:50.070142 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5784cf869f-78sk9" podUID="f270f16c-6af5-41f6-872d-d46aebe04b6e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.195:5353: connect: connection refused" 
Mar 17 00:45:50 crc kubenswrapper[4755]: I0317 00:45:50.262202 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="f9f638d5-e4d3-4678-b597-cc6a84235d22" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.196:8776/healthcheck\": dial tcp 10.217.0.196:8776: connect: connection refused" Mar 17 00:45:50 crc kubenswrapper[4755]: I0317 00:45:50.273204 4755 generic.go:334] "Generic (PLEG): container finished" podID="f270f16c-6af5-41f6-872d-d46aebe04b6e" containerID="74f0fc9e6a3128782b59a3edd43d8e081bcc3e5e9b9505e3efc8d69d114df5a5" exitCode=0 Mar 17 00:45:50 crc kubenswrapper[4755]: I0317 00:45:50.273282 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-78sk9" event={"ID":"f270f16c-6af5-41f6-872d-d46aebe04b6e","Type":"ContainerDied","Data":"74f0fc9e6a3128782b59a3edd43d8e081bcc3e5e9b9505e3efc8d69d114df5a5"} Mar 17 00:45:51 crc kubenswrapper[4755]: E0317 00:45:51.077781 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-api:current-podified" Mar 17 00:45:51 crc kubenswrapper[4755]: E0317 00:45:51.077964 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-api,Image:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_httpd_setup && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n596hbdh5cfhf5hbch66fh584h59bh5f9h669hdfh54fhbch5ddhf4h5fbh5ch568hdch565h7bh5c4h558h65h68chf7hffh5b5h68bh589h566h566q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:heat-api-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-custom,ReadOnly:true,MountPath:/etc/heat/heat.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5v42z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthcheck,Port:{0 8004 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:10,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthcheck,Port:{0 8004 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:10,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-api-5f5487f5bb-dv7gk_openstack(9033d600-fe8b-42b1-ac25-2f683c0b0f5e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 00:45:51 crc kubenswrapper[4755]: E0317 00:45:51.079159 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-api-5f5487f5bb-dv7gk" podUID="9033d600-fe8b-42b1-ac25-2f683c0b0f5e" Mar 17 00:45:51 crc kubenswrapper[4755]: I0317 00:45:51.126311 4755 scope.go:117] "RemoveContainer" containerID="14e05107095f77ccf7ef298a739ef1104659da91cbe7a5a53359ebb90023e3bc" Mar 17 00:45:51 crc kubenswrapper[4755]: I0317 00:45:51.500010 4755 scope.go:117] "RemoveContainer" containerID="a7cb5e45955d7800a1a551b3beea2c91312c324260ad45df01de3162b70bba94" Mar 17 00:45:51 crc kubenswrapper[4755]: I0317 00:45:51.649996 4755 scope.go:117] "RemoveContainer" containerID="9f19c5669b155ffc068ac6db0febef29feabf0857efbe196c2ff8c44f258a229" Mar 17 00:45:51 crc kubenswrapper[4755]: E0317 00:45:51.656569 4755 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9f19c5669b155ffc068ac6db0febef29feabf0857efbe196c2ff8c44f258a229\": container with ID starting with 9f19c5669b155ffc068ac6db0febef29feabf0857efbe196c2ff8c44f258a229 not found: ID does not exist" containerID="9f19c5669b155ffc068ac6db0febef29feabf0857efbe196c2ff8c44f258a229" Mar 17 00:45:51 crc kubenswrapper[4755]: I0317 00:45:51.656617 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f19c5669b155ffc068ac6db0febef29feabf0857efbe196c2ff8c44f258a229"} err="failed to get container status \"9f19c5669b155ffc068ac6db0febef29feabf0857efbe196c2ff8c44f258a229\": rpc error: code = NotFound desc = could not find container \"9f19c5669b155ffc068ac6db0febef29feabf0857efbe196c2ff8c44f258a229\": container with ID starting with 9f19c5669b155ffc068ac6db0febef29feabf0857efbe196c2ff8c44f258a229 not found: ID does not exist" Mar 17 00:45:51 crc kubenswrapper[4755]: I0317 00:45:51.656644 4755 scope.go:117] "RemoveContainer" containerID="14e05107095f77ccf7ef298a739ef1104659da91cbe7a5a53359ebb90023e3bc" Mar 17 00:45:51 crc kubenswrapper[4755]: E0317 00:45:51.657581 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14e05107095f77ccf7ef298a739ef1104659da91cbe7a5a53359ebb90023e3bc\": container with ID starting with 14e05107095f77ccf7ef298a739ef1104659da91cbe7a5a53359ebb90023e3bc not found: ID does not exist" containerID="14e05107095f77ccf7ef298a739ef1104659da91cbe7a5a53359ebb90023e3bc" Mar 17 00:45:51 crc kubenswrapper[4755]: I0317 00:45:51.657620 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14e05107095f77ccf7ef298a739ef1104659da91cbe7a5a53359ebb90023e3bc"} err="failed to get container status \"14e05107095f77ccf7ef298a739ef1104659da91cbe7a5a53359ebb90023e3bc\": rpc error: code = NotFound desc = could not find container 
\"14e05107095f77ccf7ef298a739ef1104659da91cbe7a5a53359ebb90023e3bc\": container with ID starting with 14e05107095f77ccf7ef298a739ef1104659da91cbe7a5a53359ebb90023e3bc not found: ID does not exist" Mar 17 00:45:51 crc kubenswrapper[4755]: I0317 00:45:51.657647 4755 scope.go:117] "RemoveContainer" containerID="a7cb5e45955d7800a1a551b3beea2c91312c324260ad45df01de3162b70bba94" Mar 17 00:45:51 crc kubenswrapper[4755]: E0317 00:45:51.657940 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7cb5e45955d7800a1a551b3beea2c91312c324260ad45df01de3162b70bba94\": container with ID starting with a7cb5e45955d7800a1a551b3beea2c91312c324260ad45df01de3162b70bba94 not found: ID does not exist" containerID="a7cb5e45955d7800a1a551b3beea2c91312c324260ad45df01de3162b70bba94" Mar 17 00:45:51 crc kubenswrapper[4755]: I0317 00:45:51.657964 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7cb5e45955d7800a1a551b3beea2c91312c324260ad45df01de3162b70bba94"} err="failed to get container status \"a7cb5e45955d7800a1a551b3beea2c91312c324260ad45df01de3162b70bba94\": rpc error: code = NotFound desc = could not find container \"a7cb5e45955d7800a1a551b3beea2c91312c324260ad45df01de3162b70bba94\": container with ID starting with a7cb5e45955d7800a1a551b3beea2c91312c324260ad45df01de3162b70bba94 not found: ID does not exist" Mar 17 00:45:51 crc kubenswrapper[4755]: I0317 00:45:51.890202 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-54c4999fb9-bx48f" Mar 17 00:45:51 crc kubenswrapper[4755]: I0317 00:45:51.978605 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-df477f8d4-2cfqn"] Mar 17 00:45:51 crc kubenswrapper[4755]: I0317 00:45:51.978832 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-df477f8d4-2cfqn" podUID="ab92471f-7493-4924-a5d7-c194d62e821f" 
containerName="neutron-api" containerID="cri-o://317b054ee6ff7b48069e0a2dc3ddaa653d4ac7a15ea409f9ca61c69c8644f611" gracePeriod=30 Mar 17 00:45:51 crc kubenswrapper[4755]: I0317 00:45:51.979290 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-df477f8d4-2cfqn" podUID="ab92471f-7493-4924-a5d7-c194d62e821f" containerName="neutron-httpd" containerID="cri-o://b18159828b0ef85d52a55272de389f2449a3c17f45057e6a855ea454c1f3650f" gracePeriod=30 Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.025364 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.163525 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-config-data\") pod \"f9f638d5-e4d3-4678-b597-cc6a84235d22\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.163914 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9f638d5-e4d3-4678-b597-cc6a84235d22-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f9f638d5-e4d3-4678-b597-cc6a84235d22" (UID: "f9f638d5-e4d3-4678-b597-cc6a84235d22"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.163936 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9f638d5-e4d3-4678-b597-cc6a84235d22-etc-machine-id\") pod \"f9f638d5-e4d3-4678-b597-cc6a84235d22\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.164086 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b897b\" (UniqueName: \"kubernetes.io/projected/f9f638d5-e4d3-4678-b597-cc6a84235d22-kube-api-access-b897b\") pod \"f9f638d5-e4d3-4678-b597-cc6a84235d22\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.164128 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-config-data-custom\") pod \"f9f638d5-e4d3-4678-b597-cc6a84235d22\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.164234 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9f638d5-e4d3-4678-b597-cc6a84235d22-logs\") pod \"f9f638d5-e4d3-4678-b597-cc6a84235d22\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.164370 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-combined-ca-bundle\") pod \"f9f638d5-e4d3-4678-b597-cc6a84235d22\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.164531 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-scripts\") pod \"f9f638d5-e4d3-4678-b597-cc6a84235d22\" (UID: \"f9f638d5-e4d3-4678-b597-cc6a84235d22\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.165458 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9f638d5-e4d3-4678-b597-cc6a84235d22-logs" (OuterVolumeSpecName: "logs") pod "f9f638d5-e4d3-4678-b597-cc6a84235d22" (UID: "f9f638d5-e4d3-4678-b597-cc6a84235d22"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.166186 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9f638d5-e4d3-4678-b597-cc6a84235d22-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.166271 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9f638d5-e4d3-4678-b597-cc6a84235d22-logs\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.171591 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f9f638d5-e4d3-4678-b597-cc6a84235d22" (UID: "f9f638d5-e4d3-4678-b597-cc6a84235d22"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.179090 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-scripts" (OuterVolumeSpecName: "scripts") pod "f9f638d5-e4d3-4678-b597-cc6a84235d22" (UID: "f9f638d5-e4d3-4678-b597-cc6a84235d22"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.197887 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9f638d5-e4d3-4678-b597-cc6a84235d22-kube-api-access-b897b" (OuterVolumeSpecName: "kube-api-access-b897b") pod "f9f638d5-e4d3-4678-b597-cc6a84235d22" (UID: "f9f638d5-e4d3-4678-b597-cc6a84235d22"). InnerVolumeSpecName "kube-api-access-b897b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.231279 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9f638d5-e4d3-4678-b597-cc6a84235d22" (UID: "f9f638d5-e4d3-4678-b597-cc6a84235d22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.246059 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-config-data" (OuterVolumeSpecName: "config-data") pod "f9f638d5-e4d3-4678-b597-cc6a84235d22" (UID: "f9f638d5-e4d3-4678-b597-cc6a84235d22"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.269231 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.269778 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.269822 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.269837 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b897b\" (UniqueName: \"kubernetes.io/projected/f9f638d5-e4d3-4678-b597-cc6a84235d22-kube-api-access-b897b\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.269850 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9f638d5-e4d3-4678-b597-cc6a84235d22-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.324856 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.351646 4755 generic.go:334] "Generic (PLEG): container finished" podID="ab92471f-7493-4924-a5d7-c194d62e821f" containerID="b18159828b0ef85d52a55272de389f2449a3c17f45057e6a855ea454c1f3650f" exitCode=0 Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.353843 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f9f638d5-e4d3-4678-b597-cc6a84235d22","Type":"ContainerDied","Data":"227a22fefd34bb5c38393147ae0973d567655a6a60917735363dc9095a087a51"} Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.353879 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-df477f8d4-2cfqn" event={"ID":"ab92471f-7493-4924-a5d7-c194d62e821f","Type":"ContainerDied","Data":"b18159828b0ef85d52a55272de389f2449a3c17f45057e6a855ea454c1f3650f"} Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.353900 4755 scope.go:117] "RemoveContainer" containerID="89089a92aee5b16244b5d778ae3b08a05884cda0f40fa0d8a6cb63aba65f19bf" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.372106 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-78sk9" event={"ID":"f270f16c-6af5-41f6-872d-d46aebe04b6e","Type":"ContainerDied","Data":"918015ebbf5f8456448a603fa5f3bce63d12dd27195187983490b17882bffc93"} Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.372139 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="918015ebbf5f8456448a603fa5f3bce63d12dd27195187983490b17882bffc93" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.384282 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.412365 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b9fe8c0-56e7-487a-b68a-905789266b31","Type":"ContainerDied","Data":"1ff0fddec67974ade0f0926dee9d96c60d4a676f48ff988dc3a880ae87e21e3c"} Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.434462 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f5487f5bb-dv7gk" event={"ID":"9033d600-fe8b-42b1-ac25-2f683c0b0f5e","Type":"ContainerDied","Data":"a43f9f5a9b2afa2a4c9ea950a6f3c5fb670210d881954d5cf5db3e4d99d5da93"} Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.434494 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a43f9f5a9b2afa2a4c9ea950a6f3c5fb670210d881954d5cf5db3e4d99d5da93" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.434636 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.459536 4755 scope.go:117] "RemoveContainer" containerID="4d37426d1a8768fd4036e257238c7d76cf4461d24be494ae9d42a129aeef3e5a" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.459683 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5f5487f5bb-dv7gk" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.482735 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.531696 4755 scope.go:117] "RemoveContainer" containerID="996ccff017a57ca1a61f01c25b387fb7c748546bd9b1f09ee27bbd67f891059d" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.575754 4755 scope.go:117] "RemoveContainer" containerID="0de817ec17d73d7db883c458fc819fb37e4e6d4057fd5e9e184e22c259668d91" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.582154 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.592251 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-config\") pod \"f270f16c-6af5-41f6-872d-d46aebe04b6e\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.592298 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-scripts\") pod \"9b9fe8c0-56e7-487a-b68a-905789266b31\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.592332 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-combined-ca-bundle\") pod \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\" (UID: \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.592356 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-ovsdbserver-sb\") pod \"f270f16c-6af5-41f6-872d-d46aebe04b6e\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.592393 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-config-data-custom\") pod \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\" (UID: \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.592488 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-config-data\") pod \"9b9fe8c0-56e7-487a-b68a-905789266b31\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.592517 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v42z\" (UniqueName: \"kubernetes.io/projected/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-kube-api-access-5v42z\") pod \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\" (UID: \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.592547 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtr6d\" (UniqueName: \"kubernetes.io/projected/f270f16c-6af5-41f6-872d-d46aebe04b6e-kube-api-access-dtr6d\") pod \"f270f16c-6af5-41f6-872d-d46aebe04b6e\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.592591 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b9fe8c0-56e7-487a-b68a-905789266b31-log-httpd\") pod \"9b9fe8c0-56e7-487a-b68a-905789266b31\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " Mar 17 00:45:52 crc 
kubenswrapper[4755]: I0317 00:45:52.592634 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b9fe8c0-56e7-487a-b68a-905789266b31-run-httpd\") pod \"9b9fe8c0-56e7-487a-b68a-905789266b31\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.592700 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-dns-swift-storage-0\") pod \"f270f16c-6af5-41f6-872d-d46aebe04b6e\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.592735 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-config-data\") pod \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\" (UID: \"9033d600-fe8b-42b1-ac25-2f683c0b0f5e\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.592761 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-ovsdbserver-nb\") pod \"f270f16c-6af5-41f6-872d-d46aebe04b6e\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.592799 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-dns-svc\") pod \"f270f16c-6af5-41f6-872d-d46aebe04b6e\" (UID: \"f270f16c-6af5-41f6-872d-d46aebe04b6e\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.592823 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm9vm\" (UniqueName: \"kubernetes.io/projected/9b9fe8c0-56e7-487a-b68a-905789266b31-kube-api-access-gm9vm\") 
pod \"9b9fe8c0-56e7-487a-b68a-905789266b31\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.592849 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-sg-core-conf-yaml\") pod \"9b9fe8c0-56e7-487a-b68a-905789266b31\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.592863 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-combined-ca-bundle\") pod \"9b9fe8c0-56e7-487a-b68a-905789266b31\" (UID: \"9b9fe8c0-56e7-487a-b68a-905789266b31\") " Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.593132 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 17 00:45:52 crc kubenswrapper[4755]: E0317 00:45:52.593584 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f638d5-e4d3-4678-b597-cc6a84235d22" containerName="cinder-api" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.593600 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f638d5-e4d3-4678-b597-cc6a84235d22" containerName="cinder-api" Mar 17 00:45:52 crc kubenswrapper[4755]: E0317 00:45:52.593613 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f270f16c-6af5-41f6-872d-d46aebe04b6e" containerName="init" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.593619 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f270f16c-6af5-41f6-872d-d46aebe04b6e" containerName="init" Mar 17 00:45:52 crc kubenswrapper[4755]: E0317 00:45:52.593634 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="proxy-httpd" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.593640 4755 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="proxy-httpd" Mar 17 00:45:52 crc kubenswrapper[4755]: E0317 00:45:52.593652 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f638d5-e4d3-4678-b597-cc6a84235d22" containerName="cinder-api-log" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.593658 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f638d5-e4d3-4678-b597-cc6a84235d22" containerName="cinder-api-log" Mar 17 00:45:52 crc kubenswrapper[4755]: E0317 00:45:52.593667 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="ceilometer-central-agent" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.593672 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="ceilometer-central-agent" Mar 17 00:45:52 crc kubenswrapper[4755]: E0317 00:45:52.593688 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f270f16c-6af5-41f6-872d-d46aebe04b6e" containerName="dnsmasq-dns" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.593693 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f270f16c-6af5-41f6-872d-d46aebe04b6e" containerName="dnsmasq-dns" Mar 17 00:45:52 crc kubenswrapper[4755]: E0317 00:45:52.593704 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="ceilometer-notification-agent" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.593710 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="ceilometer-notification-agent" Mar 17 00:45:52 crc kubenswrapper[4755]: E0317 00:45:52.593723 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="sg-core" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 
00:45:52.593728 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="sg-core" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.594124 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9f638d5-e4d3-4678-b597-cc6a84235d22" containerName="cinder-api" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.594150 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="sg-core" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.594161 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="ceilometer-notification-agent" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.594169 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9f638d5-e4d3-4678-b597-cc6a84235d22" containerName="cinder-api-log" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.594181 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="proxy-httpd" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.594191 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f270f16c-6af5-41f6-872d-d46aebe04b6e" containerName="dnsmasq-dns" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.594201 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" containerName="ceilometer-central-agent" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.595320 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.598239 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f270f16c-6af5-41f6-872d-d46aebe04b6e-kube-api-access-dtr6d" (OuterVolumeSpecName: "kube-api-access-dtr6d") pod "f270f16c-6af5-41f6-872d-d46aebe04b6e" (UID: "f270f16c-6af5-41f6-872d-d46aebe04b6e"). InnerVolumeSpecName "kube-api-access-dtr6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.605055 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-scripts" (OuterVolumeSpecName: "scripts") pod "9b9fe8c0-56e7-487a-b68a-905789266b31" (UID: "9b9fe8c0-56e7-487a-b68a-905789266b31"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.607800 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.609318 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b9fe8c0-56e7-487a-b68a-905789266b31-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9b9fe8c0-56e7-487a-b68a-905789266b31" (UID: "9b9fe8c0-56e7-487a-b68a-905789266b31"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.611280 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b9fe8c0-56e7-487a-b68a-905789266b31-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9b9fe8c0-56e7-487a-b68a-905789266b31" (UID: "9b9fe8c0-56e7-487a-b68a-905789266b31"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.613613 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-config-data" (OuterVolumeSpecName: "config-data") pod "9033d600-fe8b-42b1-ac25-2f683c0b0f5e" (UID: "9033d600-fe8b-42b1-ac25-2f683c0b0f5e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.614903 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.616292 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.616742 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.617827 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b9fe8c0-56e7-487a-b68a-905789266b31-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.617926 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b9fe8c0-56e7-487a-b68a-905789266b31-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.617984 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.618037 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 
00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.618093 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtr6d\" (UniqueName: \"kubernetes.io/projected/f270f16c-6af5-41f6-872d-d46aebe04b6e-kube-api-access-dtr6d\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.618544 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9033d600-fe8b-42b1-ac25-2f683c0b0f5e" (UID: "9033d600-fe8b-42b1-ac25-2f683c0b0f5e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.619711 4755 scope.go:117] "RemoveContainer" containerID="f09df42c3e9158b4805e5413425d2afb2f2a186ef2a6ef9c9ebe75fccd1b4ff4" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.619841 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-kube-api-access-5v42z" (OuterVolumeSpecName: "kube-api-access-5v42z") pod "9033d600-fe8b-42b1-ac25-2f683c0b0f5e" (UID: "9033d600-fe8b-42b1-ac25-2f683c0b0f5e"). InnerVolumeSpecName "kube-api-access-5v42z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.627475 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b9fe8c0-56e7-487a-b68a-905789266b31-kube-api-access-gm9vm" (OuterVolumeSpecName: "kube-api-access-gm9vm") pod "9b9fe8c0-56e7-487a-b68a-905789266b31" (UID: "9b9fe8c0-56e7-487a-b68a-905789266b31"). InnerVolumeSpecName "kube-api-access-gm9vm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.628136 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9033d600-fe8b-42b1-ac25-2f683c0b0f5e" (UID: "9033d600-fe8b-42b1-ac25-2f683c0b0f5e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.721056 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8ltq\" (UniqueName: \"kubernetes.io/projected/dff3597a-93e6-4bb6-9508-c8f4609a75fc-kube-api-access-v8ltq\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.721197 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.721254 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff3597a-93e6-4bb6-9508-c8f4609a75fc-logs\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.721279 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dff3597a-93e6-4bb6-9508-c8f4609a75fc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " 
pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.721368 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.721462 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-config-data\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.721564 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-config-data-custom\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.721583 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-scripts\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.721746 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.731523 4755 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-gm9vm\" (UniqueName: \"kubernetes.io/projected/9b9fe8c0-56e7-487a-b68a-905789266b31-kube-api-access-gm9vm\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.731550 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.731560 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.731569 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v42z\" (UniqueName: \"kubernetes.io/projected/9033d600-fe8b-42b1-ac25-2f683c0b0f5e-kube-api-access-5v42z\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.737889 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9b9fe8c0-56e7-487a-b68a-905789266b31" (UID: "9b9fe8c0-56e7-487a-b68a-905789266b31"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.835122 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f270f16c-6af5-41f6-872d-d46aebe04b6e" (UID: "f270f16c-6af5-41f6-872d-d46aebe04b6e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.838778 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8ltq\" (UniqueName: \"kubernetes.io/projected/dff3597a-93e6-4bb6-9508-c8f4609a75fc-kube-api-access-v8ltq\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.838836 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.838874 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff3597a-93e6-4bb6-9508-c8f4609a75fc-logs\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.838894 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dff3597a-93e6-4bb6-9508-c8f4609a75fc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.838926 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.838951 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-config-data\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.838984 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-config-data-custom\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.839000 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-scripts\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.839054 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.839106 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.839117 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.840352 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/dff3597a-93e6-4bb6-9508-c8f4609a75fc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.843012 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff3597a-93e6-4bb6-9508-c8f4609a75fc-logs\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.861950 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f270f16c-6af5-41f6-872d-d46aebe04b6e" (UID: "f270f16c-6af5-41f6-872d-d46aebe04b6e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.875612 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-config-data\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.875958 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-scripts\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.876139 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-config-data-custom\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 
00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.880964 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.883848 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.883954 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f270f16c-6af5-41f6-872d-d46aebe04b6e" (UID: "f270f16c-6af5-41f6-872d-d46aebe04b6e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.888086 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff3597a-93e6-4bb6-9508-c8f4609a75fc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.891904 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8ltq\" (UniqueName: \"kubernetes.io/projected/dff3597a-93e6-4bb6-9508-c8f4609a75fc-kube-api-access-v8ltq\") pod \"cinder-api-0\" (UID: \"dff3597a-93e6-4bb6-9508-c8f4609a75fc\") " pod="openstack/cinder-api-0" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.921308 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-config" (OuterVolumeSpecName: "config") pod "f270f16c-6af5-41f6-872d-d46aebe04b6e" (UID: "f270f16c-6af5-41f6-872d-d46aebe04b6e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.931599 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f270f16c-6af5-41f6-872d-d46aebe04b6e" (UID: "f270f16c-6af5-41f6-872d-d46aebe04b6e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.945088 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.945123 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.945134 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:52 crc kubenswrapper[4755]: I0317 00:45:52.945143 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f270f16c-6af5-41f6-872d-d46aebe04b6e-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.028459 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-config-data" (OuterVolumeSpecName: "config-data") pod "9b9fe8c0-56e7-487a-b68a-905789266b31" (UID: "9b9fe8c0-56e7-487a-b68a-905789266b31"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.037101 4755 scope.go:117] "RemoveContainer" containerID="c89ae1d05b84421fc636ee9a8f47ebb5072db9c7d5af89e80dcd24cbb26e9023" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.039424 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.047941 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.101523 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b9fe8c0-56e7-487a-b68a-905789266b31" (UID: "9b9fe8c0-56e7-487a-b68a-905789266b31"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.152104 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9fe8c0-56e7-487a-b68a-905789266b31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.453451 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5699988bfc-7hbpj" event={"ID":"be10b2f6-c553-40d0-968d-e484111525bc","Type":"ContainerStarted","Data":"0216ab60c3e154eaabcc6cad953cdc8437b344ebbc6c98a22e1f5d83a264aacd"} Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.453760 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5699988bfc-7hbpj" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.453779 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-5699988bfc-7hbpj" podUID="be10b2f6-c553-40d0-968d-e484111525bc" containerName="heat-cfnapi" containerID="cri-o://0216ab60c3e154eaabcc6cad953cdc8437b344ebbc6c98a22e1f5d83a264aacd" gracePeriod=60 Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.477908 4755 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/heat-cfnapi-5699988bfc-7hbpj" podStartSLOduration=4.308963826 podStartE2EDuration="15.477888009s" podCreationTimestamp="2026-03-17 00:45:38 +0000 UTC" firstStartedPulling="2026-03-17 00:45:40.18671629 +0000 UTC m=+1414.946168573" lastFinishedPulling="2026-03-17 00:45:51.355640473 +0000 UTC m=+1426.115092756" observedRunningTime="2026-03-17 00:45:53.4678344 +0000 UTC m=+1428.227286713" watchObservedRunningTime="2026-03-17 00:45:53.477888009 +0000 UTC m=+1428.237340292" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.478289 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"007a5062-42e0-47ac-9523-a4d486614f70","Type":"ContainerStarted","Data":"c432033a410b79385533e2c190b985cf224790625f1222ec2234a7944bcb59d2"} Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.482882 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f5487f5bb-dv7gk" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.487536 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.487583 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-78sk9" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.517129 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.217401858 podStartE2EDuration="18.51710992s" podCreationTimestamp="2026-03-17 00:45:35 +0000 UTC" firstStartedPulling="2026-03-17 00:45:36.017562072 +0000 UTC m=+1410.777014355" lastFinishedPulling="2026-03-17 00:45:51.317270134 +0000 UTC m=+1426.076722417" observedRunningTime="2026-03-17 00:45:53.513494163 +0000 UTC m=+1428.272946446" watchObservedRunningTime="2026-03-17 00:45:53.51710992 +0000 UTC m=+1428.276562203" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.568500 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.602725 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.612299 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-78sk9"] Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.629424 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-78sk9"] Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.652423 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.654802 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.659965 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.660092 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.675491 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.718332 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5f5487f5bb-dv7gk"] Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.736975 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5f5487f5bb-dv7gk"] Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.777183 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7f4c59689b-88k7q"] Mar 17 00:45:53 crc kubenswrapper[4755]: W0317 00:45:53.778704 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b12002d_6940_4bb5_83d0_86bc6add52f8.slice/crio-c061768949fcb4cec5afa25f4e0ede8d4863d2172fe14fa531b4def0b82e30e2 WatchSource:0}: Error finding container c061768949fcb4cec5afa25f4e0ede8d4863d2172fe14fa531b4def0b82e30e2: Status 404 returned error can't find the container with id c061768949fcb4cec5afa25f4e0ede8d4863d2172fe14fa531b4def0b82e30e2 Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.791574 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-run-httpd\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.791631 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-config-data\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.791663 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-scripts\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.791687 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fhrk\" (UniqueName: \"kubernetes.io/projected/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-kube-api-access-2fhrk\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.791710 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.791733 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.791786 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-log-httpd\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.814484 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-958555d6d-xlr8p"] Mar 17 00:45:53 crc kubenswrapper[4755]: W0317 00:45:53.822498 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbba9375_8a42_4c43_9b62_0b2df2e89af1.slice/crio-e15bd65c7cafc484012e9ad7b02e586a9f88eeb59d7bbca449ba914f9909cdcd WatchSource:0}: Error finding container e15bd65c7cafc484012e9ad7b02e586a9f88eeb59d7bbca449ba914f9909cdcd: Status 404 returned error can't find the container with id e15bd65c7cafc484012e9ad7b02e586a9f88eeb59d7bbca449ba914f9909cdcd Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.831489 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8bgbn"] Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.922237 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-run-httpd\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.922360 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-config-data\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.922417 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-scripts\") pod \"ceilometer-0\" 
(UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.922484 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fhrk\" (UniqueName: \"kubernetes.io/projected/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-kube-api-access-2fhrk\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.922528 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.922568 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.922680 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-log-httpd\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.923253 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-log-httpd\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.934640 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-run-httpd\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: W0317 00:45:53.945415 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3a41a76_23dc_404d_b304_19ad4221ce3d.slice/crio-96fdbe6194240c8eccad50b34902169bf119200da58d3bae17fc1a90daea97c0 WatchSource:0}: Error finding container 96fdbe6194240c8eccad50b34902169bf119200da58d3bae17fc1a90daea97c0: Status 404 returned error can't find the container with id 96fdbe6194240c8eccad50b34902169bf119200da58d3bae17fc1a90daea97c0 Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.945481 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hrkzh"] Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.965008 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fhrk\" (UniqueName: \"kubernetes.io/projected/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-kube-api-access-2fhrk\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.979284 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-scripts\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.980166 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-config-data\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 
00:45:53.980735 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:53 crc kubenswrapper[4755]: I0317 00:45:53.984105 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " pod="openstack/ceilometer-0" Mar 17 00:45:54 crc kubenswrapper[4755]: W0317 00:45:54.011540 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf802ad4b_3ce3_451c_a5a3_99ca8a050644.slice/crio-0ddf119a2571f95e3d8af1330edfbcdc09cf85b0303b9b1bc1d255c111bfac65 WatchSource:0}: Error finding container 0ddf119a2571f95e3d8af1330edfbcdc09cf85b0303b9b1bc1d255c111bfac65: Status 404 returned error can't find the container with id 0ddf119a2571f95e3d8af1330edfbcdc09cf85b0303b9b1bc1d255c111bfac65 Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.028430 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-16ab-account-create-update-jd78k"] Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.089735 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3bde-account-create-update-4xt5d"] Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.119831 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-571c-account-create-update-vrrc6"] Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.129230 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7cf59db8d9-lkrks"] Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.161057 4755 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-cell0-db-create-h9477"] Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.181402 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6dc587f546-7lvlg"] Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.205582 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-644dcb55b6-q7jd4"] Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.235966 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5b9b5bb667-6pk7q"] Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.276762 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.285329 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9033d600-fe8b-42b1-ac25-2f683c0b0f5e" path="/var/lib/kubelet/pods/9033d600-fe8b-42b1-ac25-2f683c0b0f5e/volumes" Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.286078 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b9fe8c0-56e7-487a-b68a-905789266b31" path="/var/lib/kubelet/pods/9b9fe8c0-56e7-487a-b68a-905789266b31/volumes" Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.287432 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f270f16c-6af5-41f6-872d-d46aebe04b6e" path="/var/lib/kubelet/pods/f270f16c-6af5-41f6-872d-d46aebe04b6e/volumes" Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.289024 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9f638d5-e4d3-4678-b597-cc6a84235d22" path="/var/lib/kubelet/pods/f9f638d5-e4d3-4678-b597-cc6a84235d22/volumes" Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.290391 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.370489 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5699988bfc-7hbpj" Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.473224 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-config-data\") pod \"be10b2f6-c553-40d0-968d-e484111525bc\" (UID: \"be10b2f6-c553-40d0-968d-e484111525bc\") " Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.473395 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtcxg\" (UniqueName: \"kubernetes.io/projected/be10b2f6-c553-40d0-968d-e484111525bc-kube-api-access-xtcxg\") pod \"be10b2f6-c553-40d0-968d-e484111525bc\" (UID: \"be10b2f6-c553-40d0-968d-e484111525bc\") " Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.473422 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-combined-ca-bundle\") pod \"be10b2f6-c553-40d0-968d-e484111525bc\" (UID: \"be10b2f6-c553-40d0-968d-e484111525bc\") " Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.473523 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-config-data-custom\") pod \"be10b2f6-c553-40d0-968d-e484111525bc\" (UID: \"be10b2f6-c553-40d0-968d-e484111525bc\") " Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.495634 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "be10b2f6-c553-40d0-968d-e484111525bc" (UID: "be10b2f6-c553-40d0-968d-e484111525bc"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.495862 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be10b2f6-c553-40d0-968d-e484111525bc-kube-api-access-xtcxg" (OuterVolumeSpecName: "kube-api-access-xtcxg") pod "be10b2f6-c553-40d0-968d-e484111525bc" (UID: "be10b2f6-c553-40d0-968d-e484111525bc"). InnerVolumeSpecName "kube-api-access-xtcxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.517382 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-644dcb55b6-q7jd4" event={"ID":"180ff5f7-b121-458f-b938-d06977e1f610","Type":"ContainerStarted","Data":"6d3fe709f3c796e95850e0206909d027c06bd02b294da028585ab72b30461eed"} Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.519196 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-958555d6d-xlr8p" event={"ID":"cbba9375-8a42-4c43-9b62-0b2df2e89af1","Type":"ContainerStarted","Data":"e15bd65c7cafc484012e9ad7b02e586a9f88eeb59d7bbca449ba914f9909cdcd"} Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.520246 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-16ab-account-create-update-jd78k" event={"ID":"2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e","Type":"ContainerStarted","Data":"8d2e4264431ee02b74ce01f0079e83a8f082a6f8a8fe0ea40b00d34868703cbf"} Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.520977 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-h9477" event={"ID":"2c5ef244-d2dc-4cc0-bc15-8f542eb8a586","Type":"ContainerStarted","Data":"12cdec8df65f9749a5fc56c0f0fd1a693354101260700e2c40ba282dca16af71"} Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.533952 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8bgbn" 
event={"ID":"3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6","Type":"ContainerStarted","Data":"7aa93298a9343f5eaae438f3edc1e11b0ec7c5c9ee4aded5418255f992555c03"} Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.533989 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8bgbn" event={"ID":"3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6","Type":"ContainerStarted","Data":"da7df08fd372eda920d9afdab8b5b57f4a37c4e445041e7628264c2f8c7055ff"} Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.537876 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7f4c59689b-88k7q" event={"ID":"3b12002d-6940-4bb5-83d0-86bc6add52f8","Type":"ContainerStarted","Data":"c061768949fcb4cec5afa25f4e0ede8d4863d2172fe14fa531b4def0b82e30e2"} Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.539309 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-571c-account-create-update-vrrc6" event={"ID":"3811ee16-75c3-4ca3-9648-d3c9d5f8b028","Type":"ContainerStarted","Data":"ed9fe0dbe63d5749d16e5ab2103d82e8f1d41305dbfb583255d31fc7251782f1"} Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.540312 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b9b5bb667-6pk7q" event={"ID":"cfa93106-8e0c-4e7d-93cf-33d06c85d883","Type":"ContainerStarted","Data":"e1c43a048d1c0c71e53704b58910de35dfd8fd36a67ab6b965e957c1c52183e7"} Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.541206 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" event={"ID":"6b7b092f-96bf-466c-a3ef-1867c502bb21","Type":"ContainerStarted","Data":"d7c0cb40d4a5c60fec3c390d01fb5fd951c2508f289d247ce0a5994dea4077d3"} Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.542799 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"dff3597a-93e6-4bb6-9508-c8f4609a75fc","Type":"ContainerStarted","Data":"f38728e66dd901e067e14004f3a082118cb895ae33a0d2e5d9fd29396791738e"} Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.570595 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-8bgbn" podStartSLOduration=8.570576094 podStartE2EDuration="8.570576094s" podCreationTimestamp="2026-03-17 00:45:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:54.553802375 +0000 UTC m=+1429.313254658" watchObservedRunningTime="2026-03-17 00:45:54.570576094 +0000 UTC m=+1429.330028377" Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.570921 4755 generic.go:334] "Generic (PLEG): container finished" podID="be10b2f6-c553-40d0-968d-e484111525bc" containerID="0216ab60c3e154eaabcc6cad953cdc8437b344ebbc6c98a22e1f5d83a264aacd" exitCode=0 Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.570982 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5699988bfc-7hbpj" event={"ID":"be10b2f6-c553-40d0-968d-e484111525bc","Type":"ContainerDied","Data":"0216ab60c3e154eaabcc6cad953cdc8437b344ebbc6c98a22e1f5d83a264aacd"} Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.571008 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5699988bfc-7hbpj" event={"ID":"be10b2f6-c553-40d0-968d-e484111525bc","Type":"ContainerDied","Data":"9e8e480f84d190f254ec33ca42af899088c440c8896d36a55bedf6b0a2aa3f56"} Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.571027 4755 scope.go:117] "RemoveContainer" containerID="0216ab60c3e154eaabcc6cad953cdc8437b344ebbc6c98a22e1f5d83a264aacd" Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.571141 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5699988bfc-7hbpj" Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.585722 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.585769 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtcxg\" (UniqueName: \"kubernetes.io/projected/be10b2f6-c553-40d0-968d-e484111525bc-kube-api-access-xtcxg\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.600576 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3bde-account-create-update-4xt5d" event={"ID":"f802ad4b-3ce3-451c-a5a3-99ca8a050644","Type":"ContainerStarted","Data":"0ddf119a2571f95e3d8af1330edfbcdc09cf85b0303b9b1bc1d255c111bfac65"} Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.607842 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hrkzh" event={"ID":"c3a41a76-23dc-404d-b304-19ad4221ce3d","Type":"ContainerStarted","Data":"96fdbe6194240c8eccad50b34902169bf119200da58d3bae17fc1a90daea97c0"} Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.610028 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6dc587f546-7lvlg" event={"ID":"e062269a-a4d6-43b8-b065-6d1694b386f8","Type":"ContainerStarted","Data":"1471e1d6422ef347ea48f2c89de4e54700a1a1567acd2a44e135c9c331ee5a2c"} Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.970123 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be10b2f6-c553-40d0-968d-e484111525bc" (UID: "be10b2f6-c553-40d0-968d-e484111525bc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:54 crc kubenswrapper[4755]: I0317 00:45:54.996350 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.092079 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.099587 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-config-data" (OuterVolumeSpecName: "config-data") pod "be10b2f6-c553-40d0-968d-e484111525bc" (UID: "be10b2f6-c553-40d0-968d-e484111525bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.102254 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-config-data\") pod \"be10b2f6-c553-40d0-968d-e484111525bc\" (UID: \"be10b2f6-c553-40d0-968d-e484111525bc\") " Mar 17 00:45:55 crc kubenswrapper[4755]: W0317 00:45:55.102362 4755 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/be10b2f6-c553-40d0-968d-e484111525bc/volumes/kubernetes.io~secret/config-data Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.102381 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-config-data" (OuterVolumeSpecName: "config-data") pod "be10b2f6-c553-40d0-968d-e484111525bc" (UID: "be10b2f6-c553-40d0-968d-e484111525bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.103222 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be10b2f6-c553-40d0-968d-e484111525bc-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:45:55 crc kubenswrapper[4755]: W0317 00:45:55.123587 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded562f97_fa9f_459d_9b46_7f9b2fa7c51c.slice/crio-0c7c67d982e427860c149025759d1985c63c9deccb5ec26a7a49671d25247d33 WatchSource:0}: Error finding container 0c7c67d982e427860c149025759d1985c63c9deccb5ec26a7a49671d25247d33: Status 404 returned error can't find the container with id 0c7c67d982e427860c149025759d1985c63c9deccb5ec26a7a49671d25247d33 Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.328606 4755 scope.go:117] "RemoveContainer" containerID="0216ab60c3e154eaabcc6cad953cdc8437b344ebbc6c98a22e1f5d83a264aacd" Mar 17 00:45:55 crc kubenswrapper[4755]: E0317 00:45:55.332615 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0216ab60c3e154eaabcc6cad953cdc8437b344ebbc6c98a22e1f5d83a264aacd\": container with ID starting with 0216ab60c3e154eaabcc6cad953cdc8437b344ebbc6c98a22e1f5d83a264aacd not found: ID does not exist" containerID="0216ab60c3e154eaabcc6cad953cdc8437b344ebbc6c98a22e1f5d83a264aacd" Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.332655 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0216ab60c3e154eaabcc6cad953cdc8437b344ebbc6c98a22e1f5d83a264aacd"} err="failed to get container status \"0216ab60c3e154eaabcc6cad953cdc8437b344ebbc6c98a22e1f5d83a264aacd\": rpc error: code = NotFound desc = could not find container \"0216ab60c3e154eaabcc6cad953cdc8437b344ebbc6c98a22e1f5d83a264aacd\": container with ID starting with 
0216ab60c3e154eaabcc6cad953cdc8437b344ebbc6c98a22e1f5d83a264aacd not found: ID does not exist" Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.424877 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5699988bfc-7hbpj"] Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.447203 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5699988bfc-7hbpj"] Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.622341 4755 generic.go:334] "Generic (PLEG): container finished" podID="3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6" containerID="7aa93298a9343f5eaae438f3edc1e11b0ec7c5c9ee4aded5418255f992555c03" exitCode=0 Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.622401 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8bgbn" event={"ID":"3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6","Type":"ContainerDied","Data":"7aa93298a9343f5eaae438f3edc1e11b0ec7c5c9ee4aded5418255f992555c03"} Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.625240 4755 generic.go:334] "Generic (PLEG): container finished" podID="3811ee16-75c3-4ca3-9648-d3c9d5f8b028" containerID="c00af56ff437bc2e7211e5a43b1ad2aa87fb9cf18c59cc2b9790b5a32c076e2b" exitCode=0 Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.625427 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-571c-account-create-update-vrrc6" event={"ID":"3811ee16-75c3-4ca3-9648-d3c9d5f8b028","Type":"ContainerDied","Data":"c00af56ff437bc2e7211e5a43b1ad2aa87fb9cf18c59cc2b9790b5a32c076e2b"} Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.628766 4755 generic.go:334] "Generic (PLEG): container finished" podID="c3a41a76-23dc-404d-b304-19ad4221ce3d" containerID="4d8ca082cc7aa1159aaf9d718f07d36e1c7d56fa3230e8a6ed9391c74f9263fe" exitCode=0 Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.628832 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hrkzh" 
event={"ID":"c3a41a76-23dc-404d-b304-19ad4221ce3d","Type":"ContainerDied","Data":"4d8ca082cc7aa1159aaf9d718f07d36e1c7d56fa3230e8a6ed9391c74f9263fe"} Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.632475 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c","Type":"ContainerStarted","Data":"0c7c67d982e427860c149025759d1985c63c9deccb5ec26a7a49671d25247d33"} Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.643924 4755 generic.go:334] "Generic (PLEG): container finished" podID="2c5ef244-d2dc-4cc0-bc15-8f542eb8a586" containerID="f14028f30cce353568585bc000ef4e43117603a9a88c8b3842b745a1a48ec067" exitCode=0 Mar 17 00:45:55 crc kubenswrapper[4755]: I0317 00:45:55.644031 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-h9477" event={"ID":"2c5ef244-d2dc-4cc0-bc15-8f542eb8a586","Type":"ContainerDied","Data":"f14028f30cce353568585bc000ef4e43117603a9a88c8b3842b745a1a48ec067"} Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.291938 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be10b2f6-c553-40d0-968d-e484111525bc" path="/var/lib/kubelet/pods/be10b2f6-c553-40d0-968d-e484111525bc/volumes" Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.659090 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7f4c59689b-88k7q" event={"ID":"3b12002d-6940-4bb5-83d0-86bc6add52f8","Type":"ContainerStarted","Data":"4878c6fedfd583b26c183a96cf890bb54665c443e20ef74c5c09fb930f2981b0"} Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.660381 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.665521 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-644dcb55b6-q7jd4" 
event={"ID":"180ff5f7-b121-458f-b938-d06977e1f610","Type":"ContainerStarted","Data":"9d500d820d304ce3f1d720ab213a4d6a390561511c5b7cb07c191c0da5b4fae5"} Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.666611 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.671175 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3bde-account-create-update-4xt5d" event={"ID":"f802ad4b-3ce3-451c-a5a3-99ca8a050644","Type":"ContainerStarted","Data":"aba64e1a808044f0a24814eda28fd0b2184327c9dd563b80e858e77591d387e5"} Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.675039 4755 generic.go:334] "Generic (PLEG): container finished" podID="6b7b092f-96bf-466c-a3ef-1867c502bb21" containerID="4d7c2fc610d049e57d23358d67ebaab1c37018a577a4b0ab55bb61339eeb62ff" exitCode=1 Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.675097 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" event={"ID":"6b7b092f-96bf-466c-a3ef-1867c502bb21","Type":"ContainerDied","Data":"4d7c2fc610d049e57d23358d67ebaab1c37018a577a4b0ab55bb61339eeb62ff"} Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.675775 4755 scope.go:117] "RemoveContainer" containerID="4d7c2fc610d049e57d23358d67ebaab1c37018a577a4b0ab55bb61339eeb62ff" Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.686957 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c","Type":"ContainerStarted","Data":"b5fe68aef2ccee6f7b271ff01002910729411001fa26880574d64cf0205ed4ff"} Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.694644 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7f4c59689b-88k7q" podStartSLOduration=8.781315139 podStartE2EDuration="9.694613909s" podCreationTimestamp="2026-03-17 00:45:47 +0000 
UTC" firstStartedPulling="2026-03-17 00:45:53.785984161 +0000 UTC m=+1428.545436444" lastFinishedPulling="2026-03-17 00:45:54.699282941 +0000 UTC m=+1429.458735214" observedRunningTime="2026-03-17 00:45:56.678386264 +0000 UTC m=+1431.437838547" watchObservedRunningTime="2026-03-17 00:45:56.694613909 +0000 UTC m=+1431.454066192" Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.710790 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dff3597a-93e6-4bb6-9508-c8f4609a75fc","Type":"ContainerStarted","Data":"07a33f66554e26734bd751a07f36dff76d88f3fcdbaf93488f9480e17db72b37"} Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.714244 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-3bde-account-create-update-4xt5d" podStartSLOduration=10.714224824 podStartE2EDuration="10.714224824s" podCreationTimestamp="2026-03-17 00:45:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:56.698898324 +0000 UTC m=+1431.458350607" watchObservedRunningTime="2026-03-17 00:45:56.714224824 +0000 UTC m=+1431.473677107" Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.715304 4755 generic.go:334] "Generic (PLEG): container finished" podID="e062269a-a4d6-43b8-b065-6d1694b386f8" containerID="367926496fd057dd273eee3f804864d883d78e22ce0d121bbe2fae54fd86ed94" exitCode=1 Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.715360 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6dc587f546-7lvlg" event={"ID":"e062269a-a4d6-43b8-b065-6d1694b386f8","Type":"ContainerDied","Data":"367926496fd057dd273eee3f804864d883d78e22ce0d121bbe2fae54fd86ed94"} Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.716242 4755 scope.go:117] "RemoveContainer" containerID="367926496fd057dd273eee3f804864d883d78e22ce0d121bbe2fae54fd86ed94" Mar 17 00:45:56 crc 
kubenswrapper[4755]: I0317 00:45:56.725413 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-644dcb55b6-q7jd4" podStartSLOduration=11.725396094 podStartE2EDuration="11.725396094s" podCreationTimestamp="2026-03-17 00:45:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:56.722132316 +0000 UTC m=+1431.481584599" watchObservedRunningTime="2026-03-17 00:45:56.725396094 +0000 UTC m=+1431.484848377"
Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.750966 4755 generic.go:334] "Generic (PLEG): container finished" podID="2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e" containerID="4c717171c7f61c51db6c1bfc2b3955f86a6fb1638a0d4e1689a6d097e29d1521" exitCode=0
Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.751057 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-16ab-account-create-update-jd78k" event={"ID":"2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e","Type":"ContainerDied","Data":"4c717171c7f61c51db6c1bfc2b3955f86a6fb1638a0d4e1689a6d097e29d1521"}
Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.756996 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b9b5bb667-6pk7q" event={"ID":"cfa93106-8e0c-4e7d-93cf-33d06c85d883","Type":"ContainerStarted","Data":"5b13ba7e2fcec99b4b6b5beed04a371b3a36e176c8179e5dff2752207ee7519e"}
Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.757068 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b9b5bb667-6pk7q" event={"ID":"cfa93106-8e0c-4e7d-93cf-33d06c85d883","Type":"ContainerStarted","Data":"3644dad671efed98f73b9876b1883d2875f7a3a5f0fb26867320478cf1dfe015"}
Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.759272 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5b9b5bb667-6pk7q"
Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.759321 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5b9b5bb667-6pk7q"
Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.763098 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-958555d6d-xlr8p" event={"ID":"cbba9375-8a42-4c43-9b62-0b2df2e89af1","Type":"ContainerStarted","Data":"e90ed2fc0f167a7ddfec4ecb03def223498e6aa9a1c3f7aa6dba3f6c4979e063"}
Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.763941 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-958555d6d-xlr8p"
Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.818626 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5b9b5bb667-6pk7q" podStartSLOduration=14.818607279 podStartE2EDuration="14.818607279s" podCreationTimestamp="2026-03-17 00:45:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:56.797532895 +0000 UTC m=+1431.556985188" watchObservedRunningTime="2026-03-17 00:45:56.818607279 +0000 UTC m=+1431.578059562"
Mar 17 00:45:56 crc kubenswrapper[4755]: I0317 00:45:56.828110 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-958555d6d-xlr8p" podStartSLOduration=9.828092484 podStartE2EDuration="9.828092484s" podCreationTimestamp="2026-03-17 00:45:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:56.821084156 +0000 UTC m=+1431.580536439" watchObservedRunningTime="2026-03-17 00:45:56.828092484 +0000 UTC m=+1431.587544767"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.423850 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-h9477"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.464368 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c5ef244-d2dc-4cc0-bc15-8f542eb8a586-operator-scripts\") pod \"2c5ef244-d2dc-4cc0-bc15-8f542eb8a586\" (UID: \"2c5ef244-d2dc-4cc0-bc15-8f542eb8a586\") "
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.464533 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hhrj\" (UniqueName: \"kubernetes.io/projected/2c5ef244-d2dc-4cc0-bc15-8f542eb8a586-kube-api-access-9hhrj\") pod \"2c5ef244-d2dc-4cc0-bc15-8f542eb8a586\" (UID: \"2c5ef244-d2dc-4cc0-bc15-8f542eb8a586\") "
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.466850 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c5ef244-d2dc-4cc0-bc15-8f542eb8a586-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c5ef244-d2dc-4cc0-bc15-8f542eb8a586" (UID: "2c5ef244-d2dc-4cc0-bc15-8f542eb8a586"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.497234 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c5ef244-d2dc-4cc0-bc15-8f542eb8a586-kube-api-access-9hhrj" (OuterVolumeSpecName: "kube-api-access-9hhrj") pod "2c5ef244-d2dc-4cc0-bc15-8f542eb8a586" (UID: "2c5ef244-d2dc-4cc0-bc15-8f542eb8a586"). InnerVolumeSpecName "kube-api-access-9hhrj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.566774 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c5ef244-d2dc-4cc0-bc15-8f542eb8a586-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.566808 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hhrj\" (UniqueName: \"kubernetes.io/projected/2c5ef244-d2dc-4cc0-bc15-8f542eb8a586-kube-api-access-9hhrj\") on node \"crc\" DevicePath \"\""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.626844 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hrkzh"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.663929 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-571c-account-create-update-vrrc6"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.670183 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv48d\" (UniqueName: \"kubernetes.io/projected/c3a41a76-23dc-404d-b304-19ad4221ce3d-kube-api-access-fv48d\") pod \"c3a41a76-23dc-404d-b304-19ad4221ce3d\" (UID: \"c3a41a76-23dc-404d-b304-19ad4221ce3d\") "
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.676812 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3a41a76-23dc-404d-b304-19ad4221ce3d-operator-scripts\") pod \"c3a41a76-23dc-404d-b304-19ad4221ce3d\" (UID: \"c3a41a76-23dc-404d-b304-19ad4221ce3d\") "
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.676851 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3a41a76-23dc-404d-b304-19ad4221ce3d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c3a41a76-23dc-404d-b304-19ad4221ce3d" (UID: "c3a41a76-23dc-404d-b304-19ad4221ce3d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.677754 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8bgbn"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.677744 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3a41a76-23dc-404d-b304-19ad4221ce3d-kube-api-access-fv48d" (OuterVolumeSpecName: "kube-api-access-fv48d") pod "c3a41a76-23dc-404d-b304-19ad4221ce3d" (UID: "c3a41a76-23dc-404d-b304-19ad4221ce3d"). InnerVolumeSpecName "kube-api-access-fv48d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.680042 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3a41a76-23dc-404d-b304-19ad4221ce3d-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.680067 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv48d\" (UniqueName: \"kubernetes.io/projected/c3a41a76-23dc-404d-b304-19ad4221ce3d-kube-api-access-fv48d\") on node \"crc\" DevicePath \"\""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.697086 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-df477f8d4-2cfqn"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.775668 4755 generic.go:334] "Generic (PLEG): container finished" podID="e062269a-a4d6-43b8-b065-6d1694b386f8" containerID="565e4ad6cddf14cdd48591cb4803decd6685f0238caee9493fa80a8d0d60e681" exitCode=1
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.775751 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6dc587f546-7lvlg" event={"ID":"e062269a-a4d6-43b8-b065-6d1694b386f8","Type":"ContainerDied","Data":"565e4ad6cddf14cdd48591cb4803decd6685f0238caee9493fa80a8d0d60e681"}
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.775786 4755 scope.go:117] "RemoveContainer" containerID="367926496fd057dd273eee3f804864d883d78e22ce0d121bbe2fae54fd86ed94"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.776626 4755 scope.go:117] "RemoveContainer" containerID="565e4ad6cddf14cdd48591cb4803decd6685f0238caee9493fa80a8d0d60e681"
Mar 17 00:45:57 crc kubenswrapper[4755]: E0317 00:45:57.777054 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6dc587f546-7lvlg_openstack(e062269a-a4d6-43b8-b065-6d1694b386f8)\"" pod="openstack/heat-api-6dc587f546-7lvlg" podUID="e062269a-a4d6-43b8-b065-6d1694b386f8"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.780788 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9lff\" (UniqueName: \"kubernetes.io/projected/3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6-kube-api-access-c9lff\") pod \"3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6\" (UID: \"3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6\") "
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.780892 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-config\") pod \"ab92471f-7493-4924-a5d7-c194d62e821f\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") "
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.780986 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-ovndb-tls-certs\") pod \"ab92471f-7493-4924-a5d7-c194d62e821f\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") "
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.781139 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-httpd-config\") pod \"ab92471f-7493-4924-a5d7-c194d62e821f\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") "
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.781296 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4htr\" (UniqueName: \"kubernetes.io/projected/ab92471f-7493-4924-a5d7-c194d62e821f-kube-api-access-l4htr\") pod \"ab92471f-7493-4924-a5d7-c194d62e821f\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") "
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.781508 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-combined-ca-bundle\") pod \"ab92471f-7493-4924-a5d7-c194d62e821f\" (UID: \"ab92471f-7493-4924-a5d7-c194d62e821f\") "
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.781634 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3811ee16-75c3-4ca3-9648-d3c9d5f8b028-operator-scripts\") pod \"3811ee16-75c3-4ca3-9648-d3c9d5f8b028\" (UID: \"3811ee16-75c3-4ca3-9648-d3c9d5f8b028\") "
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.781758 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6-operator-scripts\") pod \"3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6\" (UID: \"3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6\") "
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.781906 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk25f\" (UniqueName: \"kubernetes.io/projected/3811ee16-75c3-4ca3-9648-d3c9d5f8b028-kube-api-access-qk25f\") pod \"3811ee16-75c3-4ca3-9648-d3c9d5f8b028\" (UID: \"3811ee16-75c3-4ca3-9648-d3c9d5f8b028\") "
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.785688 4755 generic.go:334] "Generic (PLEG): container finished" podID="f802ad4b-3ce3-451c-a5a3-99ca8a050644" containerID="aba64e1a808044f0a24814eda28fd0b2184327c9dd563b80e858e77591d387e5" exitCode=0
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.785894 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3bde-account-create-update-4xt5d" event={"ID":"f802ad4b-3ce3-451c-a5a3-99ca8a050644","Type":"ContainerDied","Data":"aba64e1a808044f0a24814eda28fd0b2184327c9dd563b80e858e77591d387e5"}
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.785967 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3811ee16-75c3-4ca3-9648-d3c9d5f8b028-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3811ee16-75c3-4ca3-9648-d3c9d5f8b028" (UID: "3811ee16-75c3-4ca3-9648-d3c9d5f8b028"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.786214 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6" (UID: "3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.797103 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hrkzh" event={"ID":"c3a41a76-23dc-404d-b304-19ad4221ce3d","Type":"ContainerDied","Data":"96fdbe6194240c8eccad50b34902169bf119200da58d3bae17fc1a90daea97c0"}
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.797143 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96fdbe6194240c8eccad50b34902169bf119200da58d3bae17fc1a90daea97c0"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.797195 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hrkzh"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.799179 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab92471f-7493-4924-a5d7-c194d62e821f-kube-api-access-l4htr" (OuterVolumeSpecName: "kube-api-access-l4htr") pod "ab92471f-7493-4924-a5d7-c194d62e821f" (UID: "ab92471f-7493-4924-a5d7-c194d62e821f"). InnerVolumeSpecName "kube-api-access-l4htr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.801675 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ab92471f-7493-4924-a5d7-c194d62e821f" (UID: "ab92471f-7493-4924-a5d7-c194d62e821f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.801772 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6-kube-api-access-c9lff" (OuterVolumeSpecName: "kube-api-access-c9lff") pod "3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6" (UID: "3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6"). InnerVolumeSpecName "kube-api-access-c9lff". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.801798 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3811ee16-75c3-4ca3-9648-d3c9d5f8b028-kube-api-access-qk25f" (OuterVolumeSpecName: "kube-api-access-qk25f") pod "3811ee16-75c3-4ca3-9648-d3c9d5f8b028" (UID: "3811ee16-75c3-4ca3-9648-d3c9d5f8b028"). InnerVolumeSpecName "kube-api-access-qk25f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.817077 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c","Type":"ContainerStarted","Data":"15df363e51ac11bc797eeec0e250cf250006e92f5aef5cb01762044c0874fbac"}
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.849789 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8bgbn" event={"ID":"3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6","Type":"ContainerDied","Data":"da7df08fd372eda920d9afdab8b5b57f4a37c4e445041e7628264c2f8c7055ff"}
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.849833 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da7df08fd372eda920d9afdab8b5b57f4a37c4e445041e7628264c2f8c7055ff"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.849891 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8bgbn"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.855830 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-571c-account-create-update-vrrc6" event={"ID":"3811ee16-75c3-4ca3-9648-d3c9d5f8b028","Type":"ContainerDied","Data":"ed9fe0dbe63d5749d16e5ab2103d82e8f1d41305dbfb583255d31fc7251782f1"}
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.855859 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed9fe0dbe63d5749d16e5ab2103d82e8f1d41305dbfb583255d31fc7251782f1"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.855841 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-571c-account-create-update-vrrc6"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.858645 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-config" (OuterVolumeSpecName: "config") pod "ab92471f-7493-4924-a5d7-c194d62e821f" (UID: "ab92471f-7493-4924-a5d7-c194d62e821f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.865730 4755 generic.go:334] "Generic (PLEG): container finished" podID="ab92471f-7493-4924-a5d7-c194d62e821f" containerID="317b054ee6ff7b48069e0a2dc3ddaa653d4ac7a15ea409f9ca61c69c8644f611" exitCode=0
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.865928 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-df477f8d4-2cfqn" event={"ID":"ab92471f-7493-4924-a5d7-c194d62e821f","Type":"ContainerDied","Data":"317b054ee6ff7b48069e0a2dc3ddaa653d4ac7a15ea409f9ca61c69c8644f611"}
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.866033 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-df477f8d4-2cfqn" event={"ID":"ab92471f-7493-4924-a5d7-c194d62e821f","Type":"ContainerDied","Data":"36af6a2b8d72f78a16c5a90b9e7851607dc9aaee285458654617420c2ca2a353"}
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.866873 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-df477f8d4-2cfqn"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.877670 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" event={"ID":"6b7b092f-96bf-466c-a3ef-1867c502bb21","Type":"ContainerStarted","Data":"4640484a7e478c0816293433342039390f68862a473fc60b302a0d49d023e860"}
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.878349 4755 scope.go:117] "RemoveContainer" containerID="4640484a7e478c0816293433342039390f68862a473fc60b302a0d49d023e860"
Mar 17 00:45:57 crc kubenswrapper[4755]: E0317 00:45:57.878833 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-7cf59db8d9-lkrks_openstack(6b7b092f-96bf-466c-a3ef-1867c502bb21)\"" pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" podUID="6b7b092f-96bf-466c-a3ef-1867c502bb21"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.883862 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.883882 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk25f\" (UniqueName: \"kubernetes.io/projected/3811ee16-75c3-4ca3-9648-d3c9d5f8b028-kube-api-access-qk25f\") on node \"crc\" DevicePath \"\""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.883892 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9lff\" (UniqueName: \"kubernetes.io/projected/3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6-kube-api-access-c9lff\") on node \"crc\" DevicePath \"\""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.883901 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-config\") on node \"crc\" DevicePath \"\""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.883910 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.883918 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4htr\" (UniqueName: \"kubernetes.io/projected/ab92471f-7493-4924-a5d7-c194d62e821f-kube-api-access-l4htr\") on node \"crc\" DevicePath \"\""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.883926 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3811ee16-75c3-4ca3-9648-d3c9d5f8b028-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.905871 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dff3597a-93e6-4bb6-9508-c8f4609a75fc","Type":"ContainerStarted","Data":"f2d36aaa731735484c2093ebcbc46cfcd3454a87d1aaad8ac174c0b2b9664aff"}
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.906613 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.914908 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab92471f-7493-4924-a5d7-c194d62e821f" (UID: "ab92471f-7493-4924-a5d7-c194d62e821f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.928238 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-h9477"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.928533 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-h9477" event={"ID":"2c5ef244-d2dc-4cc0-bc15-8f542eb8a586","Type":"ContainerDied","Data":"12cdec8df65f9749a5fc56c0f0fd1a693354101260700e2c40ba282dca16af71"}
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.928811 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12cdec8df65f9749a5fc56c0f0fd1a693354101260700e2c40ba282dca16af71"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.938802 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.93878658 podStartE2EDuration="5.93878658s" podCreationTimestamp="2026-03-17 00:45:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:45:57.938069501 +0000 UTC m=+1432.697521804" watchObservedRunningTime="2026-03-17 00:45:57.93878658 +0000 UTC m=+1432.698238863"
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.985531 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ab92471f-7493-4924-a5d7-c194d62e821f" (UID: "ab92471f-7493-4924-a5d7-c194d62e821f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.987201 4755 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 17 00:45:57 crc kubenswrapper[4755]: I0317 00:45:57.987227 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab92471f-7493-4924-a5d7-c194d62e821f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.082919 4755 scope.go:117] "RemoveContainer" containerID="b18159828b0ef85d52a55272de389f2449a3c17f45057e6a855ea454c1f3650f"
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.134528 4755 scope.go:117] "RemoveContainer" containerID="317b054ee6ff7b48069e0a2dc3ddaa653d4ac7a15ea409f9ca61c69c8644f611"
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.166128 4755 scope.go:117] "RemoveContainer" containerID="b18159828b0ef85d52a55272de389f2449a3c17f45057e6a855ea454c1f3650f"
Mar 17 00:45:58 crc kubenswrapper[4755]: E0317 00:45:58.167599 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b18159828b0ef85d52a55272de389f2449a3c17f45057e6a855ea454c1f3650f\": container with ID starting with b18159828b0ef85d52a55272de389f2449a3c17f45057e6a855ea454c1f3650f not found: ID does not exist" containerID="b18159828b0ef85d52a55272de389f2449a3c17f45057e6a855ea454c1f3650f"
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.167649 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b18159828b0ef85d52a55272de389f2449a3c17f45057e6a855ea454c1f3650f"} err="failed to get container status \"b18159828b0ef85d52a55272de389f2449a3c17f45057e6a855ea454c1f3650f\": rpc error: code = NotFound desc = could not find container \"b18159828b0ef85d52a55272de389f2449a3c17f45057e6a855ea454c1f3650f\": container with ID starting with b18159828b0ef85d52a55272de389f2449a3c17f45057e6a855ea454c1f3650f not found: ID does not exist"
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.167671 4755 scope.go:117] "RemoveContainer" containerID="317b054ee6ff7b48069e0a2dc3ddaa653d4ac7a15ea409f9ca61c69c8644f611"
Mar 17 00:45:58 crc kubenswrapper[4755]: E0317 00:45:58.170297 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"317b054ee6ff7b48069e0a2dc3ddaa653d4ac7a15ea409f9ca61c69c8644f611\": container with ID starting with 317b054ee6ff7b48069e0a2dc3ddaa653d4ac7a15ea409f9ca61c69c8644f611 not found: ID does not exist" containerID="317b054ee6ff7b48069e0a2dc3ddaa653d4ac7a15ea409f9ca61c69c8644f611"
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.170329 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"317b054ee6ff7b48069e0a2dc3ddaa653d4ac7a15ea409f9ca61c69c8644f611"} err="failed to get container status \"317b054ee6ff7b48069e0a2dc3ddaa653d4ac7a15ea409f9ca61c69c8644f611\": rpc error: code = NotFound desc = could not find container \"317b054ee6ff7b48069e0a2dc3ddaa653d4ac7a15ea409f9ca61c69c8644f611\": container with ID starting with 317b054ee6ff7b48069e0a2dc3ddaa653d4ac7a15ea409f9ca61c69c8644f611 not found: ID does not exist"
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.225900 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-df477f8d4-2cfqn"]
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.292100 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-df477f8d4-2cfqn"]
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.398827 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-16ab-account-create-update-jd78k"
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.497491 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e-operator-scripts\") pod \"2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e\" (UID: \"2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e\") "
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.497540 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svxkb\" (UniqueName: \"kubernetes.io/projected/2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e-kube-api-access-svxkb\") pod \"2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e\" (UID: \"2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e\") "
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.499184 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e" (UID: "2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.501850 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e-kube-api-access-svxkb" (OuterVolumeSpecName: "kube-api-access-svxkb") pod "2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e" (UID: "2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e"). InnerVolumeSpecName "kube-api-access-svxkb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.600538 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.600568 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svxkb\" (UniqueName: \"kubernetes.io/projected/2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e-kube-api-access-svxkb\") on node \"crc\" DevicePath \"\""
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.665767 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.665825 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.936077 4755 scope.go:117] "RemoveContainer" containerID="565e4ad6cddf14cdd48591cb4803decd6685f0238caee9493fa80a8d0d60e681"
Mar 17 00:45:58 crc kubenswrapper[4755]: E0317 00:45:58.936345 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6dc587f546-7lvlg_openstack(e062269a-a4d6-43b8-b065-6d1694b386f8)\"" pod="openstack/heat-api-6dc587f546-7lvlg" podUID="e062269a-a4d6-43b8-b065-6d1694b386f8"
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.937106 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-16ab-account-create-update-jd78k"
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.937117 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-16ab-account-create-update-jd78k" event={"ID":"2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e","Type":"ContainerDied","Data":"8d2e4264431ee02b74ce01f0079e83a8f082a6f8a8fe0ea40b00d34868703cbf"}
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.937232 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d2e4264431ee02b74ce01f0079e83a8f082a6f8a8fe0ea40b00d34868703cbf"
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.940725 4755 generic.go:334] "Generic (PLEG): container finished" podID="6b7b092f-96bf-466c-a3ef-1867c502bb21" containerID="4640484a7e478c0816293433342039390f68862a473fc60b302a0d49d023e860" exitCode=1
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.940771 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" event={"ID":"6b7b092f-96bf-466c-a3ef-1867c502bb21","Type":"ContainerDied","Data":"4640484a7e478c0816293433342039390f68862a473fc60b302a0d49d023e860"}
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.940794 4755 scope.go:117] "RemoveContainer" containerID="4d7c2fc610d049e57d23358d67ebaab1c37018a577a4b0ab55bb61339eeb62ff"
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.941416 4755 scope.go:117] "RemoveContainer" containerID="4640484a7e478c0816293433342039390f68862a473fc60b302a0d49d023e860"
Mar 17 00:45:58 crc kubenswrapper[4755]: E0317 00:45:58.941633 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-7cf59db8d9-lkrks_openstack(6b7b092f-96bf-466c-a3ef-1867c502bb21)\"" pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" podUID="6b7b092f-96bf-466c-a3ef-1867c502bb21"
Mar 17 00:45:58 crc kubenswrapper[4755]: I0317 00:45:58.945258 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c","Type":"ContainerStarted","Data":"dfd018718c87ba95527da66d356be22b96440a4e9ffeee7c8236b931d86dcec2"}
Mar 17 00:45:59 crc kubenswrapper[4755]: I0317 00:45:59.120761 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5db85cdfc7-s8bqf"
Mar 17 00:45:59 crc kubenswrapper[4755]: I0317 00:45:59.487978 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3bde-account-create-update-4xt5d"
Mar 17 00:45:59 crc kubenswrapper[4755]: I0317 00:45:59.629202 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6m4z\" (UniqueName: \"kubernetes.io/projected/f802ad4b-3ce3-451c-a5a3-99ca8a050644-kube-api-access-k6m4z\") pod \"f802ad4b-3ce3-451c-a5a3-99ca8a050644\" (UID: \"f802ad4b-3ce3-451c-a5a3-99ca8a050644\") "
Mar 17 00:45:59 crc kubenswrapper[4755]: I0317 00:45:59.629593 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f802ad4b-3ce3-451c-a5a3-99ca8a050644-operator-scripts\") pod \"f802ad4b-3ce3-451c-a5a3-99ca8a050644\" (UID: \"f802ad4b-3ce3-451c-a5a3-99ca8a050644\") "
Mar 17 00:45:59 crc kubenswrapper[4755]: I0317 00:45:59.630278 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f802ad4b-3ce3-451c-a5a3-99ca8a050644-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f802ad4b-3ce3-451c-a5a3-99ca8a050644" (UID: "f802ad4b-3ce3-451c-a5a3-99ca8a050644"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 00:45:59 crc kubenswrapper[4755]: I0317 00:45:59.630595 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f802ad4b-3ce3-451c-a5a3-99ca8a050644-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 00:45:59 crc kubenswrapper[4755]: I0317 00:45:59.638638 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f802ad4b-3ce3-451c-a5a3-99ca8a050644-kube-api-access-k6m4z" (OuterVolumeSpecName: "kube-api-access-k6m4z") pod "f802ad4b-3ce3-451c-a5a3-99ca8a050644" (UID: "f802ad4b-3ce3-451c-a5a3-99ca8a050644"). InnerVolumeSpecName "kube-api-access-k6m4z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:45:59 crc kubenswrapper[4755]: I0317 00:45:59.733141 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6m4z\" (UniqueName: \"kubernetes.io/projected/f802ad4b-3ce3-451c-a5a3-99ca8a050644-kube-api-access-k6m4z\") on node \"crc\" DevicePath \"\""
Mar 17 00:45:59 crc kubenswrapper[4755]: I0317 00:45:59.955473 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3bde-account-create-update-4xt5d" event={"ID":"f802ad4b-3ce3-451c-a5a3-99ca8a050644","Type":"ContainerDied","Data":"0ddf119a2571f95e3d8af1330edfbcdc09cf85b0303b9b1bc1d255c111bfac65"}
Mar 17 00:45:59 crc kubenswrapper[4755]: I0317 00:45:59.955509 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ddf119a2571f95e3d8af1330edfbcdc09cf85b0303b9b1bc1d255c111bfac65"
Mar 17 00:45:59 crc kubenswrapper[4755]: I0317 00:45:59.955557 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3bde-account-create-update-4xt5d"
Mar 17 00:45:59 crc kubenswrapper[4755]: I0317 00:45:59.960226 4755 scope.go:117] "RemoveContainer" containerID="4640484a7e478c0816293433342039390f68862a473fc60b302a0d49d023e860"
Mar 17 00:45:59 crc kubenswrapper[4755]: E0317 00:45:59.960529 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-7cf59db8d9-lkrks_openstack(6b7b092f-96bf-466c-a3ef-1867c502bb21)\"" pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" podUID="6b7b092f-96bf-466c-a3ef-1867c502bb21"
Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.206937 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561806-v78jd"]
Mar 17 00:46:00 crc kubenswrapper[4755]: E0317 00:46:00.207556 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f802ad4b-3ce3-451c-a5a3-99ca8a050644" containerName="mariadb-account-create-update"
Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.207579 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f802ad4b-3ce3-451c-a5a3-99ca8a050644" containerName="mariadb-account-create-update"
Mar 17 00:46:00 crc kubenswrapper[4755]: E0317 00:46:00.207616 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be10b2f6-c553-40d0-968d-e484111525bc" containerName="heat-cfnapi"
Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.207625 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="be10b2f6-c553-40d0-968d-e484111525bc" containerName="heat-cfnapi"
Mar 17 00:46:00 crc kubenswrapper[4755]: E0317 00:46:00.207653 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e" containerName="mariadb-account-create-update"
Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.207660 4755 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e" containerName="mariadb-account-create-update" Mar 17 00:46:00 crc kubenswrapper[4755]: E0317 00:46:00.207679 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab92471f-7493-4924-a5d7-c194d62e821f" containerName="neutron-api" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.207687 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab92471f-7493-4924-a5d7-c194d62e821f" containerName="neutron-api" Mar 17 00:46:00 crc kubenswrapper[4755]: E0317 00:46:00.207700 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3811ee16-75c3-4ca3-9648-d3c9d5f8b028" containerName="mariadb-account-create-update" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.207707 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3811ee16-75c3-4ca3-9648-d3c9d5f8b028" containerName="mariadb-account-create-update" Mar 17 00:46:00 crc kubenswrapper[4755]: E0317 00:46:00.207716 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6" containerName="mariadb-database-create" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.207723 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6" containerName="mariadb-database-create" Mar 17 00:46:00 crc kubenswrapper[4755]: E0317 00:46:00.207735 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c5ef244-d2dc-4cc0-bc15-8f542eb8a586" containerName="mariadb-database-create" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.207743 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c5ef244-d2dc-4cc0-bc15-8f542eb8a586" containerName="mariadb-database-create" Mar 17 00:46:00 crc kubenswrapper[4755]: E0317 00:46:00.207758 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab92471f-7493-4924-a5d7-c194d62e821f" containerName="neutron-httpd" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 
00:46:00.207765 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab92471f-7493-4924-a5d7-c194d62e821f" containerName="neutron-httpd" Mar 17 00:46:00 crc kubenswrapper[4755]: E0317 00:46:00.207780 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3a41a76-23dc-404d-b304-19ad4221ce3d" containerName="mariadb-database-create" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.207787 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3a41a76-23dc-404d-b304-19ad4221ce3d" containerName="mariadb-database-create" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.208039 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f802ad4b-3ce3-451c-a5a3-99ca8a050644" containerName="mariadb-account-create-update" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.208060 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab92471f-7493-4924-a5d7-c194d62e821f" containerName="neutron-api" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.208075 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="be10b2f6-c553-40d0-968d-e484111525bc" containerName="heat-cfnapi" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.208090 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab92471f-7493-4924-a5d7-c194d62e821f" containerName="neutron-httpd" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.208101 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e" containerName="mariadb-account-create-update" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.208117 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3a41a76-23dc-404d-b304-19ad4221ce3d" containerName="mariadb-database-create" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.208127 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3811ee16-75c3-4ca3-9648-d3c9d5f8b028" 
containerName="mariadb-account-create-update" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.208139 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6" containerName="mariadb-database-create" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.208153 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c5ef244-d2dc-4cc0-bc15-8f542eb8a586" containerName="mariadb-database-create" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.266537 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561806-v78jd" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.277100 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.277307 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.277481 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.324143 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab92471f-7493-4924-a5d7-c194d62e821f" path="/var/lib/kubelet/pods/ab92471f-7493-4924-a5d7-c194d62e821f/volumes" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.324776 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561806-v78jd"] Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.451771 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns9hk\" (UniqueName: \"kubernetes.io/projected/7f4650d5-3d6d-4437-9b7b-f585de970b8f-kube-api-access-ns9hk\") pod \"auto-csr-approver-29561806-v78jd\" (UID: \"7f4650d5-3d6d-4437-9b7b-f585de970b8f\") " 
pod="openshift-infra/auto-csr-approver-29561806-v78jd" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.554225 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns9hk\" (UniqueName: \"kubernetes.io/projected/7f4650d5-3d6d-4437-9b7b-f585de970b8f-kube-api-access-ns9hk\") pod \"auto-csr-approver-29561806-v78jd\" (UID: \"7f4650d5-3d6d-4437-9b7b-f585de970b8f\") " pod="openshift-infra/auto-csr-approver-29561806-v78jd" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.598253 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns9hk\" (UniqueName: \"kubernetes.io/projected/7f4650d5-3d6d-4437-9b7b-f585de970b8f-kube-api-access-ns9hk\") pod \"auto-csr-approver-29561806-v78jd\" (UID: \"7f4650d5-3d6d-4437-9b7b-f585de970b8f\") " pod="openshift-infra/auto-csr-approver-29561806-v78jd" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.677785 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561806-v78jd" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.788984 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.789280 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.790203 4755 scope.go:117] "RemoveContainer" containerID="565e4ad6cddf14cdd48591cb4803decd6685f0238caee9493fa80a8d0d60e681" Mar 17 00:46:00 crc kubenswrapper[4755]: E0317 00:46:00.790480 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6dc587f546-7lvlg_openstack(e062269a-a4d6-43b8-b065-6d1694b386f8)\"" pod="openstack/heat-api-6dc587f546-7lvlg" 
podUID="e062269a-a4d6-43b8-b065-6d1694b386f8" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.828825 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.828886 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.982201 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c","Type":"ContainerStarted","Data":"5ce0b2a81577b7b47275a655fb6617b5f0a5b0549eb9f2f517f2e1d599fdfebb"} Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.982700 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 00:46:00 crc kubenswrapper[4755]: I0317 00:46:00.982925 4755 scope.go:117] "RemoveContainer" containerID="4640484a7e478c0816293433342039390f68862a473fc60b302a0d49d023e860" Mar 17 00:46:00 crc kubenswrapper[4755]: E0317 00:46:00.983234 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-7cf59db8d9-lkrks_openstack(6b7b092f-96bf-466c-a3ef-1867c502bb21)\"" pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" podUID="6b7b092f-96bf-466c-a3ef-1867c502bb21" Mar 17 00:46:01 crc kubenswrapper[4755]: I0317 00:46:01.021277 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.039287308 podStartE2EDuration="8.021255214s" podCreationTimestamp="2026-03-17 00:45:53 +0000 UTC" firstStartedPulling="2026-03-17 00:45:55.133632533 +0000 UTC m=+1429.893084816" lastFinishedPulling="2026-03-17 00:46:00.115600439 +0000 UTC m=+1434.875052722" observedRunningTime="2026-03-17 00:46:01.010055284 +0000 UTC m=+1435.769507567" 
watchObservedRunningTime="2026-03-17 00:46:01.021255214 +0000 UTC m=+1435.780707517" Mar 17 00:46:01 crc kubenswrapper[4755]: I0317 00:46:01.048333 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:01 crc kubenswrapper[4755]: I0317 00:46:01.129383 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561806-v78jd"] Mar 17 00:46:01 crc kubenswrapper[4755]: W0317 00:46:01.130182 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f4650d5_3d6d_4437_9b7b_f585de970b8f.slice/crio-08940f828e0dd523a3c4c174f8a1b6bab6d0f34210aeb17fe78de37c95e9db30 WatchSource:0}: Error finding container 08940f828e0dd523a3c4c174f8a1b6bab6d0f34210aeb17fe78de37c95e9db30: Status 404 returned error can't find the container with id 08940f828e0dd523a3c4c174f8a1b6bab6d0f34210aeb17fe78de37c95e9db30 Mar 17 00:46:01 crc kubenswrapper[4755]: I0317 00:46:01.957789 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r6dbj"] Mar 17 00:46:01 crc kubenswrapper[4755]: I0317 00:46:01.966623 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r6dbj" Mar 17 00:46:01 crc kubenswrapper[4755]: I0317 00:46:01.970156 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-ls4gf" Mar 17 00:46:01 crc kubenswrapper[4755]: I0317 00:46:01.970312 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 17 00:46:01 crc kubenswrapper[4755]: I0317 00:46:01.970409 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 17 00:46:01 crc kubenswrapper[4755]: I0317 00:46:01.980002 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r6dbj"] Mar 17 00:46:02 crc kubenswrapper[4755]: I0317 00:46:02.005237 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561806-v78jd" event={"ID":"7f4650d5-3d6d-4437-9b7b-f585de970b8f","Type":"ContainerStarted","Data":"08940f828e0dd523a3c4c174f8a1b6bab6d0f34210aeb17fe78de37c95e9db30"} Mar 17 00:46:02 crc kubenswrapper[4755]: I0317 00:46:02.115543 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-745rm\" (UniqueName: \"kubernetes.io/projected/cce90939-ded2-4efa-90c9-c74df06b5bcd-kube-api-access-745rm\") pod \"nova-cell0-conductor-db-sync-r6dbj\" (UID: \"cce90939-ded2-4efa-90c9-c74df06b5bcd\") " pod="openstack/nova-cell0-conductor-db-sync-r6dbj" Mar 17 00:46:02 crc kubenswrapper[4755]: I0317 00:46:02.115607 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-config-data\") pod \"nova-cell0-conductor-db-sync-r6dbj\" (UID: \"cce90939-ded2-4efa-90c9-c74df06b5bcd\") " pod="openstack/nova-cell0-conductor-db-sync-r6dbj" Mar 17 00:46:02 crc kubenswrapper[4755]: I0317 00:46:02.115643 
4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-r6dbj\" (UID: \"cce90939-ded2-4efa-90c9-c74df06b5bcd\") " pod="openstack/nova-cell0-conductor-db-sync-r6dbj" Mar 17 00:46:02 crc kubenswrapper[4755]: I0317 00:46:02.116015 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-scripts\") pod \"nova-cell0-conductor-db-sync-r6dbj\" (UID: \"cce90939-ded2-4efa-90c9-c74df06b5bcd\") " pod="openstack/nova-cell0-conductor-db-sync-r6dbj" Mar 17 00:46:02 crc kubenswrapper[4755]: I0317 00:46:02.218055 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-scripts\") pod \"nova-cell0-conductor-db-sync-r6dbj\" (UID: \"cce90939-ded2-4efa-90c9-c74df06b5bcd\") " pod="openstack/nova-cell0-conductor-db-sync-r6dbj" Mar 17 00:46:02 crc kubenswrapper[4755]: I0317 00:46:02.218351 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-745rm\" (UniqueName: \"kubernetes.io/projected/cce90939-ded2-4efa-90c9-c74df06b5bcd-kube-api-access-745rm\") pod \"nova-cell0-conductor-db-sync-r6dbj\" (UID: \"cce90939-ded2-4efa-90c9-c74df06b5bcd\") " pod="openstack/nova-cell0-conductor-db-sync-r6dbj" Mar 17 00:46:02 crc kubenswrapper[4755]: I0317 00:46:02.218387 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-config-data\") pod \"nova-cell0-conductor-db-sync-r6dbj\" (UID: \"cce90939-ded2-4efa-90c9-c74df06b5bcd\") " pod="openstack/nova-cell0-conductor-db-sync-r6dbj" Mar 17 00:46:02 crc kubenswrapper[4755]: 
I0317 00:46:02.218417 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-r6dbj\" (UID: \"cce90939-ded2-4efa-90c9-c74df06b5bcd\") " pod="openstack/nova-cell0-conductor-db-sync-r6dbj" Mar 17 00:46:02 crc kubenswrapper[4755]: I0317 00:46:02.223850 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-r6dbj\" (UID: \"cce90939-ded2-4efa-90c9-c74df06b5bcd\") " pod="openstack/nova-cell0-conductor-db-sync-r6dbj" Mar 17 00:46:02 crc kubenswrapper[4755]: I0317 00:46:02.224069 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-config-data\") pod \"nova-cell0-conductor-db-sync-r6dbj\" (UID: \"cce90939-ded2-4efa-90c9-c74df06b5bcd\") " pod="openstack/nova-cell0-conductor-db-sync-r6dbj" Mar 17 00:46:02 crc kubenswrapper[4755]: I0317 00:46:02.224870 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-scripts\") pod \"nova-cell0-conductor-db-sync-r6dbj\" (UID: \"cce90939-ded2-4efa-90c9-c74df06b5bcd\") " pod="openstack/nova-cell0-conductor-db-sync-r6dbj" Mar 17 00:46:02 crc kubenswrapper[4755]: I0317 00:46:02.235992 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-745rm\" (UniqueName: \"kubernetes.io/projected/cce90939-ded2-4efa-90c9-c74df06b5bcd-kube-api-access-745rm\") pod \"nova-cell0-conductor-db-sync-r6dbj\" (UID: \"cce90939-ded2-4efa-90c9-c74df06b5bcd\") " pod="openstack/nova-cell0-conductor-db-sync-r6dbj" Mar 17 00:46:02 crc kubenswrapper[4755]: I0317 00:46:02.291533 4755 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r6dbj" Mar 17 00:46:02 crc kubenswrapper[4755]: I0317 00:46:02.417291 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:46:02 crc kubenswrapper[4755]: I0317 00:46:02.418782 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5b9b5bb667-6pk7q" Mar 17 00:46:02 crc kubenswrapper[4755]: I0317 00:46:02.903214 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r6dbj"] Mar 17 00:46:02 crc kubenswrapper[4755]: W0317 00:46:02.907072 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcce90939_ded2_4efa_90c9_c74df06b5bcd.slice/crio-07409b3dbb60c122a4aaa96e4424fece28650034afef1e269fc198ed06980776 WatchSource:0}: Error finding container 07409b3dbb60c122a4aaa96e4424fece28650034afef1e269fc198ed06980776: Status 404 returned error can't find the container with id 07409b3dbb60c122a4aaa96e4424fece28650034afef1e269fc198ed06980776 Mar 17 00:46:03 crc kubenswrapper[4755]: I0317 00:46:03.007484 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561806-v78jd" event={"ID":"7f4650d5-3d6d-4437-9b7b-f585de970b8f","Type":"ContainerStarted","Data":"40b22a1015e98904677fbabe7e3e16b1a5b5c36f7445375b29470cb8aae12ffa"} Mar 17 00:46:03 crc kubenswrapper[4755]: I0317 00:46:03.008864 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r6dbj" event={"ID":"cce90939-ded2-4efa-90c9-c74df06b5bcd","Type":"ContainerStarted","Data":"07409b3dbb60c122a4aaa96e4424fece28650034afef1e269fc198ed06980776"} Mar 17 00:46:03 crc kubenswrapper[4755]: I0317 00:46:03.009010 4755 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerName="ceilometer-central-agent" containerID="cri-o://b5fe68aef2ccee6f7b271ff01002910729411001fa26880574d64cf0205ed4ff" gracePeriod=30 Mar 17 00:46:03 crc kubenswrapper[4755]: I0317 00:46:03.009043 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerName="sg-core" containerID="cri-o://dfd018718c87ba95527da66d356be22b96440a4e9ffeee7c8236b931d86dcec2" gracePeriod=30 Mar 17 00:46:03 crc kubenswrapper[4755]: I0317 00:46:03.009062 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerName="proxy-httpd" containerID="cri-o://5ce0b2a81577b7b47275a655fb6617b5f0a5b0549eb9f2f517f2e1d599fdfebb" gracePeriod=30 Mar 17 00:46:03 crc kubenswrapper[4755]: I0317 00:46:03.009074 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerName="ceilometer-notification-agent" containerID="cri-o://15df363e51ac11bc797eeec0e250cf250006e92f5aef5cb01762044c0874fbac" gracePeriod=30 Mar 17 00:46:03 crc kubenswrapper[4755]: I0317 00:46:03.031908 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561806-v78jd" podStartSLOduration=2.063621951 podStartE2EDuration="3.031891762s" podCreationTimestamp="2026-03-17 00:46:00 +0000 UTC" firstStartedPulling="2026-03-17 00:46:01.134072876 +0000 UTC m=+1435.893525149" lastFinishedPulling="2026-03-17 00:46:02.102342677 +0000 UTC m=+1436.861794960" observedRunningTime="2026-03-17 00:46:03.021676689 +0000 UTC m=+1437.781128972" watchObservedRunningTime="2026-03-17 00:46:03.031891762 +0000 UTC m=+1437.791344045" Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.028157 4755 generic.go:334] "Generic (PLEG): 
container finished" podID="7f4650d5-3d6d-4437-9b7b-f585de970b8f" containerID="40b22a1015e98904677fbabe7e3e16b1a5b5c36f7445375b29470cb8aae12ffa" exitCode=0 Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.028622 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561806-v78jd" event={"ID":"7f4650d5-3d6d-4437-9b7b-f585de970b8f","Type":"ContainerDied","Data":"40b22a1015e98904677fbabe7e3e16b1a5b5c36f7445375b29470cb8aae12ffa"} Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.045791 4755 generic.go:334] "Generic (PLEG): container finished" podID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerID="5ce0b2a81577b7b47275a655fb6617b5f0a5b0549eb9f2f517f2e1d599fdfebb" exitCode=0 Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.045831 4755 generic.go:334] "Generic (PLEG): container finished" podID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerID="dfd018718c87ba95527da66d356be22b96440a4e9ffeee7c8236b931d86dcec2" exitCode=2 Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.045842 4755 generic.go:334] "Generic (PLEG): container finished" podID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerID="15df363e51ac11bc797eeec0e250cf250006e92f5aef5cb01762044c0874fbac" exitCode=0 Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.045851 4755 generic.go:334] "Generic (PLEG): container finished" podID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerID="b5fe68aef2ccee6f7b271ff01002910729411001fa26880574d64cf0205ed4ff" exitCode=0 Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.045888 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c","Type":"ContainerDied","Data":"5ce0b2a81577b7b47275a655fb6617b5f0a5b0549eb9f2f517f2e1d599fdfebb"} Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.045947 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c","Type":"ContainerDied","Data":"dfd018718c87ba95527da66d356be22b96440a4e9ffeee7c8236b931d86dcec2"} Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.045960 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c","Type":"ContainerDied","Data":"15df363e51ac11bc797eeec0e250cf250006e92f5aef5cb01762044c0874fbac"} Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.045971 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c","Type":"ContainerDied","Data":"b5fe68aef2ccee6f7b271ff01002910729411001fa26880574d64cf0205ed4ff"} Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.155725 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.262586 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-combined-ca-bundle\") pod \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.262660 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-config-data\") pod \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.262690 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-scripts\") pod \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " Mar 17 00:46:04 crc kubenswrapper[4755]: 
I0317 00:46:04.262717 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-run-httpd\") pod \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.262804 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-sg-core-conf-yaml\") pod \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.262839 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-log-httpd\") pod \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.262904 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fhrk\" (UniqueName: \"kubernetes.io/projected/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-kube-api-access-2fhrk\") pod \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\" (UID: \"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c\") " Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.263912 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" (UID: "ed562f97-fa9f-459d-9b46-7f9b2fa7c51c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.264237 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" (UID: "ed562f97-fa9f-459d-9b46-7f9b2fa7c51c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.290113 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-kube-api-access-2fhrk" (OuterVolumeSpecName: "kube-api-access-2fhrk") pod "ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" (UID: "ed562f97-fa9f-459d-9b46-7f9b2fa7c51c"). InnerVolumeSpecName "kube-api-access-2fhrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.292622 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-scripts" (OuterVolumeSpecName: "scripts") pod "ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" (UID: "ed562f97-fa9f-459d-9b46-7f9b2fa7c51c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.319526 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" (UID: "ed562f97-fa9f-459d-9b46-7f9b2fa7c51c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.374075 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fhrk\" (UniqueName: \"kubernetes.io/projected/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-kube-api-access-2fhrk\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.374114 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.374124 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.374135 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.374143 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.424564 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-config-data" (OuterVolumeSpecName: "config-data") pod "ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" (UID: "ed562f97-fa9f-459d-9b46-7f9b2fa7c51c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.454593 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" (UID: "ed562f97-fa9f-459d-9b46-7f9b2fa7c51c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.476774 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:04 crc kubenswrapper[4755]: I0317 00:46:04.476803 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.058566 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.062634 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed562f97-fa9f-459d-9b46-7f9b2fa7c51c","Type":"ContainerDied","Data":"0c7c67d982e427860c149025759d1985c63c9deccb5ec26a7a49671d25247d33"} Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.062719 4755 scope.go:117] "RemoveContainer" containerID="5ce0b2a81577b7b47275a655fb6617b5f0a5b0549eb9f2f517f2e1d599fdfebb" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.102775 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.117696 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.132338 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:05 crc kubenswrapper[4755]: E0317 00:46:05.132827 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerName="sg-core" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.132844 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerName="sg-core" Mar 17 00:46:05 crc kubenswrapper[4755]: E0317 00:46:05.132868 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerName="ceilometer-notification-agent" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.132875 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerName="ceilometer-notification-agent" Mar 17 00:46:05 crc kubenswrapper[4755]: E0317 00:46:05.132887 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerName="proxy-httpd" Mar 17 00:46:05 crc 
kubenswrapper[4755]: I0317 00:46:05.132893 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerName="proxy-httpd" Mar 17 00:46:05 crc kubenswrapper[4755]: E0317 00:46:05.132912 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerName="ceilometer-central-agent" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.132918 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerName="ceilometer-central-agent" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.133135 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerName="ceilometer-notification-agent" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.133154 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerName="ceilometer-central-agent" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.133173 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerName="sg-core" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.133182 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" containerName="proxy-httpd" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.135025 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.137235 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.137458 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.143118 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.291009 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-config-data\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.291086 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-run-httpd\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.291108 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.291127 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-scripts\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " 
pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.291236 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.291255 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-log-httpd\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.291301 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgsvg\" (UniqueName: \"kubernetes.io/projected/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-kube-api-access-zgsvg\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.394539 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgsvg\" (UniqueName: \"kubernetes.io/projected/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-kube-api-access-zgsvg\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.394661 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-config-data\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.394728 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-run-httpd\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.394749 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.394766 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-scripts\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.394841 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.394859 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-log-httpd\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.395289 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-log-httpd\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " 
pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.395597 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-run-httpd\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.416794 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-scripts\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.418213 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-config-data\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.422096 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.425801 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgsvg\" (UniqueName: \"kubernetes.io/projected/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-kube-api-access-zgsvg\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.443356 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.452210 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.481305 4755 scope.go:117] "RemoveContainer" containerID="dfd018718c87ba95527da66d356be22b96440a4e9ffeee7c8236b931d86dcec2" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.663133 4755 scope.go:117] "RemoveContainer" containerID="15df363e51ac11bc797eeec0e250cf250006e92f5aef5cb01762044c0874fbac" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.697162 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561806-v78jd" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.702240 4755 scope.go:117] "RemoveContainer" containerID="b5fe68aef2ccee6f7b271ff01002910729411001fa26880574d64cf0205ed4ff" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.801778 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns9hk\" (UniqueName: \"kubernetes.io/projected/7f4650d5-3d6d-4437-9b7b-f585de970b8f-kube-api-access-ns9hk\") pod \"7f4650d5-3d6d-4437-9b7b-f585de970b8f\" (UID: \"7f4650d5-3d6d-4437-9b7b-f585de970b8f\") " Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.826758 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f4650d5-3d6d-4437-9b7b-f585de970b8f-kube-api-access-ns9hk" (OuterVolumeSpecName: "kube-api-access-ns9hk") pod "7f4650d5-3d6d-4437-9b7b-f585de970b8f" (UID: "7f4650d5-3d6d-4437-9b7b-f585de970b8f"). InnerVolumeSpecName "kube-api-access-ns9hk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.827084 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.862898 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.904814 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns9hk\" (UniqueName: \"kubernetes.io/projected/7f4650d5-3d6d-4437-9b7b-f585de970b8f-kube-api-access-ns9hk\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.926224 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5db85cdfc7-s8bqf"] Mar 17 00:46:05 crc kubenswrapper[4755]: I0317 00:46:05.926567 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-5db85cdfc7-s8bqf" podUID="da85eed8-63ff-4f04-aefa-13d60b8606f8" containerName="heat-engine" containerID="cri-o://6c857196a128f5af16c439a2ae363c2d11e9cc036dd164bc1724821a0c2bcc38" gracePeriod=60 Mar 17 00:46:06 crc kubenswrapper[4755]: I0317 00:46:06.072328 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:06 crc kubenswrapper[4755]: I0317 00:46:06.129372 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561800-mktcc"] Mar 17 00:46:06 crc kubenswrapper[4755]: I0317 00:46:06.132649 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561806-v78jd" event={"ID":"7f4650d5-3d6d-4437-9b7b-f585de970b8f","Type":"ContainerDied","Data":"08940f828e0dd523a3c4c174f8a1b6bab6d0f34210aeb17fe78de37c95e9db30"} Mar 17 00:46:06 crc kubenswrapper[4755]: I0317 00:46:06.132689 4755 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="08940f828e0dd523a3c4c174f8a1b6bab6d0f34210aeb17fe78de37c95e9db30" Mar 17 00:46:06 crc kubenswrapper[4755]: I0317 00:46:06.132759 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561806-v78jd" Mar 17 00:46:06 crc kubenswrapper[4755]: I0317 00:46:06.156298 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561800-mktcc"] Mar 17 00:46:06 crc kubenswrapper[4755]: I0317 00:46:06.332212 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24671c08-dfdb-4659-836d-83cec2bbbbb8" path="/var/lib/kubelet/pods/24671c08-dfdb-4659-836d-83cec2bbbbb8/volumes" Mar 17 00:46:06 crc kubenswrapper[4755]: I0317 00:46:06.333797 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed562f97-fa9f-459d-9b46-7f9b2fa7c51c" path="/var/lib/kubelet/pods/ed562f97-fa9f-459d-9b46-7f9b2fa7c51c/volumes" Mar 17 00:46:06 crc kubenswrapper[4755]: I0317 00:46:06.538031 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:46:06 crc kubenswrapper[4755]: I0317 00:46:06.600448 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7cf59db8d9-lkrks"] Mar 17 00:46:06 crc kubenswrapper[4755]: I0317 00:46:06.709304 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:46:06 crc kubenswrapper[4755]: I0317 00:46:06.767749 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6dc587f546-7lvlg"] Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.153810 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7db39fbb-23c2-4f76-8bc1-1dd24245a9de","Type":"ContainerStarted","Data":"2e5fbe8de12ab3aaeb3d37c363bdc1cdae85ec8cd685f90cd899f0739ce2c046"} Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.154065 
4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7db39fbb-23c2-4f76-8bc1-1dd24245a9de","Type":"ContainerStarted","Data":"04e8415d434c11ed8c8be70d41b27974208736e9458fd9967d2a5eed436967ca"} Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.245220 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.284728 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.345377 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-combined-ca-bundle\") pod \"6b7b092f-96bf-466c-a3ef-1867c502bb21\" (UID: \"6b7b092f-96bf-466c-a3ef-1867c502bb21\") " Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.345475 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-config-data-custom\") pod \"6b7b092f-96bf-466c-a3ef-1867c502bb21\" (UID: \"6b7b092f-96bf-466c-a3ef-1867c502bb21\") " Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.345548 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-config-data\") pod \"6b7b092f-96bf-466c-a3ef-1867c502bb21\" (UID: \"6b7b092f-96bf-466c-a3ef-1867c502bb21\") " Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.345592 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddskj\" (UniqueName: \"kubernetes.io/projected/6b7b092f-96bf-466c-a3ef-1867c502bb21-kube-api-access-ddskj\") pod \"6b7b092f-96bf-466c-a3ef-1867c502bb21\" (UID: 
\"6b7b092f-96bf-466c-a3ef-1867c502bb21\") " Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.353909 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b7b092f-96bf-466c-a3ef-1867c502bb21-kube-api-access-ddskj" (OuterVolumeSpecName: "kube-api-access-ddskj") pod "6b7b092f-96bf-466c-a3ef-1867c502bb21" (UID: "6b7b092f-96bf-466c-a3ef-1867c502bb21"). InnerVolumeSpecName "kube-api-access-ddskj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.355066 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6b7b092f-96bf-466c-a3ef-1867c502bb21" (UID: "6b7b092f-96bf-466c-a3ef-1867c502bb21"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.380691 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b7b092f-96bf-466c-a3ef-1867c502bb21" (UID: "6b7b092f-96bf-466c-a3ef-1867c502bb21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.401647 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-config-data" (OuterVolumeSpecName: "config-data") pod "6b7b092f-96bf-466c-a3ef-1867c502bb21" (UID: "6b7b092f-96bf-466c-a3ef-1867c502bb21"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.448054 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfzk5\" (UniqueName: \"kubernetes.io/projected/e062269a-a4d6-43b8-b065-6d1694b386f8-kube-api-access-tfzk5\") pod \"e062269a-a4d6-43b8-b065-6d1694b386f8\" (UID: \"e062269a-a4d6-43b8-b065-6d1694b386f8\") " Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.448137 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-combined-ca-bundle\") pod \"e062269a-a4d6-43b8-b065-6d1694b386f8\" (UID: \"e062269a-a4d6-43b8-b065-6d1694b386f8\") " Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.448165 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-config-data-custom\") pod \"e062269a-a4d6-43b8-b065-6d1694b386f8\" (UID: \"e062269a-a4d6-43b8-b065-6d1694b386f8\") " Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.448224 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-config-data\") pod \"e062269a-a4d6-43b8-b065-6d1694b386f8\" (UID: \"e062269a-a4d6-43b8-b065-6d1694b386f8\") " Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.448696 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.448709 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-config-data-custom\") on 
node \"crc\" DevicePath \"\"" Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.448720 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b7b092f-96bf-466c-a3ef-1867c502bb21-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.448728 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddskj\" (UniqueName: \"kubernetes.io/projected/6b7b092f-96bf-466c-a3ef-1867c502bb21-kube-api-access-ddskj\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.453833 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e062269a-a4d6-43b8-b065-6d1694b386f8" (UID: "e062269a-a4d6-43b8-b065-6d1694b386f8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.458898 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e062269a-a4d6-43b8-b065-6d1694b386f8-kube-api-access-tfzk5" (OuterVolumeSpecName: "kube-api-access-tfzk5") pod "e062269a-a4d6-43b8-b065-6d1694b386f8" (UID: "e062269a-a4d6-43b8-b065-6d1694b386f8"). InnerVolumeSpecName "kube-api-access-tfzk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.496619 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e062269a-a4d6-43b8-b065-6d1694b386f8" (UID: "e062269a-a4d6-43b8-b065-6d1694b386f8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.539704 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-config-data" (OuterVolumeSpecName: "config-data") pod "e062269a-a4d6-43b8-b065-6d1694b386f8" (UID: "e062269a-a4d6-43b8-b065-6d1694b386f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.550151 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfzk5\" (UniqueName: \"kubernetes.io/projected/e062269a-a4d6-43b8-b065-6d1694b386f8-kube-api-access-tfzk5\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.550183 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.550196 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:07 crc kubenswrapper[4755]: I0317 00:46:07.550206 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e062269a-a4d6-43b8-b065-6d1694b386f8-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:08 crc kubenswrapper[4755]: I0317 00:46:08.184864 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7db39fbb-23c2-4f76-8bc1-1dd24245a9de","Type":"ContainerStarted","Data":"11979e54d7fcdcc93382325b271bc5c38a1405b4d923ac0148d4c79cb67e2779"} Mar 17 00:46:08 crc kubenswrapper[4755]: I0317 00:46:08.198850 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6dc587f546-7lvlg" Mar 17 00:46:08 crc kubenswrapper[4755]: I0317 00:46:08.198865 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6dc587f546-7lvlg" event={"ID":"e062269a-a4d6-43b8-b065-6d1694b386f8","Type":"ContainerDied","Data":"1471e1d6422ef347ea48f2c89de4e54700a1a1567acd2a44e135c9c331ee5a2c"} Mar 17 00:46:08 crc kubenswrapper[4755]: I0317 00:46:08.198911 4755 scope.go:117] "RemoveContainer" containerID="565e4ad6cddf14cdd48591cb4803decd6685f0238caee9493fa80a8d0d60e681" Mar 17 00:46:08 crc kubenswrapper[4755]: I0317 00:46:08.205062 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" event={"ID":"6b7b092f-96bf-466c-a3ef-1867c502bb21","Type":"ContainerDied","Data":"d7c0cb40d4a5c60fec3c390d01fb5fd951c2508f289d247ce0a5994dea4077d3"} Mar 17 00:46:08 crc kubenswrapper[4755]: I0317 00:46:08.205150 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7cf59db8d9-lkrks" Mar 17 00:46:08 crc kubenswrapper[4755]: I0317 00:46:08.241298 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6dc587f546-7lvlg"] Mar 17 00:46:08 crc kubenswrapper[4755]: I0317 00:46:08.254560 4755 scope.go:117] "RemoveContainer" containerID="4640484a7e478c0816293433342039390f68862a473fc60b302a0d49d023e860" Mar 17 00:46:08 crc kubenswrapper[4755]: I0317 00:46:08.266645 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-6dc587f546-7lvlg"] Mar 17 00:46:08 crc kubenswrapper[4755]: I0317 00:46:08.276994 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7cf59db8d9-lkrks"] Mar 17 00:46:08 crc kubenswrapper[4755]: I0317 00:46:08.299732 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-7cf59db8d9-lkrks"] Mar 17 00:46:08 crc kubenswrapper[4755]: I0317 00:46:08.742614 4755 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:09 crc kubenswrapper[4755]: E0317 00:46:09.080647 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c857196a128f5af16c439a2ae363c2d11e9cc036dd164bc1724821a0c2bcc38" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 17 00:46:09 crc kubenswrapper[4755]: E0317 00:46:09.083753 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c857196a128f5af16c439a2ae363c2d11e9cc036dd164bc1724821a0c2bcc38" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 17 00:46:09 crc kubenswrapper[4755]: E0317 00:46:09.085007 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c857196a128f5af16c439a2ae363c2d11e9cc036dd164bc1724821a0c2bcc38" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 17 00:46:09 crc kubenswrapper[4755]: E0317 00:46:09.085034 4755 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-5db85cdfc7-s8bqf" podUID="da85eed8-63ff-4f04-aefa-13d60b8606f8" containerName="heat-engine" Mar 17 00:46:09 crc kubenswrapper[4755]: I0317 00:46:09.231245 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7db39fbb-23c2-4f76-8bc1-1dd24245a9de","Type":"ContainerStarted","Data":"c3f12cbff9876de846e3f016667053280b051c805c21c33fbfa9973d2701b376"} Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.261283 4755 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="6b7b092f-96bf-466c-a3ef-1867c502bb21" path="/var/lib/kubelet/pods/6b7b092f-96bf-466c-a3ef-1867c502bb21/volumes" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.262011 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e062269a-a4d6-43b8-b065-6d1694b386f8" path="/var/lib/kubelet/pods/e062269a-a4d6-43b8-b065-6d1694b386f8/volumes" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.307272 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7mtrq"] Mar 17 00:46:10 crc kubenswrapper[4755]: E0317 00:46:10.307769 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f4650d5-3d6d-4437-9b7b-f585de970b8f" containerName="oc" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.307792 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f4650d5-3d6d-4437-9b7b-f585de970b8f" containerName="oc" Mar 17 00:46:10 crc kubenswrapper[4755]: E0317 00:46:10.307836 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b7b092f-96bf-466c-a3ef-1867c502bb21" containerName="heat-cfnapi" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.307845 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b7b092f-96bf-466c-a3ef-1867c502bb21" containerName="heat-cfnapi" Mar 17 00:46:10 crc kubenswrapper[4755]: E0317 00:46:10.307855 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e062269a-a4d6-43b8-b065-6d1694b386f8" containerName="heat-api" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.307862 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e062269a-a4d6-43b8-b065-6d1694b386f8" containerName="heat-api" Mar 17 00:46:10 crc kubenswrapper[4755]: E0317 00:46:10.307870 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e062269a-a4d6-43b8-b065-6d1694b386f8" containerName="heat-api" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.307877 4755 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e062269a-a4d6-43b8-b065-6d1694b386f8" containerName="heat-api" Mar 17 00:46:10 crc kubenswrapper[4755]: E0317 00:46:10.307884 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b7b092f-96bf-466c-a3ef-1867c502bb21" containerName="heat-cfnapi" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.307890 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b7b092f-96bf-466c-a3ef-1867c502bb21" containerName="heat-cfnapi" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.308132 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f4650d5-3d6d-4437-9b7b-f585de970b8f" containerName="oc" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.308149 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b7b092f-96bf-466c-a3ef-1867c502bb21" containerName="heat-cfnapi" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.308161 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e062269a-a4d6-43b8-b065-6d1694b386f8" containerName="heat-api" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.308587 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b7b092f-96bf-466c-a3ef-1867c502bb21" containerName="heat-cfnapi" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.308602 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e062269a-a4d6-43b8-b065-6d1694b386f8" containerName="heat-api" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.309654 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7mtrq" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.322185 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7mtrq"] Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.401223 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9510ee8-4778-45d3-b000-011aa314ed40-catalog-content\") pod \"redhat-marketplace-7mtrq\" (UID: \"e9510ee8-4778-45d3-b000-011aa314ed40\") " pod="openshift-marketplace/redhat-marketplace-7mtrq" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.401444 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xmpj\" (UniqueName: \"kubernetes.io/projected/e9510ee8-4778-45d3-b000-011aa314ed40-kube-api-access-6xmpj\") pod \"redhat-marketplace-7mtrq\" (UID: \"e9510ee8-4778-45d3-b000-011aa314ed40\") " pod="openshift-marketplace/redhat-marketplace-7mtrq" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.401540 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9510ee8-4778-45d3-b000-011aa314ed40-utilities\") pod \"redhat-marketplace-7mtrq\" (UID: \"e9510ee8-4778-45d3-b000-011aa314ed40\") " pod="openshift-marketplace/redhat-marketplace-7mtrq" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.503678 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9510ee8-4778-45d3-b000-011aa314ed40-utilities\") pod \"redhat-marketplace-7mtrq\" (UID: \"e9510ee8-4778-45d3-b000-011aa314ed40\") " pod="openshift-marketplace/redhat-marketplace-7mtrq" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.503799 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9510ee8-4778-45d3-b000-011aa314ed40-catalog-content\") pod \"redhat-marketplace-7mtrq\" (UID: \"e9510ee8-4778-45d3-b000-011aa314ed40\") " pod="openshift-marketplace/redhat-marketplace-7mtrq" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.503825 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xmpj\" (UniqueName: \"kubernetes.io/projected/e9510ee8-4778-45d3-b000-011aa314ed40-kube-api-access-6xmpj\") pod \"redhat-marketplace-7mtrq\" (UID: \"e9510ee8-4778-45d3-b000-011aa314ed40\") " pod="openshift-marketplace/redhat-marketplace-7mtrq" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.504201 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9510ee8-4778-45d3-b000-011aa314ed40-utilities\") pod \"redhat-marketplace-7mtrq\" (UID: \"e9510ee8-4778-45d3-b000-011aa314ed40\") " pod="openshift-marketplace/redhat-marketplace-7mtrq" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.504480 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9510ee8-4778-45d3-b000-011aa314ed40-catalog-content\") pod \"redhat-marketplace-7mtrq\" (UID: \"e9510ee8-4778-45d3-b000-011aa314ed40\") " pod="openshift-marketplace/redhat-marketplace-7mtrq" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.542658 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xmpj\" (UniqueName: \"kubernetes.io/projected/e9510ee8-4778-45d3-b000-011aa314ed40-kube-api-access-6xmpj\") pod \"redhat-marketplace-7mtrq\" (UID: \"e9510ee8-4778-45d3-b000-011aa314ed40\") " pod="openshift-marketplace/redhat-marketplace-7mtrq" Mar 17 00:46:10 crc kubenswrapper[4755]: I0317 00:46:10.784854 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7mtrq" Mar 17 00:46:14 crc kubenswrapper[4755]: I0317 00:46:14.307388 4755 generic.go:334] "Generic (PLEG): container finished" podID="da85eed8-63ff-4f04-aefa-13d60b8606f8" containerID="6c857196a128f5af16c439a2ae363c2d11e9cc036dd164bc1724821a0c2bcc38" exitCode=0 Mar 17 00:46:14 crc kubenswrapper[4755]: I0317 00:46:14.307467 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5db85cdfc7-s8bqf" event={"ID":"da85eed8-63ff-4f04-aefa-13d60b8606f8","Type":"ContainerDied","Data":"6c857196a128f5af16c439a2ae363c2d11e9cc036dd164bc1724821a0c2bcc38"} Mar 17 00:46:16 crc kubenswrapper[4755]: I0317 00:46:16.938838 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5db85cdfc7-s8bqf" Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.074959 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-config-data-custom\") pod \"da85eed8-63ff-4f04-aefa-13d60b8606f8\" (UID: \"da85eed8-63ff-4f04-aefa-13d60b8606f8\") " Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.075035 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-config-data\") pod \"da85eed8-63ff-4f04-aefa-13d60b8606f8\" (UID: \"da85eed8-63ff-4f04-aefa-13d60b8606f8\") " Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.075227 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-combined-ca-bundle\") pod \"da85eed8-63ff-4f04-aefa-13d60b8606f8\" (UID: \"da85eed8-63ff-4f04-aefa-13d60b8606f8\") " Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.075297 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-tl9m2\" (UniqueName: \"kubernetes.io/projected/da85eed8-63ff-4f04-aefa-13d60b8606f8-kube-api-access-tl9m2\") pod \"da85eed8-63ff-4f04-aefa-13d60b8606f8\" (UID: \"da85eed8-63ff-4f04-aefa-13d60b8606f8\") " Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.079879 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "da85eed8-63ff-4f04-aefa-13d60b8606f8" (UID: "da85eed8-63ff-4f04-aefa-13d60b8606f8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.084553 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7mtrq"] Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.099070 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da85eed8-63ff-4f04-aefa-13d60b8606f8-kube-api-access-tl9m2" (OuterVolumeSpecName: "kube-api-access-tl9m2") pod "da85eed8-63ff-4f04-aefa-13d60b8606f8" (UID: "da85eed8-63ff-4f04-aefa-13d60b8606f8"). InnerVolumeSpecName "kube-api-access-tl9m2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:46:17 crc kubenswrapper[4755]: W0317 00:46:17.101050 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9510ee8_4778_45d3_b000_011aa314ed40.slice/crio-f85cacb3bb668e9e5c438c0e9e36298056a6d843dfa97e78e9fc0a14b6b7c1ec WatchSource:0}: Error finding container f85cacb3bb668e9e5c438c0e9e36298056a6d843dfa97e78e9fc0a14b6b7c1ec: Status 404 returned error can't find the container with id f85cacb3bb668e9e5c438c0e9e36298056a6d843dfa97e78e9fc0a14b6b7c1ec Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.117509 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da85eed8-63ff-4f04-aefa-13d60b8606f8" (UID: "da85eed8-63ff-4f04-aefa-13d60b8606f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.158426 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-config-data" (OuterVolumeSpecName: "config-data") pod "da85eed8-63ff-4f04-aefa-13d60b8606f8" (UID: "da85eed8-63ff-4f04-aefa-13d60b8606f8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.176996 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.177024 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.177034 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da85eed8-63ff-4f04-aefa-13d60b8606f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.177043 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl9m2\" (UniqueName: \"kubernetes.io/projected/da85eed8-63ff-4f04-aefa-13d60b8606f8-kube-api-access-tl9m2\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.336051 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r6dbj" event={"ID":"cce90939-ded2-4efa-90c9-c74df06b5bcd","Type":"ContainerStarted","Data":"eb81201a2d160a3ff493a22cc87b2550a2f6e68915360659a932d17c4afdc5b8"} Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.342619 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7db39fbb-23c2-4f76-8bc1-1dd24245a9de","Type":"ContainerStarted","Data":"9c8679b1db3c00c9b647202fd8fb1261b737eeb61388fdcce192ab837fd2f038"} Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.342778 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" 
containerName="ceilometer-central-agent" containerID="cri-o://2e5fbe8de12ab3aaeb3d37c363bdc1cdae85ec8cd685f90cd899f0739ce2c046" gracePeriod=30 Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.342834 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.342799 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerName="sg-core" containerID="cri-o://c3f12cbff9876de846e3f016667053280b051c805c21c33fbfa9973d2701b376" gracePeriod=30 Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.342866 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerName="proxy-httpd" containerID="cri-o://9c8679b1db3c00c9b647202fd8fb1261b737eeb61388fdcce192ab837fd2f038" gracePeriod=30 Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.342866 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerName="ceilometer-notification-agent" containerID="cri-o://11979e54d7fcdcc93382325b271bc5c38a1405b4d923ac0148d4c79cb67e2779" gracePeriod=30 Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.345801 4755 generic.go:334] "Generic (PLEG): container finished" podID="e9510ee8-4778-45d3-b000-011aa314ed40" containerID="5a8b43c5c7d702b843116168ddc049ecb45d2a0c0b235dc51daf680d15ea79ad" exitCode=0 Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.345870 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7mtrq" event={"ID":"e9510ee8-4778-45d3-b000-011aa314ed40","Type":"ContainerDied","Data":"5a8b43c5c7d702b843116168ddc049ecb45d2a0c0b235dc51daf680d15ea79ad"} Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.345898 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7mtrq" event={"ID":"e9510ee8-4778-45d3-b000-011aa314ed40","Type":"ContainerStarted","Data":"f85cacb3bb668e9e5c438c0e9e36298056a6d843dfa97e78e9fc0a14b6b7c1ec"} Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.350299 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5db85cdfc7-s8bqf" event={"ID":"da85eed8-63ff-4f04-aefa-13d60b8606f8","Type":"ContainerDied","Data":"c1dc00644b048b6411c18bf7a24c75cace3c54618859771b3282fb6dfcbdabc2"} Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.350368 4755 scope.go:117] "RemoveContainer" containerID="6c857196a128f5af16c439a2ae363c2d11e9cc036dd164bc1724821a0c2bcc38" Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.350622 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5db85cdfc7-s8bqf" Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.365641 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-r6dbj" podStartSLOduration=2.6266443710000003 podStartE2EDuration="16.365618783s" podCreationTimestamp="2026-03-17 00:46:01 +0000 UTC" firstStartedPulling="2026-03-17 00:46:02.909781132 +0000 UTC m=+1437.669233415" lastFinishedPulling="2026-03-17 00:46:16.648755544 +0000 UTC m=+1451.408207827" observedRunningTime="2026-03-17 00:46:17.358678407 +0000 UTC m=+1452.118130700" watchObservedRunningTime="2026-03-17 00:46:17.365618783 +0000 UTC m=+1452.125071066" Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.429404 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.899455981 podStartE2EDuration="12.429383401s" podCreationTimestamp="2026-03-17 00:46:05 +0000 UTC" firstStartedPulling="2026-03-17 00:46:06.122189555 +0000 UTC m=+1440.881641838" lastFinishedPulling="2026-03-17 00:46:16.652116985 +0000 
UTC m=+1451.411569258" observedRunningTime="2026-03-17 00:46:17.409573161 +0000 UTC m=+1452.169025444" watchObservedRunningTime="2026-03-17 00:46:17.429383401 +0000 UTC m=+1452.188835674" Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.440724 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5db85cdfc7-s8bqf"] Mar 17 00:46:17 crc kubenswrapper[4755]: I0317 00:46:17.448582 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-5db85cdfc7-s8bqf"] Mar 17 00:46:18 crc kubenswrapper[4755]: I0317 00:46:18.260732 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da85eed8-63ff-4f04-aefa-13d60b8606f8" path="/var/lib/kubelet/pods/da85eed8-63ff-4f04-aefa-13d60b8606f8/volumes" Mar 17 00:46:18 crc kubenswrapper[4755]: I0317 00:46:18.361673 4755 generic.go:334] "Generic (PLEG): container finished" podID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerID="9c8679b1db3c00c9b647202fd8fb1261b737eeb61388fdcce192ab837fd2f038" exitCode=0 Mar 17 00:46:18 crc kubenswrapper[4755]: I0317 00:46:18.361706 4755 generic.go:334] "Generic (PLEG): container finished" podID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerID="c3f12cbff9876de846e3f016667053280b051c805c21c33fbfa9973d2701b376" exitCode=2 Mar 17 00:46:18 crc kubenswrapper[4755]: I0317 00:46:18.361724 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7db39fbb-23c2-4f76-8bc1-1dd24245a9de","Type":"ContainerDied","Data":"9c8679b1db3c00c9b647202fd8fb1261b737eeb61388fdcce192ab837fd2f038"} Mar 17 00:46:18 crc kubenswrapper[4755]: I0317 00:46:18.361791 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7db39fbb-23c2-4f76-8bc1-1dd24245a9de","Type":"ContainerDied","Data":"c3f12cbff9876de846e3f016667053280b051c805c21c33fbfa9973d2701b376"} Mar 17 00:46:18 crc kubenswrapper[4755]: I0317 00:46:18.365675 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="e9510ee8-4778-45d3-b000-011aa314ed40" containerID="42d28921900fee5040cb6f3e6dfa9bbd36834367402031c8da75dca950cbe8c9" exitCode=0 Mar 17 00:46:18 crc kubenswrapper[4755]: I0317 00:46:18.365736 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7mtrq" event={"ID":"e9510ee8-4778-45d3-b000-011aa314ed40","Type":"ContainerDied","Data":"42d28921900fee5040cb6f3e6dfa9bbd36834367402031c8da75dca950cbe8c9"} Mar 17 00:46:19 crc kubenswrapper[4755]: I0317 00:46:19.376469 4755 generic.go:334] "Generic (PLEG): container finished" podID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerID="11979e54d7fcdcc93382325b271bc5c38a1405b4d923ac0148d4c79cb67e2779" exitCode=0 Mar 17 00:46:19 crc kubenswrapper[4755]: I0317 00:46:19.376541 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7db39fbb-23c2-4f76-8bc1-1dd24245a9de","Type":"ContainerDied","Data":"11979e54d7fcdcc93382325b271bc5c38a1405b4d923ac0148d4c79cb67e2779"} Mar 17 00:46:19 crc kubenswrapper[4755]: I0317 00:46:19.379090 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7mtrq" event={"ID":"e9510ee8-4778-45d3-b000-011aa314ed40","Type":"ContainerStarted","Data":"583c41a6d7fb4e27c93c6186be10a5e8610376553403edf579c26f7e04e86cfa"} Mar 17 00:46:19 crc kubenswrapper[4755]: I0317 00:46:19.405179 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7mtrq" podStartSLOduration=7.840044589 podStartE2EDuration="9.405161126s" podCreationTimestamp="2026-03-17 00:46:10 +0000 UTC" firstStartedPulling="2026-03-17 00:46:17.348084984 +0000 UTC m=+1452.107537277" lastFinishedPulling="2026-03-17 00:46:18.913201541 +0000 UTC m=+1453.672653814" observedRunningTime="2026-03-17 00:46:19.39709573 +0000 UTC m=+1454.156548013" watchObservedRunningTime="2026-03-17 00:46:19.405161126 +0000 UTC m=+1454.164613409" Mar 17 00:46:20 crc 
kubenswrapper[4755]: I0317 00:46:20.785532 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7mtrq" Mar 17 00:46:20 crc kubenswrapper[4755]: I0317 00:46:20.785949 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7mtrq" Mar 17 00:46:21 crc kubenswrapper[4755]: E0317 00:46:21.058652 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7db39fbb_23c2_4f76_8bc1_1dd24245a9de.slice/crio-2e5fbe8de12ab3aaeb3d37c363bdc1cdae85ec8cd685f90cd899f0739ce2c046.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7db39fbb_23c2_4f76_8bc1_1dd24245a9de.slice/crio-conmon-2e5fbe8de12ab3aaeb3d37c363bdc1cdae85ec8cd685f90cd899f0739ce2c046.scope\": RecentStats: unable to find data in memory cache]" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.395698 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.402629 4755 generic.go:334] "Generic (PLEG): container finished" podID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerID="2e5fbe8de12ab3aaeb3d37c363bdc1cdae85ec8cd685f90cd899f0739ce2c046" exitCode=0 Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.404044 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.404483 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7db39fbb-23c2-4f76-8bc1-1dd24245a9de","Type":"ContainerDied","Data":"2e5fbe8de12ab3aaeb3d37c363bdc1cdae85ec8cd685f90cd899f0739ce2c046"} Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.404651 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7db39fbb-23c2-4f76-8bc1-1dd24245a9de","Type":"ContainerDied","Data":"04e8415d434c11ed8c8be70d41b27974208736e9458fd9967d2a5eed436967ca"} Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.404757 4755 scope.go:117] "RemoveContainer" containerID="9c8679b1db3c00c9b647202fd8fb1261b737eeb61388fdcce192ab837fd2f038" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.448714 4755 scope.go:117] "RemoveContainer" containerID="c3f12cbff9876de846e3f016667053280b051c805c21c33fbfa9973d2701b376" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.459457 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-combined-ca-bundle\") pod \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.459916 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgsvg\" (UniqueName: \"kubernetes.io/projected/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-kube-api-access-zgsvg\") pod \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.460181 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-scripts\") pod 
\"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.460288 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-run-httpd\") pod \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.460516 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-log-httpd\") pod \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.461238 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-config-data\") pod \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.461909 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-sg-core-conf-yaml\") pod \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\" (UID: \"7db39fbb-23c2-4f76-8bc1-1dd24245a9de\") " Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.461064 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7db39fbb-23c2-4f76-8bc1-1dd24245a9de" (UID: "7db39fbb-23c2-4f76-8bc1-1dd24245a9de"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.461156 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7db39fbb-23c2-4f76-8bc1-1dd24245a9de" (UID: "7db39fbb-23c2-4f76-8bc1-1dd24245a9de"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.463194 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.463298 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.465081 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-scripts" (OuterVolumeSpecName: "scripts") pod "7db39fbb-23c2-4f76-8bc1-1dd24245a9de" (UID: "7db39fbb-23c2-4f76-8bc1-1dd24245a9de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.474556 4755 scope.go:117] "RemoveContainer" containerID="11979e54d7fcdcc93382325b271bc5c38a1405b4d923ac0148d4c79cb67e2779" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.474631 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-kube-api-access-zgsvg" (OuterVolumeSpecName: "kube-api-access-zgsvg") pod "7db39fbb-23c2-4f76-8bc1-1dd24245a9de" (UID: "7db39fbb-23c2-4f76-8bc1-1dd24245a9de"). InnerVolumeSpecName "kube-api-access-zgsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.511288 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7db39fbb-23c2-4f76-8bc1-1dd24245a9de" (UID: "7db39fbb-23c2-4f76-8bc1-1dd24245a9de"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.565449 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.565687 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.566186 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgsvg\" (UniqueName: \"kubernetes.io/projected/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-kube-api-access-zgsvg\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.566837 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7db39fbb-23c2-4f76-8bc1-1dd24245a9de" (UID: "7db39fbb-23c2-4f76-8bc1-1dd24245a9de"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.587276 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-config-data" (OuterVolumeSpecName: "config-data") pod "7db39fbb-23c2-4f76-8bc1-1dd24245a9de" (UID: "7db39fbb-23c2-4f76-8bc1-1dd24245a9de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.604067 4755 scope.go:117] "RemoveContainer" containerID="2e5fbe8de12ab3aaeb3d37c363bdc1cdae85ec8cd685f90cd899f0739ce2c046" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.627670 4755 scope.go:117] "RemoveContainer" containerID="9c8679b1db3c00c9b647202fd8fb1261b737eeb61388fdcce192ab837fd2f038" Mar 17 00:46:21 crc kubenswrapper[4755]: E0317 00:46:21.628558 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c8679b1db3c00c9b647202fd8fb1261b737eeb61388fdcce192ab837fd2f038\": container with ID starting with 9c8679b1db3c00c9b647202fd8fb1261b737eeb61388fdcce192ab837fd2f038 not found: ID does not exist" containerID="9c8679b1db3c00c9b647202fd8fb1261b737eeb61388fdcce192ab837fd2f038" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.628747 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c8679b1db3c00c9b647202fd8fb1261b737eeb61388fdcce192ab837fd2f038"} err="failed to get container status \"9c8679b1db3c00c9b647202fd8fb1261b737eeb61388fdcce192ab837fd2f038\": rpc error: code = NotFound desc = could not find container \"9c8679b1db3c00c9b647202fd8fb1261b737eeb61388fdcce192ab837fd2f038\": container with ID starting with 9c8679b1db3c00c9b647202fd8fb1261b737eeb61388fdcce192ab837fd2f038 not found: ID does not exist" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.628849 4755 scope.go:117] "RemoveContainer" 
containerID="c3f12cbff9876de846e3f016667053280b051c805c21c33fbfa9973d2701b376" Mar 17 00:46:21 crc kubenswrapper[4755]: E0317 00:46:21.629373 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3f12cbff9876de846e3f016667053280b051c805c21c33fbfa9973d2701b376\": container with ID starting with c3f12cbff9876de846e3f016667053280b051c805c21c33fbfa9973d2701b376 not found: ID does not exist" containerID="c3f12cbff9876de846e3f016667053280b051c805c21c33fbfa9973d2701b376" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.629410 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f12cbff9876de846e3f016667053280b051c805c21c33fbfa9973d2701b376"} err="failed to get container status \"c3f12cbff9876de846e3f016667053280b051c805c21c33fbfa9973d2701b376\": rpc error: code = NotFound desc = could not find container \"c3f12cbff9876de846e3f016667053280b051c805c21c33fbfa9973d2701b376\": container with ID starting with c3f12cbff9876de846e3f016667053280b051c805c21c33fbfa9973d2701b376 not found: ID does not exist" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.629448 4755 scope.go:117] "RemoveContainer" containerID="11979e54d7fcdcc93382325b271bc5c38a1405b4d923ac0148d4c79cb67e2779" Mar 17 00:46:21 crc kubenswrapper[4755]: E0317 00:46:21.629812 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11979e54d7fcdcc93382325b271bc5c38a1405b4d923ac0148d4c79cb67e2779\": container with ID starting with 11979e54d7fcdcc93382325b271bc5c38a1405b4d923ac0148d4c79cb67e2779 not found: ID does not exist" containerID="11979e54d7fcdcc93382325b271bc5c38a1405b4d923ac0148d4c79cb67e2779" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.629845 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"11979e54d7fcdcc93382325b271bc5c38a1405b4d923ac0148d4c79cb67e2779"} err="failed to get container status \"11979e54d7fcdcc93382325b271bc5c38a1405b4d923ac0148d4c79cb67e2779\": rpc error: code = NotFound desc = could not find container \"11979e54d7fcdcc93382325b271bc5c38a1405b4d923ac0148d4c79cb67e2779\": container with ID starting with 11979e54d7fcdcc93382325b271bc5c38a1405b4d923ac0148d4c79cb67e2779 not found: ID does not exist" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.629869 4755 scope.go:117] "RemoveContainer" containerID="2e5fbe8de12ab3aaeb3d37c363bdc1cdae85ec8cd685f90cd899f0739ce2c046" Mar 17 00:46:21 crc kubenswrapper[4755]: E0317 00:46:21.630173 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e5fbe8de12ab3aaeb3d37c363bdc1cdae85ec8cd685f90cd899f0739ce2c046\": container with ID starting with 2e5fbe8de12ab3aaeb3d37c363bdc1cdae85ec8cd685f90cd899f0739ce2c046 not found: ID does not exist" containerID="2e5fbe8de12ab3aaeb3d37c363bdc1cdae85ec8cd685f90cd899f0739ce2c046" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.630215 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e5fbe8de12ab3aaeb3d37c363bdc1cdae85ec8cd685f90cd899f0739ce2c046"} err="failed to get container status \"2e5fbe8de12ab3aaeb3d37c363bdc1cdae85ec8cd685f90cd899f0739ce2c046\": rpc error: code = NotFound desc = could not find container \"2e5fbe8de12ab3aaeb3d37c363bdc1cdae85ec8cd685f90cd899f0739ce2c046\": container with ID starting with 2e5fbe8de12ab3aaeb3d37c363bdc1cdae85ec8cd685f90cd899f0739ce2c046 not found: ID does not exist" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.668499 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:21 crc kubenswrapper[4755]: 
I0317 00:46:21.668720 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db39fbb-23c2-4f76-8bc1-1dd24245a9de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.741338 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.752360 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.766277 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:21 crc kubenswrapper[4755]: E0317 00:46:21.766760 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerName="sg-core" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.766778 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerName="sg-core" Mar 17 00:46:21 crc kubenswrapper[4755]: E0317 00:46:21.766799 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da85eed8-63ff-4f04-aefa-13d60b8606f8" containerName="heat-engine" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.766809 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="da85eed8-63ff-4f04-aefa-13d60b8606f8" containerName="heat-engine" Mar 17 00:46:21 crc kubenswrapper[4755]: E0317 00:46:21.766843 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerName="ceilometer-notification-agent" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.766854 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerName="ceilometer-notification-agent" Mar 17 00:46:21 crc kubenswrapper[4755]: E0317 00:46:21.766866 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerName="ceilometer-central-agent" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.766873 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerName="ceilometer-central-agent" Mar 17 00:46:21 crc kubenswrapper[4755]: E0317 00:46:21.766893 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerName="proxy-httpd" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.766902 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerName="proxy-httpd" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.767139 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerName="ceilometer-central-agent" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.767159 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerName="sg-core" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.767174 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="da85eed8-63ff-4f04-aefa-13d60b8606f8" containerName="heat-engine" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.767232 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerName="proxy-httpd" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.767248 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" containerName="ceilometer-notification-agent" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.770846 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.780873 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.799972 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.800663 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.865270 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-7mtrq" podUID="e9510ee8-4778-45d3-b000-011aa314ed40" containerName="registry-server" probeResult="failure" output=< Mar 17 00:46:21 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 00:46:21 crc kubenswrapper[4755]: > Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.871729 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.871762 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-scripts\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.871868 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8xmx\" (UniqueName: \"kubernetes.io/projected/d99c647d-bfeb-423e-8e15-7dc86228e252-kube-api-access-v8xmx\") pod \"ceilometer-0\" 
(UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.871915 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d99c647d-bfeb-423e-8e15-7dc86228e252-log-httpd\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.871997 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-config-data\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.872033 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.872112 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d99c647d-bfeb-423e-8e15-7dc86228e252-run-httpd\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.973962 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d99c647d-bfeb-423e-8e15-7dc86228e252-run-httpd\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.974032 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-scripts\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.974055 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.974133 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8xmx\" (UniqueName: \"kubernetes.io/projected/d99c647d-bfeb-423e-8e15-7dc86228e252-kube-api-access-v8xmx\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.974177 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d99c647d-bfeb-423e-8e15-7dc86228e252-log-httpd\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.974259 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-config-data\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.974286 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.974955 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d99c647d-bfeb-423e-8e15-7dc86228e252-log-httpd\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.975112 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d99c647d-bfeb-423e-8e15-7dc86228e252-run-httpd\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.979715 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-config-data\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.980382 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.980483 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.981064 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-scripts\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:21 crc kubenswrapper[4755]: I0317 00:46:21.991902 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8xmx\" (UniqueName: \"kubernetes.io/projected/d99c647d-bfeb-423e-8e15-7dc86228e252-kube-api-access-v8xmx\") pod \"ceilometer-0\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " pod="openstack/ceilometer-0" Mar 17 00:46:22 crc kubenswrapper[4755]: I0317 00:46:22.128660 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:46:22 crc kubenswrapper[4755]: I0317 00:46:22.273302 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7db39fbb-23c2-4f76-8bc1-1dd24245a9de" path="/var/lib/kubelet/pods/7db39fbb-23c2-4f76-8bc1-1dd24245a9de/volumes" Mar 17 00:46:22 crc kubenswrapper[4755]: I0317 00:46:22.600786 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:22 crc kubenswrapper[4755]: W0317 00:46:22.612848 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd99c647d_bfeb_423e_8e15_7dc86228e252.slice/crio-7ce7601010f970d0e86fac140902706aa353bbe2fcc2806cf6482baa7476382f WatchSource:0}: Error finding container 7ce7601010f970d0e86fac140902706aa353bbe2fcc2806cf6482baa7476382f: Status 404 returned error can't find the container with id 7ce7601010f970d0e86fac140902706aa353bbe2fcc2806cf6482baa7476382f Mar 17 00:46:23 crc kubenswrapper[4755]: I0317 00:46:23.435467 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d99c647d-bfeb-423e-8e15-7dc86228e252","Type":"ContainerStarted","Data":"7ce7601010f970d0e86fac140902706aa353bbe2fcc2806cf6482baa7476382f"} Mar 17 00:46:24 crc kubenswrapper[4755]: I0317 
00:46:24.446552 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d99c647d-bfeb-423e-8e15-7dc86228e252","Type":"ContainerStarted","Data":"cc67a3db105fb7895942f57dec73db63d06b5d68258360257b1270ee4c94f087"} Mar 17 00:46:24 crc kubenswrapper[4755]: I0317 00:46:24.446975 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d99c647d-bfeb-423e-8e15-7dc86228e252","Type":"ContainerStarted","Data":"1492987eec84ad165ebfa3f0510ae282f9537b4b902effea8ff8501f10abc554"} Mar 17 00:46:25 crc kubenswrapper[4755]: I0317 00:46:25.458156 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d99c647d-bfeb-423e-8e15-7dc86228e252","Type":"ContainerStarted","Data":"5a4339cebbdbdeed501131d7f37d19be8071a78dbc65fa6328113686e8d68e52"} Mar 17 00:46:27 crc kubenswrapper[4755]: I0317 00:46:27.489238 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d99c647d-bfeb-423e-8e15-7dc86228e252","Type":"ContainerStarted","Data":"e401cca8836edfc82dc9c4ae11d33be7be01be77d481c0165270d4efbcbc259d"} Mar 17 00:46:27 crc kubenswrapper[4755]: I0317 00:46:27.490590 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 00:46:27 crc kubenswrapper[4755]: I0317 00:46:27.513667 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.739515688 podStartE2EDuration="6.513649655s" podCreationTimestamp="2026-03-17 00:46:21 +0000 UTC" firstStartedPulling="2026-03-17 00:46:22.616775259 +0000 UTC m=+1457.376227572" lastFinishedPulling="2026-03-17 00:46:26.390909266 +0000 UTC m=+1461.150361539" observedRunningTime="2026-03-17 00:46:27.509992878 +0000 UTC m=+1462.269445161" watchObservedRunningTime="2026-03-17 00:46:27.513649655 +0000 UTC m=+1462.273101938" Mar 17 00:46:28 crc kubenswrapper[4755]: I0317 00:46:28.666235 4755 
patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:46:28 crc kubenswrapper[4755]: I0317 00:46:28.666628 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:46:28 crc kubenswrapper[4755]: I0317 00:46:28.666704 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:46:28 crc kubenswrapper[4755]: I0317 00:46:28.667863 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"38fb594cd84460a45d3465d21f6d2658b58fe7d697877c14788a3d78ce3aa72f"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 00:46:28 crc kubenswrapper[4755]: I0317 00:46:28.667957 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://38fb594cd84460a45d3465d21f6d2658b58fe7d697877c14788a3d78ce3aa72f" gracePeriod=600 Mar 17 00:46:29 crc kubenswrapper[4755]: I0317 00:46:29.514502 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="38fb594cd84460a45d3465d21f6d2658b58fe7d697877c14788a3d78ce3aa72f" exitCode=0 Mar 17 00:46:29 crc 
kubenswrapper[4755]: I0317 00:46:29.514652 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"38fb594cd84460a45d3465d21f6d2658b58fe7d697877c14788a3d78ce3aa72f"} Mar 17 00:46:29 crc kubenswrapper[4755]: I0317 00:46:29.515211 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834"} Mar 17 00:46:29 crc kubenswrapper[4755]: I0317 00:46:29.515246 4755 scope.go:117] "RemoveContainer" containerID="4a1229170ff8c8c816ffd37af35946e3078d6ce31d139ae04a790479f60fedd5" Mar 17 00:46:29 crc kubenswrapper[4755]: I0317 00:46:29.519728 4755 generic.go:334] "Generic (PLEG): container finished" podID="cce90939-ded2-4efa-90c9-c74df06b5bcd" containerID="eb81201a2d160a3ff493a22cc87b2550a2f6e68915360659a932d17c4afdc5b8" exitCode=0 Mar 17 00:46:29 crc kubenswrapper[4755]: I0317 00:46:29.519766 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r6dbj" event={"ID":"cce90939-ded2-4efa-90c9-c74df06b5bcd","Type":"ContainerDied","Data":"eb81201a2d160a3ff493a22cc87b2550a2f6e68915360659a932d17c4afdc5b8"} Mar 17 00:46:30 crc kubenswrapper[4755]: I0317 00:46:30.905784 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r6dbj" Mar 17 00:46:30 crc kubenswrapper[4755]: I0317 00:46:30.976919 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-scripts\") pod \"cce90939-ded2-4efa-90c9-c74df06b5bcd\" (UID: \"cce90939-ded2-4efa-90c9-c74df06b5bcd\") " Mar 17 00:46:30 crc kubenswrapper[4755]: I0317 00:46:30.977043 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-745rm\" (UniqueName: \"kubernetes.io/projected/cce90939-ded2-4efa-90c9-c74df06b5bcd-kube-api-access-745rm\") pod \"cce90939-ded2-4efa-90c9-c74df06b5bcd\" (UID: \"cce90939-ded2-4efa-90c9-c74df06b5bcd\") " Mar 17 00:46:30 crc kubenswrapper[4755]: I0317 00:46:30.977073 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-config-data\") pod \"cce90939-ded2-4efa-90c9-c74df06b5bcd\" (UID: \"cce90939-ded2-4efa-90c9-c74df06b5bcd\") " Mar 17 00:46:30 crc kubenswrapper[4755]: I0317 00:46:30.977141 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-combined-ca-bundle\") pod \"cce90939-ded2-4efa-90c9-c74df06b5bcd\" (UID: \"cce90939-ded2-4efa-90c9-c74df06b5bcd\") " Mar 17 00:46:30 crc kubenswrapper[4755]: I0317 00:46:30.983555 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-scripts" (OuterVolumeSpecName: "scripts") pod "cce90939-ded2-4efa-90c9-c74df06b5bcd" (UID: "cce90939-ded2-4efa-90c9-c74df06b5bcd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:30 crc kubenswrapper[4755]: I0317 00:46:30.984698 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cce90939-ded2-4efa-90c9-c74df06b5bcd-kube-api-access-745rm" (OuterVolumeSpecName: "kube-api-access-745rm") pod "cce90939-ded2-4efa-90c9-c74df06b5bcd" (UID: "cce90939-ded2-4efa-90c9-c74df06b5bcd"). InnerVolumeSpecName "kube-api-access-745rm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.011521 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cce90939-ded2-4efa-90c9-c74df06b5bcd" (UID: "cce90939-ded2-4efa-90c9-c74df06b5bcd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.012072 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-config-data" (OuterVolumeSpecName: "config-data") pod "cce90939-ded2-4efa-90c9-c74df06b5bcd" (UID: "cce90939-ded2-4efa-90c9-c74df06b5bcd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.079836 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.079864 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.079873 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-745rm\" (UniqueName: \"kubernetes.io/projected/cce90939-ded2-4efa-90c9-c74df06b5bcd-kube-api-access-745rm\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.079883 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce90939-ded2-4efa-90c9-c74df06b5bcd-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.545220 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r6dbj" event={"ID":"cce90939-ded2-4efa-90c9-c74df06b5bcd","Type":"ContainerDied","Data":"07409b3dbb60c122a4aaa96e4424fece28650034afef1e269fc198ed06980776"} Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.545526 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07409b3dbb60c122a4aaa96e4424fece28650034afef1e269fc198ed06980776" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.545600 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r6dbj" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.714742 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 17 00:46:31 crc kubenswrapper[4755]: E0317 00:46:31.715189 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce90939-ded2-4efa-90c9-c74df06b5bcd" containerName="nova-cell0-conductor-db-sync" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.715207 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce90939-ded2-4efa-90c9-c74df06b5bcd" containerName="nova-cell0-conductor-db-sync" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.715429 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cce90939-ded2-4efa-90c9-c74df06b5bcd" containerName="nova-cell0-conductor-db-sync" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.718216 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.724695 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.730092 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-ls4gf" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.730226 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.792728 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f91949-f009-4181-94d6-c07e2c7cc7fc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"49f91949-f009-4181-94d6-c07e2c7cc7fc\") " pod="openstack/nova-cell0-conductor-0" Mar 17 00:46:31 crc kubenswrapper[4755]: 
I0317 00:46:31.792788 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f91949-f009-4181-94d6-c07e2c7cc7fc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"49f91949-f009-4181-94d6-c07e2c7cc7fc\") " pod="openstack/nova-cell0-conductor-0" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.792884 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp94g\" (UniqueName: \"kubernetes.io/projected/49f91949-f009-4181-94d6-c07e2c7cc7fc-kube-api-access-gp94g\") pod \"nova-cell0-conductor-0\" (UID: \"49f91949-f009-4181-94d6-c07e2c7cc7fc\") " pod="openstack/nova-cell0-conductor-0" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.868036 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-7mtrq" podUID="e9510ee8-4778-45d3-b000-011aa314ed40" containerName="registry-server" probeResult="failure" output=< Mar 17 00:46:31 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 00:46:31 crc kubenswrapper[4755]: > Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.894787 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp94g\" (UniqueName: \"kubernetes.io/projected/49f91949-f009-4181-94d6-c07e2c7cc7fc-kube-api-access-gp94g\") pod \"nova-cell0-conductor-0\" (UID: \"49f91949-f009-4181-94d6-c07e2c7cc7fc\") " pod="openstack/nova-cell0-conductor-0" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.894962 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f91949-f009-4181-94d6-c07e2c7cc7fc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"49f91949-f009-4181-94d6-c07e2c7cc7fc\") " pod="openstack/nova-cell0-conductor-0" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 
00:46:31.895000 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f91949-f009-4181-94d6-c07e2c7cc7fc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"49f91949-f009-4181-94d6-c07e2c7cc7fc\") " pod="openstack/nova-cell0-conductor-0" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.900850 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f91949-f009-4181-94d6-c07e2c7cc7fc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"49f91949-f009-4181-94d6-c07e2c7cc7fc\") " pod="openstack/nova-cell0-conductor-0" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.901418 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f91949-f009-4181-94d6-c07e2c7cc7fc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"49f91949-f009-4181-94d6-c07e2c7cc7fc\") " pod="openstack/nova-cell0-conductor-0" Mar 17 00:46:31 crc kubenswrapper[4755]: I0317 00:46:31.910941 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp94g\" (UniqueName: \"kubernetes.io/projected/49f91949-f009-4181-94d6-c07e2c7cc7fc-kube-api-access-gp94g\") pod \"nova-cell0-conductor-0\" (UID: \"49f91949-f009-4181-94d6-c07e2c7cc7fc\") " pod="openstack/nova-cell0-conductor-0" Mar 17 00:46:32 crc kubenswrapper[4755]: I0317 00:46:32.069950 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 17 00:46:32 crc kubenswrapper[4755]: I0317 00:46:32.546757 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 17 00:46:32 crc kubenswrapper[4755]: I0317 00:46:32.559073 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"49f91949-f009-4181-94d6-c07e2c7cc7fc","Type":"ContainerStarted","Data":"907bf339b6f813cbed074d338ba8524bc07aa0cf7993dc1da59136fbdc5d0dd8"} Mar 17 00:46:33 crc kubenswrapper[4755]: I0317 00:46:33.107594 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:33 crc kubenswrapper[4755]: I0317 00:46:33.108212 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerName="ceilometer-central-agent" containerID="cri-o://1492987eec84ad165ebfa3f0510ae282f9537b4b902effea8ff8501f10abc554" gracePeriod=30 Mar 17 00:46:33 crc kubenswrapper[4755]: I0317 00:46:33.108250 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerName="proxy-httpd" containerID="cri-o://e401cca8836edfc82dc9c4ae11d33be7be01be77d481c0165270d4efbcbc259d" gracePeriod=30 Mar 17 00:46:33 crc kubenswrapper[4755]: I0317 00:46:33.108286 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerName="sg-core" containerID="cri-o://5a4339cebbdbdeed501131d7f37d19be8071a78dbc65fa6328113686e8d68e52" gracePeriod=30 Mar 17 00:46:33 crc kubenswrapper[4755]: I0317 00:46:33.108336 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerName="ceilometer-notification-agent" 
containerID="cri-o://cc67a3db105fb7895942f57dec73db63d06b5d68258360257b1270ee4c94f087" gracePeriod=30 Mar 17 00:46:33 crc kubenswrapper[4755]: I0317 00:46:33.574093 4755 generic.go:334] "Generic (PLEG): container finished" podID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerID="e401cca8836edfc82dc9c4ae11d33be7be01be77d481c0165270d4efbcbc259d" exitCode=0 Mar 17 00:46:33 crc kubenswrapper[4755]: I0317 00:46:33.574359 4755 generic.go:334] "Generic (PLEG): container finished" podID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerID="5a4339cebbdbdeed501131d7f37d19be8071a78dbc65fa6328113686e8d68e52" exitCode=2 Mar 17 00:46:33 crc kubenswrapper[4755]: I0317 00:46:33.574369 4755 generic.go:334] "Generic (PLEG): container finished" podID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerID="1492987eec84ad165ebfa3f0510ae282f9537b4b902effea8ff8501f10abc554" exitCode=0 Mar 17 00:46:33 crc kubenswrapper[4755]: I0317 00:46:33.574175 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d99c647d-bfeb-423e-8e15-7dc86228e252","Type":"ContainerDied","Data":"e401cca8836edfc82dc9c4ae11d33be7be01be77d481c0165270d4efbcbc259d"} Mar 17 00:46:33 crc kubenswrapper[4755]: I0317 00:46:33.574455 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d99c647d-bfeb-423e-8e15-7dc86228e252","Type":"ContainerDied","Data":"5a4339cebbdbdeed501131d7f37d19be8071a78dbc65fa6328113686e8d68e52"} Mar 17 00:46:33 crc kubenswrapper[4755]: I0317 00:46:33.574471 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d99c647d-bfeb-423e-8e15-7dc86228e252","Type":"ContainerDied","Data":"1492987eec84ad165ebfa3f0510ae282f9537b4b902effea8ff8501f10abc554"} Mar 17 00:46:33 crc kubenswrapper[4755]: I0317 00:46:33.576086 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"49f91949-f009-4181-94d6-c07e2c7cc7fc","Type":"ContainerStarted","Data":"76255c37c2f93e913509411cf0ae0452b0d902da81925e2f36e806505a812c03"} Mar 17 00:46:33 crc kubenswrapper[4755]: I0317 00:46:33.576252 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 17 00:46:33 crc kubenswrapper[4755]: I0317 00:46:33.600534 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.6005163209999997 podStartE2EDuration="2.600516321s" podCreationTimestamp="2026-03-17 00:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:46:33.595349614 +0000 UTC m=+1468.354801907" watchObservedRunningTime="2026-03-17 00:46:33.600516321 +0000 UTC m=+1468.359968604" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.427269 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.454836 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d99c647d-bfeb-423e-8e15-7dc86228e252-run-httpd\") pod \"d99c647d-bfeb-423e-8e15-7dc86228e252\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.454949 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8xmx\" (UniqueName: \"kubernetes.io/projected/d99c647d-bfeb-423e-8e15-7dc86228e252-kube-api-access-v8xmx\") pod \"d99c647d-bfeb-423e-8e15-7dc86228e252\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.454997 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d99c647d-bfeb-423e-8e15-7dc86228e252-log-httpd\") pod \"d99c647d-bfeb-423e-8e15-7dc86228e252\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.455082 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-scripts\") pod \"d99c647d-bfeb-423e-8e15-7dc86228e252\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.455155 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-sg-core-conf-yaml\") pod \"d99c647d-bfeb-423e-8e15-7dc86228e252\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.455210 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-combined-ca-bundle\") pod \"d99c647d-bfeb-423e-8e15-7dc86228e252\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.455247 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-config-data\") pod \"d99c647d-bfeb-423e-8e15-7dc86228e252\" (UID: \"d99c647d-bfeb-423e-8e15-7dc86228e252\") " Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.455239 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d99c647d-bfeb-423e-8e15-7dc86228e252-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d99c647d-bfeb-423e-8e15-7dc86228e252" (UID: "d99c647d-bfeb-423e-8e15-7dc86228e252"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.455649 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d99c647d-bfeb-423e-8e15-7dc86228e252-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.456018 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d99c647d-bfeb-423e-8e15-7dc86228e252-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d99c647d-bfeb-423e-8e15-7dc86228e252" (UID: "d99c647d-bfeb-423e-8e15-7dc86228e252"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.466483 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-scripts" (OuterVolumeSpecName: "scripts") pod "d99c647d-bfeb-423e-8e15-7dc86228e252" (UID: "d99c647d-bfeb-423e-8e15-7dc86228e252"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.471078 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d99c647d-bfeb-423e-8e15-7dc86228e252-kube-api-access-v8xmx" (OuterVolumeSpecName: "kube-api-access-v8xmx") pod "d99c647d-bfeb-423e-8e15-7dc86228e252" (UID: "d99c647d-bfeb-423e-8e15-7dc86228e252"). InnerVolumeSpecName "kube-api-access-v8xmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.487133 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d99c647d-bfeb-423e-8e15-7dc86228e252" (UID: "d99c647d-bfeb-423e-8e15-7dc86228e252"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.557547 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8xmx\" (UniqueName: \"kubernetes.io/projected/d99c647d-bfeb-423e-8e15-7dc86228e252-kube-api-access-v8xmx\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.557576 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d99c647d-bfeb-423e-8e15-7dc86228e252-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.557585 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.557593 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-sg-core-conf-yaml\") on node 
\"crc\" DevicePath \"\"" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.563405 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d99c647d-bfeb-423e-8e15-7dc86228e252" (UID: "d99c647d-bfeb-423e-8e15-7dc86228e252"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.586467 4755 generic.go:334] "Generic (PLEG): container finished" podID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerID="cc67a3db105fb7895942f57dec73db63d06b5d68258360257b1270ee4c94f087" exitCode=0 Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.587247 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.587695 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d99c647d-bfeb-423e-8e15-7dc86228e252","Type":"ContainerDied","Data":"cc67a3db105fb7895942f57dec73db63d06b5d68258360257b1270ee4c94f087"} Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.587721 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d99c647d-bfeb-423e-8e15-7dc86228e252","Type":"ContainerDied","Data":"7ce7601010f970d0e86fac140902706aa353bbe2fcc2806cf6482baa7476382f"} Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.587736 4755 scope.go:117] "RemoveContainer" containerID="e401cca8836edfc82dc9c4ae11d33be7be01be77d481c0165270d4efbcbc259d" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.606191 4755 scope.go:117] "RemoveContainer" containerID="5a4339cebbdbdeed501131d7f37d19be8071a78dbc65fa6328113686e8d68e52" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.626514 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-config-data" (OuterVolumeSpecName: "config-data") pod "d99c647d-bfeb-423e-8e15-7dc86228e252" (UID: "d99c647d-bfeb-423e-8e15-7dc86228e252"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.627687 4755 scope.go:117] "RemoveContainer" containerID="cc67a3db105fb7895942f57dec73db63d06b5d68258360257b1270ee4c94f087" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.649612 4755 scope.go:117] "RemoveContainer" containerID="1492987eec84ad165ebfa3f0510ae282f9537b4b902effea8ff8501f10abc554" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.658974 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.658998 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d99c647d-bfeb-423e-8e15-7dc86228e252-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.675687 4755 scope.go:117] "RemoveContainer" containerID="e401cca8836edfc82dc9c4ae11d33be7be01be77d481c0165270d4efbcbc259d" Mar 17 00:46:34 crc kubenswrapper[4755]: E0317 00:46:34.676698 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e401cca8836edfc82dc9c4ae11d33be7be01be77d481c0165270d4efbcbc259d\": container with ID starting with e401cca8836edfc82dc9c4ae11d33be7be01be77d481c0165270d4efbcbc259d not found: ID does not exist" containerID="e401cca8836edfc82dc9c4ae11d33be7be01be77d481c0165270d4efbcbc259d" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.676738 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e401cca8836edfc82dc9c4ae11d33be7be01be77d481c0165270d4efbcbc259d"} err="failed to get container status \"e401cca8836edfc82dc9c4ae11d33be7be01be77d481c0165270d4efbcbc259d\": rpc error: code = NotFound desc = could not find container \"e401cca8836edfc82dc9c4ae11d33be7be01be77d481c0165270d4efbcbc259d\": container with ID starting with e401cca8836edfc82dc9c4ae11d33be7be01be77d481c0165270d4efbcbc259d not found: ID does not exist" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.676764 4755 scope.go:117] "RemoveContainer" containerID="5a4339cebbdbdeed501131d7f37d19be8071a78dbc65fa6328113686e8d68e52" Mar 17 00:46:34 crc kubenswrapper[4755]: E0317 00:46:34.677154 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a4339cebbdbdeed501131d7f37d19be8071a78dbc65fa6328113686e8d68e52\": container with ID starting with 5a4339cebbdbdeed501131d7f37d19be8071a78dbc65fa6328113686e8d68e52 not found: ID does not exist" containerID="5a4339cebbdbdeed501131d7f37d19be8071a78dbc65fa6328113686e8d68e52" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.677168 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a4339cebbdbdeed501131d7f37d19be8071a78dbc65fa6328113686e8d68e52"} err="failed to get container status \"5a4339cebbdbdeed501131d7f37d19be8071a78dbc65fa6328113686e8d68e52\": rpc error: code = NotFound desc = could not find container \"5a4339cebbdbdeed501131d7f37d19be8071a78dbc65fa6328113686e8d68e52\": container with ID starting with 5a4339cebbdbdeed501131d7f37d19be8071a78dbc65fa6328113686e8d68e52 not found: ID does not exist" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.677179 4755 scope.go:117] "RemoveContainer" containerID="cc67a3db105fb7895942f57dec73db63d06b5d68258360257b1270ee4c94f087" Mar 17 00:46:34 crc kubenswrapper[4755]: E0317 00:46:34.677481 4755 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"cc67a3db105fb7895942f57dec73db63d06b5d68258360257b1270ee4c94f087\": container with ID starting with cc67a3db105fb7895942f57dec73db63d06b5d68258360257b1270ee4c94f087 not found: ID does not exist" containerID="cc67a3db105fb7895942f57dec73db63d06b5d68258360257b1270ee4c94f087" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.677495 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc67a3db105fb7895942f57dec73db63d06b5d68258360257b1270ee4c94f087"} err="failed to get container status \"cc67a3db105fb7895942f57dec73db63d06b5d68258360257b1270ee4c94f087\": rpc error: code = NotFound desc = could not find container \"cc67a3db105fb7895942f57dec73db63d06b5d68258360257b1270ee4c94f087\": container with ID starting with cc67a3db105fb7895942f57dec73db63d06b5d68258360257b1270ee4c94f087 not found: ID does not exist" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.677505 4755 scope.go:117] "RemoveContainer" containerID="1492987eec84ad165ebfa3f0510ae282f9537b4b902effea8ff8501f10abc554" Mar 17 00:46:34 crc kubenswrapper[4755]: E0317 00:46:34.677687 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1492987eec84ad165ebfa3f0510ae282f9537b4b902effea8ff8501f10abc554\": container with ID starting with 1492987eec84ad165ebfa3f0510ae282f9537b4b902effea8ff8501f10abc554 not found: ID does not exist" containerID="1492987eec84ad165ebfa3f0510ae282f9537b4b902effea8ff8501f10abc554" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.677704 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1492987eec84ad165ebfa3f0510ae282f9537b4b902effea8ff8501f10abc554"} err="failed to get container status \"1492987eec84ad165ebfa3f0510ae282f9537b4b902effea8ff8501f10abc554\": rpc error: code = NotFound desc = could not find container 
\"1492987eec84ad165ebfa3f0510ae282f9537b4b902effea8ff8501f10abc554\": container with ID starting with 1492987eec84ad165ebfa3f0510ae282f9537b4b902effea8ff8501f10abc554 not found: ID does not exist" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.918836 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.927591 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.954895 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:34 crc kubenswrapper[4755]: E0317 00:46:34.955488 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerName="ceilometer-notification-agent" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.955582 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerName="ceilometer-notification-agent" Mar 17 00:46:34 crc kubenswrapper[4755]: E0317 00:46:34.955643 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerName="sg-core" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.955701 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerName="sg-core" Mar 17 00:46:34 crc kubenswrapper[4755]: E0317 00:46:34.955785 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerName="proxy-httpd" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.955845 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerName="proxy-httpd" Mar 17 00:46:34 crc kubenswrapper[4755]: E0317 00:46:34.955928 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerName="ceilometer-central-agent" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.955993 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerName="ceilometer-central-agent" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.956235 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerName="ceilometer-central-agent" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.956309 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerName="sg-core" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.956366 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerName="ceilometer-notification-agent" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.956448 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d99c647d-bfeb-423e-8e15-7dc86228e252" containerName="proxy-httpd" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.958269 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.962097 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.967094 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 00:46:34 crc kubenswrapper[4755]: I0317 00:46:34.969991 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.065734 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26745d92-a8cf-4130-bcb4-16746023aee3-log-httpd\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.065819 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw9d6\" (UniqueName: \"kubernetes.io/projected/26745d92-a8cf-4130-bcb4-16746023aee3-kube-api-access-bw9d6\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.065911 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-scripts\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.066042 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26745d92-a8cf-4130-bcb4-16746023aee3-run-httpd\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " 
pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.066105 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-config-data\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.066248 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.066313 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.168293 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26745d92-a8cf-4130-bcb4-16746023aee3-log-httpd\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.168370 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw9d6\" (UniqueName: \"kubernetes.io/projected/26745d92-a8cf-4130-bcb4-16746023aee3-kube-api-access-bw9d6\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.168390 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-scripts\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.168424 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26745d92-a8cf-4130-bcb4-16746023aee3-run-httpd\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.168554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-config-data\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.168607 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.168633 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.169076 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26745d92-a8cf-4130-bcb4-16746023aee3-log-httpd\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc 
kubenswrapper[4755]: I0317 00:46:35.169107 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26745d92-a8cf-4130-bcb4-16746023aee3-run-httpd\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.173820 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-config-data\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.176577 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-scripts\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.178578 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.189243 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw9d6\" (UniqueName: \"kubernetes.io/projected/26745d92-a8cf-4130-bcb4-16746023aee3-kube-api-access-bw9d6\") pod \"ceilometer-0\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.199190 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"26745d92-a8cf-4130-bcb4-16746023aee3\") " pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.322084 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:46:35 crc kubenswrapper[4755]: I0317 00:46:35.808714 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:35 crc kubenswrapper[4755]: W0317 00:46:35.808705 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26745d92_a8cf_4130_bcb4_16746023aee3.slice/crio-47e54d167ac6bf322b395a286a5a0d6fdcce324f1cfbba446303362c2bbc60ec WatchSource:0}: Error finding container 47e54d167ac6bf322b395a286a5a0d6fdcce324f1cfbba446303362c2bbc60ec: Status 404 returned error can't find the container with id 47e54d167ac6bf322b395a286a5a0d6fdcce324f1cfbba446303362c2bbc60ec Mar 17 00:46:36 crc kubenswrapper[4755]: I0317 00:46:36.267516 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d99c647d-bfeb-423e-8e15-7dc86228e252" path="/var/lib/kubelet/pods/d99c647d-bfeb-423e-8e15-7dc86228e252/volumes" Mar 17 00:46:36 crc kubenswrapper[4755]: I0317 00:46:36.614769 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26745d92-a8cf-4130-bcb4-16746023aee3","Type":"ContainerStarted","Data":"7e826332cbade48c3ab2ce25963b6b5a1fcbc88a4b180b4d18347b21e7c7470f"} Mar 17 00:46:36 crc kubenswrapper[4755]: I0317 00:46:36.614815 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26745d92-a8cf-4130-bcb4-16746023aee3","Type":"ContainerStarted","Data":"47e54d167ac6bf322b395a286a5a0d6fdcce324f1cfbba446303362c2bbc60ec"} Mar 17 00:46:36 crc kubenswrapper[4755]: I0317 00:46:36.923816 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.113526 4755 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.536496 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-7chjf"] Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.537940 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7chjf" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.540023 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.540788 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.550642 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7chjf"] Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.624887 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26745d92-a8cf-4130-bcb4-16746023aee3","Type":"ContainerStarted","Data":"16c06b27ecad5ac2820ea4e275e020668e8365709c4d65eb8d6f32a69df45692"} Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.655403 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7chjf\" (UID: \"7db40418-904e-4974-b8d3-f23a2cb94080\") " pod="openstack/nova-cell0-cell-mapping-7chjf" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.655495 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-scripts\") pod \"nova-cell0-cell-mapping-7chjf\" (UID: 
\"7db40418-904e-4974-b8d3-f23a2cb94080\") " pod="openstack/nova-cell0-cell-mapping-7chjf" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.655564 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-config-data\") pod \"nova-cell0-cell-mapping-7chjf\" (UID: \"7db40418-904e-4974-b8d3-f23a2cb94080\") " pod="openstack/nova-cell0-cell-mapping-7chjf" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.655605 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8chkc\" (UniqueName: \"kubernetes.io/projected/7db40418-904e-4974-b8d3-f23a2cb94080-kube-api-access-8chkc\") pod \"nova-cell0-cell-mapping-7chjf\" (UID: \"7db40418-904e-4974-b8d3-f23a2cb94080\") " pod="openstack/nova-cell0-cell-mapping-7chjf" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.750737 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.751993 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.757133 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7chjf\" (UID: \"7db40418-904e-4974-b8d3-f23a2cb94080\") " pod="openstack/nova-cell0-cell-mapping-7chjf" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.757192 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-scripts\") pod \"nova-cell0-cell-mapping-7chjf\" (UID: \"7db40418-904e-4974-b8d3-f23a2cb94080\") " pod="openstack/nova-cell0-cell-mapping-7chjf" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.757256 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-config-data\") pod \"nova-cell0-cell-mapping-7chjf\" (UID: \"7db40418-904e-4974-b8d3-f23a2cb94080\") " pod="openstack/nova-cell0-cell-mapping-7chjf" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.757307 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8chkc\" (UniqueName: \"kubernetes.io/projected/7db40418-904e-4974-b8d3-f23a2cb94080-kube-api-access-8chkc\") pod \"nova-cell0-cell-mapping-7chjf\" (UID: \"7db40418-904e-4974-b8d3-f23a2cb94080\") " pod="openstack/nova-cell0-cell-mapping-7chjf" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.763082 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7chjf\" (UID: \"7db40418-904e-4974-b8d3-f23a2cb94080\") " 
pod="openstack/nova-cell0-cell-mapping-7chjf" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.765259 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-scripts\") pod \"nova-cell0-cell-mapping-7chjf\" (UID: \"7db40418-904e-4974-b8d3-f23a2cb94080\") " pod="openstack/nova-cell0-cell-mapping-7chjf" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.776681 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-config-data\") pod \"nova-cell0-cell-mapping-7chjf\" (UID: \"7db40418-904e-4974-b8d3-f23a2cb94080\") " pod="openstack/nova-cell0-cell-mapping-7chjf" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.780662 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.817074 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8chkc\" (UniqueName: \"kubernetes.io/projected/7db40418-904e-4974-b8d3-f23a2cb94080-kube-api-access-8chkc\") pod \"nova-cell0-cell-mapping-7chjf\" (UID: \"7db40418-904e-4974-b8d3-f23a2cb94080\") " pod="openstack/nova-cell0-cell-mapping-7chjf" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.835095 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.855877 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7chjf" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.860102 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01629a09-9d7a-410c-b4b1-789ee46439b2-config-data\") pod \"nova-scheduler-0\" (UID: \"01629a09-9d7a-410c-b4b1-789ee46439b2\") " pod="openstack/nova-scheduler-0" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.860190 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01629a09-9d7a-410c-b4b1-789ee46439b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"01629a09-9d7a-410c-b4b1-789ee46439b2\") " pod="openstack/nova-scheduler-0" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.860232 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7974\" (UniqueName: \"kubernetes.io/projected/01629a09-9d7a-410c-b4b1-789ee46439b2-kube-api-access-g7974\") pod \"nova-scheduler-0\" (UID: \"01629a09-9d7a-410c-b4b1-789ee46439b2\") " pod="openstack/nova-scheduler-0" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.861316 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.862564 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.875646 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.905593 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.961791 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01629a09-9d7a-410c-b4b1-789ee46439b2-config-data\") pod \"nova-scheduler-0\" (UID: \"01629a09-9d7a-410c-b4b1-789ee46439b2\") " pod="openstack/nova-scheduler-0" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.961873 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lklvp\" (UniqueName: \"kubernetes.io/projected/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-kube-api-access-lklvp\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3d9e5fe-d749-4e3f-b058-13fda9b051ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.961897 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01629a09-9d7a-410c-b4b1-789ee46439b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"01629a09-9d7a-410c-b4b1-789ee46439b2\") " pod="openstack/nova-scheduler-0" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.961935 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7974\" (UniqueName: \"kubernetes.io/projected/01629a09-9d7a-410c-b4b1-789ee46439b2-kube-api-access-g7974\") pod \"nova-scheduler-0\" (UID: \"01629a09-9d7a-410c-b4b1-789ee46439b2\") " pod="openstack/nova-scheduler-0" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.961970 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3d9e5fe-d749-4e3f-b058-13fda9b051ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.962004 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3d9e5fe-d749-4e3f-b058-13fda9b051ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.973146 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01629a09-9d7a-410c-b4b1-789ee46439b2-config-data\") pod \"nova-scheduler-0\" (UID: \"01629a09-9d7a-410c-b4b1-789ee46439b2\") " pod="openstack/nova-scheduler-0" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.974430 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01629a09-9d7a-410c-b4b1-789ee46439b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"01629a09-9d7a-410c-b4b1-789ee46439b2\") " pod="openstack/nova-scheduler-0" Mar 17 00:46:37 crc kubenswrapper[4755]: I0317 00:46:37.993982 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7974\" (UniqueName: \"kubernetes.io/projected/01629a09-9d7a-410c-b4b1-789ee46439b2-kube-api-access-g7974\") pod \"nova-scheduler-0\" (UID: \"01629a09-9d7a-410c-b4b1-789ee46439b2\") " pod="openstack/nova-scheduler-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.010290 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 
00:46:38.012098 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.015791 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.068790 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-config-data\") pod \"nova-api-0\" (UID: \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\") " pod="openstack/nova-api-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.068831 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3d9e5fe-d749-4e3f-b058-13fda9b051ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.068871 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlf4g\" (UniqueName: \"kubernetes.io/projected/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-kube-api-access-rlf4g\") pod \"nova-api-0\" (UID: \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\") " pod="openstack/nova-api-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.068902 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\") " pod="openstack/nova-api-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.068985 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lklvp\" (UniqueName: 
\"kubernetes.io/projected/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-kube-api-access-lklvp\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3d9e5fe-d749-4e3f-b058-13fda9b051ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.069020 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-logs\") pod \"nova-api-0\" (UID: \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\") " pod="openstack/nova-api-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.069057 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3d9e5fe-d749-4e3f-b058-13fda9b051ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.077525 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.090356 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3d9e5fe-d749-4e3f-b058-13fda9b051ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.100088 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3d9e5fe-d749-4e3f-b058-13fda9b051ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.119314 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lklvp\" 
(UniqueName: \"kubernetes.io/projected/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-kube-api-access-lklvp\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3d9e5fe-d749-4e3f-b058-13fda9b051ef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.186591 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlf4g\" (UniqueName: \"kubernetes.io/projected/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-kube-api-access-rlf4g\") pod \"nova-api-0\" (UID: \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\") " pod="openstack/nova-api-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.186649 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\") " pod="openstack/nova-api-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.186750 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-logs\") pod \"nova-api-0\" (UID: \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\") " pod="openstack/nova-api-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.186810 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-config-data\") pod \"nova-api-0\" (UID: \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\") " pod="openstack/nova-api-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.189217 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-logs\") pod \"nova-api-0\" (UID: \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\") " pod="openstack/nova-api-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 
00:46:38.199049 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-config-data\") pod \"nova-api-0\" (UID: \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\") " pod="openstack/nova-api-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.199717 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\") " pod="openstack/nova-api-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.201366 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.219908 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.229224 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.232040 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlf4g\" (UniqueName: \"kubernetes.io/projected/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-kube-api-access-rlf4g\") pod \"nova-api-0\" (UID: \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\") " pod="openstack/nova-api-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.252596 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.269996 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.343969 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.354070 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.354107 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-tph5m"] Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.365701 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-tph5m"] Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.365795 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.395659 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e03fb62-b8f7-42d1-8d80-070537f42dca-logs\") pod \"nova-metadata-0\" (UID: \"1e03fb62-b8f7-42d1-8d80-070537f42dca\") " pod="openstack/nova-metadata-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.396189 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e03fb62-b8f7-42d1-8d80-070537f42dca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1e03fb62-b8f7-42d1-8d80-070537f42dca\") " pod="openstack/nova-metadata-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.396298 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hmtj\" (UniqueName: \"kubernetes.io/projected/1e03fb62-b8f7-42d1-8d80-070537f42dca-kube-api-access-7hmtj\") pod \"nova-metadata-0\" (UID: \"1e03fb62-b8f7-42d1-8d80-070537f42dca\") " pod="openstack/nova-metadata-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.396492 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e03fb62-b8f7-42d1-8d80-070537f42dca-config-data\") pod \"nova-metadata-0\" (UID: \"1e03fb62-b8f7-42d1-8d80-070537f42dca\") " pod="openstack/nova-metadata-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.497705 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e03fb62-b8f7-42d1-8d80-070537f42dca-config-data\") pod \"nova-metadata-0\" (UID: \"1e03fb62-b8f7-42d1-8d80-070537f42dca\") " pod="openstack/nova-metadata-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.497788 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e03fb62-b8f7-42d1-8d80-070537f42dca-logs\") pod \"nova-metadata-0\" (UID: \"1e03fb62-b8f7-42d1-8d80-070537f42dca\") " pod="openstack/nova-metadata-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.497842 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwrfp\" (UniqueName: \"kubernetes.io/projected/0292185b-3c12-4b25-b900-f8c7c5d4346f-kube-api-access-cwrfp\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.497876 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.497895 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1e03fb62-b8f7-42d1-8d80-070537f42dca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1e03fb62-b8f7-42d1-8d80-070537f42dca\") " pod="openstack/nova-metadata-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.497922 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.497965 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-dns-svc\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.497984 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hmtj\" (UniqueName: \"kubernetes.io/projected/1e03fb62-b8f7-42d1-8d80-070537f42dca-kube-api-access-7hmtj\") pod \"nova-metadata-0\" (UID: \"1e03fb62-b8f7-42d1-8d80-070537f42dca\") " pod="openstack/nova-metadata-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.498003 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-config\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.498066 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.499067 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e03fb62-b8f7-42d1-8d80-070537f42dca-logs\") pod \"nova-metadata-0\" (UID: \"1e03fb62-b8f7-42d1-8d80-070537f42dca\") " pod="openstack/nova-metadata-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.509676 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e03fb62-b8f7-42d1-8d80-070537f42dca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1e03fb62-b8f7-42d1-8d80-070537f42dca\") " pod="openstack/nova-metadata-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.511051 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e03fb62-b8f7-42d1-8d80-070537f42dca-config-data\") pod \"nova-metadata-0\" (UID: \"1e03fb62-b8f7-42d1-8d80-070537f42dca\") " pod="openstack/nova-metadata-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.531090 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hmtj\" (UniqueName: \"kubernetes.io/projected/1e03fb62-b8f7-42d1-8d80-070537f42dca-kube-api-access-7hmtj\") pod \"nova-metadata-0\" (UID: \"1e03fb62-b8f7-42d1-8d80-070537f42dca\") " pod="openstack/nova-metadata-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.600263 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " 
pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.601621 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.606071 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwrfp\" (UniqueName: \"kubernetes.io/projected/0292185b-3c12-4b25-b900-f8c7c5d4346f-kube-api-access-cwrfp\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.606167 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.606240 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.606316 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-dns-svc\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 
00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.606354 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-config\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.607093 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-config\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.607690 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.608913 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.610967 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-dns-svc\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.629248 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cwrfp\" (UniqueName: \"kubernetes.io/projected/0292185b-3c12-4b25-b900-f8c7c5d4346f-kube-api-access-cwrfp\") pod \"dnsmasq-dns-5fbc4d444f-tph5m\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.677710 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26745d92-a8cf-4130-bcb4-16746023aee3","Type":"ContainerStarted","Data":"ba22763049125683ba70ae7ef7ea3db4980fbebcaf6172268c62d7c15428a34b"} Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.685645 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.720995 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:38 crc kubenswrapper[4755]: I0317 00:46:38.742627 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7chjf"] Mar 17 00:46:38 crc kubenswrapper[4755]: W0317 00:46:38.779931 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7db40418_904e_4974_b8d3_f23a2cb94080.slice/crio-6781010c458a75014cd7a9aab9474fa4e1f29c40b011bd7c5c4bdae80e7cee6a WatchSource:0}: Error finding container 6781010c458a75014cd7a9aab9474fa4e1f29c40b011bd7c5c4bdae80e7cee6a: Status 404 returned error can't find the container with id 6781010c458a75014cd7a9aab9474fa4e1f29c40b011bd7c5c4bdae80e7cee6a Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.046165 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.106234 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.274720 4755 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 00:46:39 crc kubenswrapper[4755]: W0317 00:46:39.419311 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0292185b_3c12_4b25_b900_f8c7c5d4346f.slice/crio-c953355d800301de4419c9e477aea700dc06d5aba66f2925f8305decb2997cfc WatchSource:0}: Error finding container c953355d800301de4419c9e477aea700dc06d5aba66f2925f8305decb2997cfc: Status 404 returned error can't find the container with id c953355d800301de4419c9e477aea700dc06d5aba66f2925f8305decb2997cfc Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.420784 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.429967 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-tph5m"] Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.686819 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e03fb62-b8f7-42d1-8d80-070537f42dca","Type":"ContainerStarted","Data":"ae9f4a612ef436cd5c41bb85b4a489c94f7a31560742274a203afddb8c237f00"} Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.688467 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04","Type":"ContainerStarted","Data":"68da523a083b9401b4cb26354d244d3059e50f081bcb0ddf8f79c76453b6855b"} Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.690474 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" event={"ID":"0292185b-3c12-4b25-b900-f8c7c5d4346f","Type":"ContainerStarted","Data":"11d05b57f53c1144b26c312d310e4f0ef80460c64d2b220b256bf7ab01fa5adc"} Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.690536 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" event={"ID":"0292185b-3c12-4b25-b900-f8c7c5d4346f","Type":"ContainerStarted","Data":"c953355d800301de4419c9e477aea700dc06d5aba66f2925f8305decb2997cfc"} Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.694065 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"01629a09-9d7a-410c-b4b1-789ee46439b2","Type":"ContainerStarted","Data":"18657712fc4bca0f0508622d6aef51ea22355049d3546a8c4eaa924e1579bf1c"} Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.700602 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e3d9e5fe-d749-4e3f-b058-13fda9b051ef","Type":"ContainerStarted","Data":"5537f214b54b7ca0de553d32a7ee067263faa9400c2e1edcb3c16da6873eacce"} Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.703400 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7chjf" event={"ID":"7db40418-904e-4974-b8d3-f23a2cb94080","Type":"ContainerStarted","Data":"50c512cdaee121322ce9a340513e51b6d250a091830bb1e38cb410c551420d75"} Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.703456 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7chjf" event={"ID":"7db40418-904e-4974-b8d3-f23a2cb94080","Type":"ContainerStarted","Data":"6781010c458a75014cd7a9aab9474fa4e1f29c40b011bd7c5c4bdae80e7cee6a"} Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.826700 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-7chjf" podStartSLOduration=2.826682959 podStartE2EDuration="2.826682959s" podCreationTimestamp="2026-03-17 00:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:46:39.725647383 +0000 UTC m=+1474.485099666" watchObservedRunningTime="2026-03-17 00:46:39.826682959 +0000 UTC 
m=+1474.586135242" Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.827933 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-48hw5"] Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.829354 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-48hw5" Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.831772 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.832017 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.843862 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-48hw5"] Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.937972 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-48hw5\" (UID: \"65a98bfe-6430-4b2c-9cc4-4287439401b5\") " pod="openstack/nova-cell1-conductor-db-sync-48hw5" Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.938029 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-scripts\") pod \"nova-cell1-conductor-db-sync-48hw5\" (UID: \"65a98bfe-6430-4b2c-9cc4-4287439401b5\") " pod="openstack/nova-cell1-conductor-db-sync-48hw5" Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.938113 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-config-data\") pod 
\"nova-cell1-conductor-db-sync-48hw5\" (UID: \"65a98bfe-6430-4b2c-9cc4-4287439401b5\") " pod="openstack/nova-cell1-conductor-db-sync-48hw5" Mar 17 00:46:39 crc kubenswrapper[4755]: I0317 00:46:39.938149 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc7vt\" (UniqueName: \"kubernetes.io/projected/65a98bfe-6430-4b2c-9cc4-4287439401b5-kube-api-access-mc7vt\") pod \"nova-cell1-conductor-db-sync-48hw5\" (UID: \"65a98bfe-6430-4b2c-9cc4-4287439401b5\") " pod="openstack/nova-cell1-conductor-db-sync-48hw5" Mar 17 00:46:40 crc kubenswrapper[4755]: I0317 00:46:40.041301 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-config-data\") pod \"nova-cell1-conductor-db-sync-48hw5\" (UID: \"65a98bfe-6430-4b2c-9cc4-4287439401b5\") " pod="openstack/nova-cell1-conductor-db-sync-48hw5" Mar 17 00:46:40 crc kubenswrapper[4755]: I0317 00:46:40.041846 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc7vt\" (UniqueName: \"kubernetes.io/projected/65a98bfe-6430-4b2c-9cc4-4287439401b5-kube-api-access-mc7vt\") pod \"nova-cell1-conductor-db-sync-48hw5\" (UID: \"65a98bfe-6430-4b2c-9cc4-4287439401b5\") " pod="openstack/nova-cell1-conductor-db-sync-48hw5" Mar 17 00:46:40 crc kubenswrapper[4755]: I0317 00:46:40.042133 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-48hw5\" (UID: \"65a98bfe-6430-4b2c-9cc4-4287439401b5\") " pod="openstack/nova-cell1-conductor-db-sync-48hw5" Mar 17 00:46:40 crc kubenswrapper[4755]: I0317 00:46:40.042189 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-scripts\") pod \"nova-cell1-conductor-db-sync-48hw5\" (UID: \"65a98bfe-6430-4b2c-9cc4-4287439401b5\") " pod="openstack/nova-cell1-conductor-db-sync-48hw5" Mar 17 00:46:40 crc kubenswrapper[4755]: I0317 00:46:40.047834 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-scripts\") pod \"nova-cell1-conductor-db-sync-48hw5\" (UID: \"65a98bfe-6430-4b2c-9cc4-4287439401b5\") " pod="openstack/nova-cell1-conductor-db-sync-48hw5" Mar 17 00:46:40 crc kubenswrapper[4755]: I0317 00:46:40.063235 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-config-data\") pod \"nova-cell1-conductor-db-sync-48hw5\" (UID: \"65a98bfe-6430-4b2c-9cc4-4287439401b5\") " pod="openstack/nova-cell1-conductor-db-sync-48hw5" Mar 17 00:46:40 crc kubenswrapper[4755]: I0317 00:46:40.069933 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-48hw5\" (UID: \"65a98bfe-6430-4b2c-9cc4-4287439401b5\") " pod="openstack/nova-cell1-conductor-db-sync-48hw5" Mar 17 00:46:40 crc kubenswrapper[4755]: I0317 00:46:40.070342 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc7vt\" (UniqueName: \"kubernetes.io/projected/65a98bfe-6430-4b2c-9cc4-4287439401b5-kube-api-access-mc7vt\") pod \"nova-cell1-conductor-db-sync-48hw5\" (UID: \"65a98bfe-6430-4b2c-9cc4-4287439401b5\") " pod="openstack/nova-cell1-conductor-db-sync-48hw5" Mar 17 00:46:40 crc kubenswrapper[4755]: I0317 00:46:40.156452 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-48hw5" Mar 17 00:46:40 crc kubenswrapper[4755]: I0317 00:46:40.720029 4755 generic.go:334] "Generic (PLEG): container finished" podID="0292185b-3c12-4b25-b900-f8c7c5d4346f" containerID="11d05b57f53c1144b26c312d310e4f0ef80460c64d2b220b256bf7ab01fa5adc" exitCode=0 Mar 17 00:46:40 crc kubenswrapper[4755]: I0317 00:46:40.720108 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" event={"ID":"0292185b-3c12-4b25-b900-f8c7c5d4346f","Type":"ContainerDied","Data":"11d05b57f53c1144b26c312d310e4f0ef80460c64d2b220b256bf7ab01fa5adc"} Mar 17 00:46:40 crc kubenswrapper[4755]: I0317 00:46:40.720516 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:40 crc kubenswrapper[4755]: I0317 00:46:40.720538 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" event={"ID":"0292185b-3c12-4b25-b900-f8c7c5d4346f","Type":"ContainerStarted","Data":"22b67742a91760d5f53eec407271a0f7e5782ab8a0ec3c43229070ad9d4004c4"} Mar 17 00:46:40 crc kubenswrapper[4755]: I0317 00:46:40.771706 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" podStartSLOduration=2.771691518 podStartE2EDuration="2.771691518s" podCreationTimestamp="2026-03-17 00:46:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:46:40.749183745 +0000 UTC m=+1475.508636028" watchObservedRunningTime="2026-03-17 00:46:40.771691518 +0000 UTC m=+1475.531143801" Mar 17 00:46:40 crc kubenswrapper[4755]: I0317 00:46:40.786578 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-48hw5"] Mar 17 00:46:40 crc kubenswrapper[4755]: I0317 00:46:40.832868 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-marketplace-7mtrq" Mar 17 00:46:40 crc kubenswrapper[4755]: I0317 00:46:40.905200 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7mtrq" Mar 17 00:46:41 crc kubenswrapper[4755]: I0317 00:46:41.515715 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7mtrq"] Mar 17 00:46:41 crc kubenswrapper[4755]: I0317 00:46:41.554111 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:46:41 crc kubenswrapper[4755]: I0317 00:46:41.578593 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 00:46:42 crc kubenswrapper[4755]: I0317 00:46:42.746123 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-48hw5" event={"ID":"65a98bfe-6430-4b2c-9cc4-4287439401b5","Type":"ContainerStarted","Data":"ee44872f74ffa61ff73ded04ba5640a18fb1887a39c035e8bc63df0b819c7ea9"} Mar 17 00:46:42 crc kubenswrapper[4755]: I0317 00:46:42.746271 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7mtrq" podUID="e9510ee8-4778-45d3-b000-011aa314ed40" containerName="registry-server" containerID="cri-o://583c41a6d7fb4e27c93c6186be10a5e8610376553403edf579c26f7e04e86cfa" gracePeriod=2 Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.581222 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7mtrq" Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.739390 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9510ee8-4778-45d3-b000-011aa314ed40-utilities\") pod \"e9510ee8-4778-45d3-b000-011aa314ed40\" (UID: \"e9510ee8-4778-45d3-b000-011aa314ed40\") " Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.739682 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xmpj\" (UniqueName: \"kubernetes.io/projected/e9510ee8-4778-45d3-b000-011aa314ed40-kube-api-access-6xmpj\") pod \"e9510ee8-4778-45d3-b000-011aa314ed40\" (UID: \"e9510ee8-4778-45d3-b000-011aa314ed40\") " Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.739722 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9510ee8-4778-45d3-b000-011aa314ed40-catalog-content\") pod \"e9510ee8-4778-45d3-b000-011aa314ed40\" (UID: \"e9510ee8-4778-45d3-b000-011aa314ed40\") " Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.741407 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9510ee8-4778-45d3-b000-011aa314ed40-utilities" (OuterVolumeSpecName: "utilities") pod "e9510ee8-4778-45d3-b000-011aa314ed40" (UID: "e9510ee8-4778-45d3-b000-011aa314ed40"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.749702 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9510ee8-4778-45d3-b000-011aa314ed40-kube-api-access-6xmpj" (OuterVolumeSpecName: "kube-api-access-6xmpj") pod "e9510ee8-4778-45d3-b000-011aa314ed40" (UID: "e9510ee8-4778-45d3-b000-011aa314ed40"). InnerVolumeSpecName "kube-api-access-6xmpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.767951 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9510ee8-4778-45d3-b000-011aa314ed40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9510ee8-4778-45d3-b000-011aa314ed40" (UID: "e9510ee8-4778-45d3-b000-011aa314ed40"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.770060 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e03fb62-b8f7-42d1-8d80-070537f42dca","Type":"ContainerStarted","Data":"38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1"} Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.771044 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04","Type":"ContainerStarted","Data":"4d07707f23c7f4f178eba4c56adbb239fa98e5c55483b2f90f324555e890cc8e"} Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.772896 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26745d92-a8cf-4130-bcb4-16746023aee3","Type":"ContainerStarted","Data":"c51a8662f7f952dd2bd18d497e3f4cf596be3c90d7089354b558373238542cc9"} Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.773027 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" containerName="ceilometer-central-agent" containerID="cri-o://7e826332cbade48c3ab2ce25963b6b5a1fcbc88a4b180b4d18347b21e7c7470f" gracePeriod=30 Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.773095 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.773369 4755 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" containerName="proxy-httpd" containerID="cri-o://c51a8662f7f952dd2bd18d497e3f4cf596be3c90d7089354b558373238542cc9" gracePeriod=30 Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.773408 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" containerName="sg-core" containerID="cri-o://ba22763049125683ba70ae7ef7ea3db4980fbebcaf6172268c62d7c15428a34b" gracePeriod=30 Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.773456 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" containerName="ceilometer-notification-agent" containerID="cri-o://16c06b27ecad5ac2820ea4e275e020668e8365709c4d65eb8d6f32a69df45692" gracePeriod=30 Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.775847 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"01629a09-9d7a-410c-b4b1-789ee46439b2","Type":"ContainerStarted","Data":"3f5922ba5ffc43befb51b48a7283394b5dce7fde31fc6dd763ef15dd8c21668e"} Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.784141 4755 generic.go:334] "Generic (PLEG): container finished" podID="e9510ee8-4778-45d3-b000-011aa314ed40" containerID="583c41a6d7fb4e27c93c6186be10a5e8610376553403edf579c26f7e04e86cfa" exitCode=0 Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.784201 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7mtrq" event={"ID":"e9510ee8-4778-45d3-b000-011aa314ed40","Type":"ContainerDied","Data":"583c41a6d7fb4e27c93c6186be10a5e8610376553403edf579c26f7e04e86cfa"} Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.784226 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-7mtrq" event={"ID":"e9510ee8-4778-45d3-b000-011aa314ed40","Type":"ContainerDied","Data":"f85cacb3bb668e9e5c438c0e9e36298056a6d843dfa97e78e9fc0a14b6b7c1ec"} Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.784244 4755 scope.go:117] "RemoveContainer" containerID="583c41a6d7fb4e27c93c6186be10a5e8610376553403edf579c26f7e04e86cfa" Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.784352 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7mtrq" Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.809088 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e3d9e5fe-d749-4e3f-b058-13fda9b051ef","Type":"ContainerStarted","Data":"d21d7b76a7dfecf9494c4daf129fe0efe105351497e68db989487e56779ff6ac"} Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.809217 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e3d9e5fe-d749-4e3f-b058-13fda9b051ef" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d21d7b76a7dfecf9494c4daf129fe0efe105351497e68db989487e56779ff6ac" gracePeriod=30 Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.826245 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.528035112 podStartE2EDuration="6.826220754s" podCreationTimestamp="2026-03-17 00:46:37 +0000 UTC" firstStartedPulling="2026-03-17 00:46:39.058553038 +0000 UTC m=+1473.818005311" lastFinishedPulling="2026-03-17 00:46:43.35673865 +0000 UTC m=+1478.116190953" observedRunningTime="2026-03-17 00:46:43.820617784 +0000 UTC m=+1478.580070067" watchObservedRunningTime="2026-03-17 00:46:43.826220754 +0000 UTC m=+1478.585673037" Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.826978 4755 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.450951001 podStartE2EDuration="9.826972573s" podCreationTimestamp="2026-03-17 00:46:34 +0000 UTC" firstStartedPulling="2026-03-17 00:46:35.812295507 +0000 UTC m=+1470.571747820" lastFinishedPulling="2026-03-17 00:46:43.188317109 +0000 UTC m=+1477.947769392" observedRunningTime="2026-03-17 00:46:43.801999335 +0000 UTC m=+1478.561451628" watchObservedRunningTime="2026-03-17 00:46:43.826972573 +0000 UTC m=+1478.586424856" Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.843046 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9510ee8-4778-45d3-b000-011aa314ed40-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.843066 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xmpj\" (UniqueName: \"kubernetes.io/projected/e9510ee8-4778-45d3-b000-011aa314ed40-kube-api-access-6xmpj\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.843077 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9510ee8-4778-45d3-b000-011aa314ed40-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.848447 4755 scope.go:117] "RemoveContainer" containerID="42d28921900fee5040cb6f3e6dfa9bbd36834367402031c8da75dca950cbe8c9" Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.865658 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7mtrq"] Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.883560 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7mtrq"] Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.898373 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.786304868 podStartE2EDuration="6.898352065s" podCreationTimestamp="2026-03-17 00:46:37 +0000 UTC" firstStartedPulling="2026-03-17 00:46:39.164912916 +0000 UTC m=+1473.924365199" lastFinishedPulling="2026-03-17 00:46:43.276960093 +0000 UTC m=+1478.036412396" observedRunningTime="2026-03-17 00:46:43.871759533 +0000 UTC m=+1478.631211816" watchObservedRunningTime="2026-03-17 00:46:43.898352065 +0000 UTC m=+1478.657804348" Mar 17 00:46:43 crc kubenswrapper[4755]: I0317 00:46:43.986632 4755 scope.go:117] "RemoveContainer" containerID="5a8b43c5c7d702b843116168ddc049ecb45d2a0c0b235dc51daf680d15ea79ad" Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.102236 4755 scope.go:117] "RemoveContainer" containerID="583c41a6d7fb4e27c93c6186be10a5e8610376553403edf579c26f7e04e86cfa" Mar 17 00:46:44 crc kubenswrapper[4755]: E0317 00:46:44.107492 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"583c41a6d7fb4e27c93c6186be10a5e8610376553403edf579c26f7e04e86cfa\": container with ID starting with 583c41a6d7fb4e27c93c6186be10a5e8610376553403edf579c26f7e04e86cfa not found: ID does not exist" containerID="583c41a6d7fb4e27c93c6186be10a5e8610376553403edf579c26f7e04e86cfa" Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.107533 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"583c41a6d7fb4e27c93c6186be10a5e8610376553403edf579c26f7e04e86cfa"} err="failed to get container status \"583c41a6d7fb4e27c93c6186be10a5e8610376553403edf579c26f7e04e86cfa\": rpc error: code = NotFound desc = could not find container \"583c41a6d7fb4e27c93c6186be10a5e8610376553403edf579c26f7e04e86cfa\": container with ID starting with 583c41a6d7fb4e27c93c6186be10a5e8610376553403edf579c26f7e04e86cfa not found: ID does not exist" Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.107558 4755 scope.go:117] 
"RemoveContainer" containerID="42d28921900fee5040cb6f3e6dfa9bbd36834367402031c8da75dca950cbe8c9" Mar 17 00:46:44 crc kubenswrapper[4755]: E0317 00:46:44.108044 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42d28921900fee5040cb6f3e6dfa9bbd36834367402031c8da75dca950cbe8c9\": container with ID starting with 42d28921900fee5040cb6f3e6dfa9bbd36834367402031c8da75dca950cbe8c9 not found: ID does not exist" containerID="42d28921900fee5040cb6f3e6dfa9bbd36834367402031c8da75dca950cbe8c9" Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.108094 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42d28921900fee5040cb6f3e6dfa9bbd36834367402031c8da75dca950cbe8c9"} err="failed to get container status \"42d28921900fee5040cb6f3e6dfa9bbd36834367402031c8da75dca950cbe8c9\": rpc error: code = NotFound desc = could not find container \"42d28921900fee5040cb6f3e6dfa9bbd36834367402031c8da75dca950cbe8c9\": container with ID starting with 42d28921900fee5040cb6f3e6dfa9bbd36834367402031c8da75dca950cbe8c9 not found: ID does not exist" Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.108120 4755 scope.go:117] "RemoveContainer" containerID="5a8b43c5c7d702b843116168ddc049ecb45d2a0c0b235dc51daf680d15ea79ad" Mar 17 00:46:44 crc kubenswrapper[4755]: E0317 00:46:44.108538 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a8b43c5c7d702b843116168ddc049ecb45d2a0c0b235dc51daf680d15ea79ad\": container with ID starting with 5a8b43c5c7d702b843116168ddc049ecb45d2a0c0b235dc51daf680d15ea79ad not found: ID does not exist" containerID="5a8b43c5c7d702b843116168ddc049ecb45d2a0c0b235dc51daf680d15ea79ad" Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.108560 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5a8b43c5c7d702b843116168ddc049ecb45d2a0c0b235dc51daf680d15ea79ad"} err="failed to get container status \"5a8b43c5c7d702b843116168ddc049ecb45d2a0c0b235dc51daf680d15ea79ad\": rpc error: code = NotFound desc = could not find container \"5a8b43c5c7d702b843116168ddc049ecb45d2a0c0b235dc51daf680d15ea79ad\": container with ID starting with 5a8b43c5c7d702b843116168ddc049ecb45d2a0c0b235dc51daf680d15ea79ad not found: ID does not exist" Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.279246 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9510ee8-4778-45d3-b000-011aa314ed40" path="/var/lib/kubelet/pods/e9510ee8-4778-45d3-b000-011aa314ed40/volumes" Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.819024 4755 generic.go:334] "Generic (PLEG): container finished" podID="26745d92-a8cf-4130-bcb4-16746023aee3" containerID="ba22763049125683ba70ae7ef7ea3db4980fbebcaf6172268c62d7c15428a34b" exitCode=2 Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.819053 4755 generic.go:334] "Generic (PLEG): container finished" podID="26745d92-a8cf-4130-bcb4-16746023aee3" containerID="16c06b27ecad5ac2820ea4e275e020668e8365709c4d65eb8d6f32a69df45692" exitCode=0 Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.819060 4755 generic.go:334] "Generic (PLEG): container finished" podID="26745d92-a8cf-4130-bcb4-16746023aee3" containerID="7e826332cbade48c3ab2ce25963b6b5a1fcbc88a4b180b4d18347b21e7c7470f" exitCode=0 Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.819102 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26745d92-a8cf-4130-bcb4-16746023aee3","Type":"ContainerDied","Data":"ba22763049125683ba70ae7ef7ea3db4980fbebcaf6172268c62d7c15428a34b"} Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.819144 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"26745d92-a8cf-4130-bcb4-16746023aee3","Type":"ContainerDied","Data":"16c06b27ecad5ac2820ea4e275e020668e8365709c4d65eb8d6f32a69df45692"} Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.819155 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26745d92-a8cf-4130-bcb4-16746023aee3","Type":"ContainerDied","Data":"7e826332cbade48c3ab2ce25963b6b5a1fcbc88a4b180b4d18347b21e7c7470f"} Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.820197 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-48hw5" event={"ID":"65a98bfe-6430-4b2c-9cc4-4287439401b5","Type":"ContainerStarted","Data":"5a806caaf12ad40335ad10217e927d56da16c4dc530402dc6ae6eedcb44a2ce1"} Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.822482 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e03fb62-b8f7-42d1-8d80-070537f42dca","Type":"ContainerStarted","Data":"3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d"} Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.822626 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1e03fb62-b8f7-42d1-8d80-070537f42dca" containerName="nova-metadata-log" containerID="cri-o://38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1" gracePeriod=30 Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.822917 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1e03fb62-b8f7-42d1-8d80-070537f42dca" containerName="nova-metadata-metadata" containerID="cri-o://3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d" gracePeriod=30 Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.827402 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04","Type":"ContainerStarted","Data":"29d5d885d3f7233cf9b756798126b975090e2702869f6c10f23a74c20c535e2c"} Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.840677 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-48hw5" podStartSLOduration=5.840656622 podStartE2EDuration="5.840656622s" podCreationTimestamp="2026-03-17 00:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:46:44.831733383 +0000 UTC m=+1479.591185666" watchObservedRunningTime="2026-03-17 00:46:44.840656622 +0000 UTC m=+1479.600108905" Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.860857 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.003117077 podStartE2EDuration="6.860835793s" podCreationTimestamp="2026-03-17 00:46:38 +0000 UTC" firstStartedPulling="2026-03-17 00:46:39.427027206 +0000 UTC m=+1474.186479489" lastFinishedPulling="2026-03-17 00:46:43.284745912 +0000 UTC m=+1478.044198205" observedRunningTime="2026-03-17 00:46:44.852263923 +0000 UTC m=+1479.611716206" watchObservedRunningTime="2026-03-17 00:46:44.860835793 +0000 UTC m=+1479.620288076" Mar 17 00:46:44 crc kubenswrapper[4755]: I0317 00:46:44.885421 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.851604988 podStartE2EDuration="7.88540299s" podCreationTimestamp="2026-03-17 00:46:37 +0000 UTC" firstStartedPulling="2026-03-17 00:46:39.292725429 +0000 UTC m=+1474.052177732" lastFinishedPulling="2026-03-17 00:46:43.326523451 +0000 UTC m=+1478.085975734" observedRunningTime="2026-03-17 00:46:44.873053109 +0000 UTC m=+1479.632505392" watchObservedRunningTime="2026-03-17 00:46:44.88540299 +0000 UTC m=+1479.644855273" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 
00:46:45.337820 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.489741 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e03fb62-b8f7-42d1-8d80-070537f42dca-combined-ca-bundle\") pod \"1e03fb62-b8f7-42d1-8d80-070537f42dca\" (UID: \"1e03fb62-b8f7-42d1-8d80-070537f42dca\") " Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.489951 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hmtj\" (UniqueName: \"kubernetes.io/projected/1e03fb62-b8f7-42d1-8d80-070537f42dca-kube-api-access-7hmtj\") pod \"1e03fb62-b8f7-42d1-8d80-070537f42dca\" (UID: \"1e03fb62-b8f7-42d1-8d80-070537f42dca\") " Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.490084 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e03fb62-b8f7-42d1-8d80-070537f42dca-config-data\") pod \"1e03fb62-b8f7-42d1-8d80-070537f42dca\" (UID: \"1e03fb62-b8f7-42d1-8d80-070537f42dca\") " Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.490112 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e03fb62-b8f7-42d1-8d80-070537f42dca-logs\") pod \"1e03fb62-b8f7-42d1-8d80-070537f42dca\" (UID: \"1e03fb62-b8f7-42d1-8d80-070537f42dca\") " Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.490838 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e03fb62-b8f7-42d1-8d80-070537f42dca-logs" (OuterVolumeSpecName: "logs") pod "1e03fb62-b8f7-42d1-8d80-070537f42dca" (UID: "1e03fb62-b8f7-42d1-8d80-070537f42dca"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.495625 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e03fb62-b8f7-42d1-8d80-070537f42dca-kube-api-access-7hmtj" (OuterVolumeSpecName: "kube-api-access-7hmtj") pod "1e03fb62-b8f7-42d1-8d80-070537f42dca" (UID: "1e03fb62-b8f7-42d1-8d80-070537f42dca"). InnerVolumeSpecName "kube-api-access-7hmtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.534633 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e03fb62-b8f7-42d1-8d80-070537f42dca-config-data" (OuterVolumeSpecName: "config-data") pod "1e03fb62-b8f7-42d1-8d80-070537f42dca" (UID: "1e03fb62-b8f7-42d1-8d80-070537f42dca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.551884 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e03fb62-b8f7-42d1-8d80-070537f42dca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e03fb62-b8f7-42d1-8d80-070537f42dca" (UID: "1e03fb62-b8f7-42d1-8d80-070537f42dca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.593371 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e03fb62-b8f7-42d1-8d80-070537f42dca-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.593421 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e03fb62-b8f7-42d1-8d80-070537f42dca-logs\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.593464 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e03fb62-b8f7-42d1-8d80-070537f42dca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.593488 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hmtj\" (UniqueName: \"kubernetes.io/projected/1e03fb62-b8f7-42d1-8d80-070537f42dca-kube-api-access-7hmtj\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.854105 4755 generic.go:334] "Generic (PLEG): container finished" podID="1e03fb62-b8f7-42d1-8d80-070537f42dca" containerID="3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d" exitCode=0 Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.854544 4755 generic.go:334] "Generic (PLEG): container finished" podID="1e03fb62-b8f7-42d1-8d80-070537f42dca" containerID="38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1" exitCode=143 Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.854171 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e03fb62-b8f7-42d1-8d80-070537f42dca","Type":"ContainerDied","Data":"3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d"} Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.854337 4755 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.854683 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e03fb62-b8f7-42d1-8d80-070537f42dca","Type":"ContainerDied","Data":"38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1"} Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.854748 4755 scope.go:117] "RemoveContainer" containerID="3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.854773 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e03fb62-b8f7-42d1-8d80-070537f42dca","Type":"ContainerDied","Data":"ae9f4a612ef436cd5c41bb85b4a489c94f7a31560742274a203afddb8c237f00"} Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.894833 4755 scope.go:117] "RemoveContainer" containerID="38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.925688 4755 scope.go:117] "RemoveContainer" containerID="3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d" Mar 17 00:46:45 crc kubenswrapper[4755]: E0317 00:46:45.926156 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d\": container with ID starting with 3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d not found: ID does not exist" containerID="3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.926199 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d"} err="failed to get container status 
\"3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d\": rpc error: code = NotFound desc = could not find container \"3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d\": container with ID starting with 3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d not found: ID does not exist" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.926228 4755 scope.go:117] "RemoveContainer" containerID="38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1" Mar 17 00:46:45 crc kubenswrapper[4755]: E0317 00:46:45.926552 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1\": container with ID starting with 38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1 not found: ID does not exist" containerID="38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.926578 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1"} err="failed to get container status \"38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1\": rpc error: code = NotFound desc = could not find container \"38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1\": container with ID starting with 38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1 not found: ID does not exist" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.926593 4755 scope.go:117] "RemoveContainer" containerID="3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.927817 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d"} err="failed to get 
container status \"3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d\": rpc error: code = NotFound desc = could not find container \"3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d\": container with ID starting with 3a122a9d16202a9180204ea41cdbf02e45a94b835cb49d64d35c63f6ff42dc7d not found: ID does not exist" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.927849 4755 scope.go:117] "RemoveContainer" containerID="38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.928896 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1"} err="failed to get container status \"38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1\": rpc error: code = NotFound desc = could not find container \"38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1\": container with ID starting with 38abdbdc15b98aeab84e91323e64c1c34cc5af670be88f27c75ac85578172ea1 not found: ID does not exist" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.932388 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.961303 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.970527 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:46:45 crc kubenswrapper[4755]: E0317 00:46:45.971004 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9510ee8-4778-45d3-b000-011aa314ed40" containerName="extract-content" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.971022 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9510ee8-4778-45d3-b000-011aa314ed40" containerName="extract-content" Mar 17 00:46:45 crc 
kubenswrapper[4755]: E0317 00:46:45.971040 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e03fb62-b8f7-42d1-8d80-070537f42dca" containerName="nova-metadata-metadata" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.971046 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e03fb62-b8f7-42d1-8d80-070537f42dca" containerName="nova-metadata-metadata" Mar 17 00:46:45 crc kubenswrapper[4755]: E0317 00:46:45.971066 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e03fb62-b8f7-42d1-8d80-070537f42dca" containerName="nova-metadata-log" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.971072 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e03fb62-b8f7-42d1-8d80-070537f42dca" containerName="nova-metadata-log" Mar 17 00:46:45 crc kubenswrapper[4755]: E0317 00:46:45.971089 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9510ee8-4778-45d3-b000-011aa314ed40" containerName="registry-server" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.971094 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9510ee8-4778-45d3-b000-011aa314ed40" containerName="registry-server" Mar 17 00:46:45 crc kubenswrapper[4755]: E0317 00:46:45.971114 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9510ee8-4778-45d3-b000-011aa314ed40" containerName="extract-utilities" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.971120 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9510ee8-4778-45d3-b000-011aa314ed40" containerName="extract-utilities" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.971295 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e03fb62-b8f7-42d1-8d80-070537f42dca" containerName="nova-metadata-log" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.971314 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e03fb62-b8f7-42d1-8d80-070537f42dca" 
containerName="nova-metadata-metadata" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.971333 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9510ee8-4778-45d3-b000-011aa314ed40" containerName="registry-server" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.972424 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.975376 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.975574 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 17 00:46:45 crc kubenswrapper[4755]: I0317 00:46:45.980058 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.112551 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " pod="openstack/nova-metadata-0" Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.112932 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " pod="openstack/nova-metadata-0" Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.113134 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79a28ad5-05cc-4f87-adc2-2f620048df70-logs\") pod \"nova-metadata-0\" (UID: 
\"79a28ad5-05cc-4f87-adc2-2f620048df70\") " pod="openstack/nova-metadata-0" Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.113244 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-config-data\") pod \"nova-metadata-0\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " pod="openstack/nova-metadata-0" Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.113344 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz5hw\" (UniqueName: \"kubernetes.io/projected/79a28ad5-05cc-4f87-adc2-2f620048df70-kube-api-access-cz5hw\") pod \"nova-metadata-0\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " pod="openstack/nova-metadata-0" Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.216904 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz5hw\" (UniqueName: \"kubernetes.io/projected/79a28ad5-05cc-4f87-adc2-2f620048df70-kube-api-access-cz5hw\") pod \"nova-metadata-0\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " pod="openstack/nova-metadata-0" Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.217068 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " pod="openstack/nova-metadata-0" Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.217115 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " pod="openstack/nova-metadata-0" Mar 17 00:46:46 crc 
kubenswrapper[4755]: I0317 00:46:46.217531 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79a28ad5-05cc-4f87-adc2-2f620048df70-logs\") pod \"nova-metadata-0\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " pod="openstack/nova-metadata-0" Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.217701 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-config-data\") pod \"nova-metadata-0\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " pod="openstack/nova-metadata-0" Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.218116 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79a28ad5-05cc-4f87-adc2-2f620048df70-logs\") pod \"nova-metadata-0\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " pod="openstack/nova-metadata-0" Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.222955 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-config-data\") pod \"nova-metadata-0\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " pod="openstack/nova-metadata-0" Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.223052 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " pod="openstack/nova-metadata-0" Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.223656 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " pod="openstack/nova-metadata-0" Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.254056 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz5hw\" (UniqueName: \"kubernetes.io/projected/79a28ad5-05cc-4f87-adc2-2f620048df70-kube-api-access-cz5hw\") pod \"nova-metadata-0\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " pod="openstack/nova-metadata-0" Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.266105 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e03fb62-b8f7-42d1-8d80-070537f42dca" path="/var/lib/kubelet/pods/1e03fb62-b8f7-42d1-8d80-070537f42dca/volumes" Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.290784 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.794015 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:46:46 crc kubenswrapper[4755]: W0317 00:46:46.807312 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79a28ad5_05cc_4f87_adc2_2f620048df70.slice/crio-44880b25eaf4dbf25bc2d897f82661444fd09659f4e3cc31223d910d9340cb7a WatchSource:0}: Error finding container 44880b25eaf4dbf25bc2d897f82661444fd09659f4e3cc31223d910d9340cb7a: Status 404 returned error can't find the container with id 44880b25eaf4dbf25bc2d897f82661444fd09659f4e3cc31223d910d9340cb7a Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.872863 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79a28ad5-05cc-4f87-adc2-2f620048df70","Type":"ContainerStarted","Data":"44880b25eaf4dbf25bc2d897f82661444fd09659f4e3cc31223d910d9340cb7a"} Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.875199 4755 generic.go:334] "Generic (PLEG): container 
finished" podID="7db40418-904e-4974-b8d3-f23a2cb94080" containerID="50c512cdaee121322ce9a340513e51b6d250a091830bb1e38cb410c551420d75" exitCode=0 Mar 17 00:46:46 crc kubenswrapper[4755]: I0317 00:46:46.875256 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7chjf" event={"ID":"7db40418-904e-4974-b8d3-f23a2cb94080","Type":"ContainerDied","Data":"50c512cdaee121322ce9a340513e51b6d250a091830bb1e38cb410c551420d75"} Mar 17 00:46:47 crc kubenswrapper[4755]: I0317 00:46:47.889263 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79a28ad5-05cc-4f87-adc2-2f620048df70","Type":"ContainerStarted","Data":"91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230"} Mar 17 00:46:47 crc kubenswrapper[4755]: I0317 00:46:47.891790 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79a28ad5-05cc-4f87-adc2-2f620048df70","Type":"ContainerStarted","Data":"f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a"} Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.202493 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.202521 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.233719 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.233994 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.270335 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.270314662 podStartE2EDuration="3.270314662s" 
podCreationTimestamp="2026-03-17 00:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:46:47.92261276 +0000 UTC m=+1482.682065083" watchObservedRunningTime="2026-03-17 00:46:48.270314662 +0000 UTC m=+1483.029766945" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.327024 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7chjf" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.344850 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.344913 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.382527 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8chkc\" (UniqueName: \"kubernetes.io/projected/7db40418-904e-4974-b8d3-f23a2cb94080-kube-api-access-8chkc\") pod \"7db40418-904e-4974-b8d3-f23a2cb94080\" (UID: \"7db40418-904e-4974-b8d3-f23a2cb94080\") " Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.382566 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-config-data\") pod \"7db40418-904e-4974-b8d3-f23a2cb94080\" (UID: \"7db40418-904e-4974-b8d3-f23a2cb94080\") " Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.382665 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-combined-ca-bundle\") pod \"7db40418-904e-4974-b8d3-f23a2cb94080\" (UID: \"7db40418-904e-4974-b8d3-f23a2cb94080\") " Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.382714 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-scripts\") pod \"7db40418-904e-4974-b8d3-f23a2cb94080\" (UID: \"7db40418-904e-4974-b8d3-f23a2cb94080\") " Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.391648 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-scripts" (OuterVolumeSpecName: "scripts") pod "7db40418-904e-4974-b8d3-f23a2cb94080" (UID: "7db40418-904e-4974-b8d3-f23a2cb94080"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.405009 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7db40418-904e-4974-b8d3-f23a2cb94080-kube-api-access-8chkc" (OuterVolumeSpecName: "kube-api-access-8chkc") pod "7db40418-904e-4974-b8d3-f23a2cb94080" (UID: "7db40418-904e-4974-b8d3-f23a2cb94080"). InnerVolumeSpecName "kube-api-access-8chkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.421583 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-config-data" (OuterVolumeSpecName: "config-data") pod "7db40418-904e-4974-b8d3-f23a2cb94080" (UID: "7db40418-904e-4974-b8d3-f23a2cb94080"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.424278 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7db40418-904e-4974-b8d3-f23a2cb94080" (UID: "7db40418-904e-4974-b8d3-f23a2cb94080"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.484707 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8chkc\" (UniqueName: \"kubernetes.io/projected/7db40418-904e-4974-b8d3-f23a2cb94080-kube-api-access-8chkc\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.484736 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.484745 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.484753 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db40418-904e-4974-b8d3-f23a2cb94080-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.724490 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.807792 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-xbllf"] Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.807991 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" podUID="d5f0edaa-22b4-4862-b0f1-c6dfef316566" containerName="dnsmasq-dns" containerID="cri-o://8fe1d5ee36c2a06385b80d9bf9651844f6e9b89d7d1e7b2a4ca14e109d480076" gracePeriod=10 Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.911494 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7chjf" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.919391 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7chjf" event={"ID":"7db40418-904e-4974-b8d3-f23a2cb94080","Type":"ContainerDied","Data":"6781010c458a75014cd7a9aab9474fa4e1f29c40b011bd7c5c4bdae80e7cee6a"} Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.919454 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6781010c458a75014cd7a9aab9474fa4e1f29c40b011bd7c5c4bdae80e7cee6a" Mar 17 00:46:48 crc kubenswrapper[4755]: I0317 00:46:48.992610 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.076546 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.076781 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ad2f6737-e26f-401b-b6b0-1b6b37d3ee04" containerName="nova-api-log" containerID="cri-o://4d07707f23c7f4f178eba4c56adbb239fa98e5c55483b2f90f324555e890cc8e" gracePeriod=30 Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.076870 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ad2f6737-e26f-401b-b6b0-1b6b37d3ee04" containerName="nova-api-api" containerID="cri-o://29d5d885d3f7233cf9b756798126b975090e2702869f6c10f23a74c20c535e2c" gracePeriod=30 Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.081885 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ad2f6737-e26f-401b-b6b0-1b6b37d3ee04" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.230:8774/\": EOF" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.081896 4755 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="ad2f6737-e26f-401b-b6b0-1b6b37d3ee04" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.230:8774/\": EOF" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.110504 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.371708 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.421569 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-ovsdbserver-sb\") pod \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.421729 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-ovsdbserver-nb\") pod \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.421797 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-dns-svc\") pod \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.421818 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-dns-swift-storage-0\") pod \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 
00:46:49.421838 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-config\") pod \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.421901 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v88lv\" (UniqueName: \"kubernetes.io/projected/d5f0edaa-22b4-4862-b0f1-c6dfef316566-kube-api-access-v88lv\") pod \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\" (UID: \"d5f0edaa-22b4-4862-b0f1-c6dfef316566\") " Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.426862 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5f0edaa-22b4-4862-b0f1-c6dfef316566-kube-api-access-v88lv" (OuterVolumeSpecName: "kube-api-access-v88lv") pod "d5f0edaa-22b4-4862-b0f1-c6dfef316566" (UID: "d5f0edaa-22b4-4862-b0f1-c6dfef316566"). InnerVolumeSpecName "kube-api-access-v88lv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.494966 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.496893 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d5f0edaa-22b4-4862-b0f1-c6dfef316566" (UID: "d5f0edaa-22b4-4862-b0f1-c6dfef316566"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.501125 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d5f0edaa-22b4-4862-b0f1-c6dfef316566" (UID: "d5f0edaa-22b4-4862-b0f1-c6dfef316566"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.519978 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-config" (OuterVolumeSpecName: "config") pod "d5f0edaa-22b4-4862-b0f1-c6dfef316566" (UID: "d5f0edaa-22b4-4862-b0f1-c6dfef316566"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.520010 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d5f0edaa-22b4-4862-b0f1-c6dfef316566" (UID: "d5f0edaa-22b4-4862-b0f1-c6dfef316566"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.525017 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.525053 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.525068 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.525080 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.525091 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v88lv\" (UniqueName: \"kubernetes.io/projected/d5f0edaa-22b4-4862-b0f1-c6dfef316566-kube-api-access-v88lv\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.547042 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d5f0edaa-22b4-4862-b0f1-c6dfef316566" (UID: "d5f0edaa-22b4-4862-b0f1-c6dfef316566"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.627342 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5f0edaa-22b4-4862-b0f1-c6dfef316566-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.924011 4755 generic.go:334] "Generic (PLEG): container finished" podID="d5f0edaa-22b4-4862-b0f1-c6dfef316566" containerID="8fe1d5ee36c2a06385b80d9bf9651844f6e9b89d7d1e7b2a4ca14e109d480076" exitCode=0 Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.924073 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.924096 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" event={"ID":"d5f0edaa-22b4-4862-b0f1-c6dfef316566","Type":"ContainerDied","Data":"8fe1d5ee36c2a06385b80d9bf9651844f6e9b89d7d1e7b2a4ca14e109d480076"} Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.924485 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" event={"ID":"d5f0edaa-22b4-4862-b0f1-c6dfef316566","Type":"ContainerDied","Data":"4b6701facf5b2f39da611c94899fda873e75eba69937bee35011d3614eaa123f"} Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.924503 4755 scope.go:117] "RemoveContainer" containerID="8fe1d5ee36c2a06385b80d9bf9651844f6e9b89d7d1e7b2a4ca14e109d480076" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.927789 4755 generic.go:334] "Generic (PLEG): container finished" podID="ad2f6737-e26f-401b-b6b0-1b6b37d3ee04" containerID="4d07707f23c7f4f178eba4c56adbb239fa98e5c55483b2f90f324555e890cc8e" exitCode=143 Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.927866 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04","Type":"ContainerDied","Data":"4d07707f23c7f4f178eba4c56adbb239fa98e5c55483b2f90f324555e890cc8e"} Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.958674 4755 scope.go:117] "RemoveContainer" containerID="1e92d1fbb3bda1b51b3545486ad4f30c2727965793f6fa5dcc038007fa168346" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.979470 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-xbllf"] Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.988552 4755 scope.go:117] "RemoveContainer" containerID="8fe1d5ee36c2a06385b80d9bf9651844f6e9b89d7d1e7b2a4ca14e109d480076" Mar 17 00:46:49 crc kubenswrapper[4755]: E0317 00:46:49.989018 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fe1d5ee36c2a06385b80d9bf9651844f6e9b89d7d1e7b2a4ca14e109d480076\": container with ID starting with 8fe1d5ee36c2a06385b80d9bf9651844f6e9b89d7d1e7b2a4ca14e109d480076 not found: ID does not exist" containerID="8fe1d5ee36c2a06385b80d9bf9651844f6e9b89d7d1e7b2a4ca14e109d480076" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.989096 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe1d5ee36c2a06385b80d9bf9651844f6e9b89d7d1e7b2a4ca14e109d480076"} err="failed to get container status \"8fe1d5ee36c2a06385b80d9bf9651844f6e9b89d7d1e7b2a4ca14e109d480076\": rpc error: code = NotFound desc = could not find container \"8fe1d5ee36c2a06385b80d9bf9651844f6e9b89d7d1e7b2a4ca14e109d480076\": container with ID starting with 8fe1d5ee36c2a06385b80d9bf9651844f6e9b89d7d1e7b2a4ca14e109d480076 not found: ID does not exist" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.989139 4755 scope.go:117] "RemoveContainer" containerID="1e92d1fbb3bda1b51b3545486ad4f30c2727965793f6fa5dcc038007fa168346" Mar 17 00:46:49 crc kubenswrapper[4755]: E0317 00:46:49.989524 4755 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e92d1fbb3bda1b51b3545486ad4f30c2727965793f6fa5dcc038007fa168346\": container with ID starting with 1e92d1fbb3bda1b51b3545486ad4f30c2727965793f6fa5dcc038007fa168346 not found: ID does not exist" containerID="1e92d1fbb3bda1b51b3545486ad4f30c2727965793f6fa5dcc038007fa168346" Mar 17 00:46:49 crc kubenswrapper[4755]: I0317 00:46:49.989574 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e92d1fbb3bda1b51b3545486ad4f30c2727965793f6fa5dcc038007fa168346"} err="failed to get container status \"1e92d1fbb3bda1b51b3545486ad4f30c2727965793f6fa5dcc038007fa168346\": rpc error: code = NotFound desc = could not find container \"1e92d1fbb3bda1b51b3545486ad4f30c2727965793f6fa5dcc038007fa168346\": container with ID starting with 1e92d1fbb3bda1b51b3545486ad4f30c2727965793f6fa5dcc038007fa168346 not found: ID does not exist" Mar 17 00:46:50 crc kubenswrapper[4755]: I0317 00:46:50.001193 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-xbllf"] Mar 17 00:46:50 crc kubenswrapper[4755]: I0317 00:46:50.261150 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5f0edaa-22b4-4862-b0f1-c6dfef316566" path="/var/lib/kubelet/pods/d5f0edaa-22b4-4862-b0f1-c6dfef316566/volumes" Mar 17 00:46:50 crc kubenswrapper[4755]: I0317 00:46:50.943790 4755 generic.go:334] "Generic (PLEG): container finished" podID="65a98bfe-6430-4b2c-9cc4-4287439401b5" containerID="5a806caaf12ad40335ad10217e927d56da16c4dc530402dc6ae6eedcb44a2ce1" exitCode=0 Mar 17 00:46:50 crc kubenswrapper[4755]: I0317 00:46:50.943906 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-48hw5" event={"ID":"65a98bfe-6430-4b2c-9cc4-4287439401b5","Type":"ContainerDied","Data":"5a806caaf12ad40335ad10217e927d56da16c4dc530402dc6ae6eedcb44a2ce1"} Mar 17 00:46:50 crc 
kubenswrapper[4755]: I0317 00:46:50.946803 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="79a28ad5-05cc-4f87-adc2-2f620048df70" containerName="nova-metadata-log" containerID="cri-o://f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a" gracePeriod=30 Mar 17 00:46:50 crc kubenswrapper[4755]: I0317 00:46:50.946851 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="79a28ad5-05cc-4f87-adc2-2f620048df70" containerName="nova-metadata-metadata" containerID="cri-o://91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230" gracePeriod=30 Mar 17 00:46:50 crc kubenswrapper[4755]: I0317 00:46:50.947290 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="01629a09-9d7a-410c-b4b1-789ee46439b2" containerName="nova-scheduler-scheduler" containerID="cri-o://3f5922ba5ffc43befb51b48a7283394b5dce7fde31fc6dd763ef15dd8c21668e" gracePeriod=30 Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.609029 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.666392 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-config-data\") pod \"79a28ad5-05cc-4f87-adc2-2f620048df70\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.666615 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz5hw\" (UniqueName: \"kubernetes.io/projected/79a28ad5-05cc-4f87-adc2-2f620048df70-kube-api-access-cz5hw\") pod \"79a28ad5-05cc-4f87-adc2-2f620048df70\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.666691 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-combined-ca-bundle\") pod \"79a28ad5-05cc-4f87-adc2-2f620048df70\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.666744 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-nova-metadata-tls-certs\") pod \"79a28ad5-05cc-4f87-adc2-2f620048df70\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.666914 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79a28ad5-05cc-4f87-adc2-2f620048df70-logs\") pod \"79a28ad5-05cc-4f87-adc2-2f620048df70\" (UID: \"79a28ad5-05cc-4f87-adc2-2f620048df70\") " Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.667721 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/79a28ad5-05cc-4f87-adc2-2f620048df70-logs" (OuterVolumeSpecName: "logs") pod "79a28ad5-05cc-4f87-adc2-2f620048df70" (UID: "79a28ad5-05cc-4f87-adc2-2f620048df70"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.673692 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79a28ad5-05cc-4f87-adc2-2f620048df70-kube-api-access-cz5hw" (OuterVolumeSpecName: "kube-api-access-cz5hw") pod "79a28ad5-05cc-4f87-adc2-2f620048df70" (UID: "79a28ad5-05cc-4f87-adc2-2f620048df70"). InnerVolumeSpecName "kube-api-access-cz5hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.724811 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-config-data" (OuterVolumeSpecName: "config-data") pod "79a28ad5-05cc-4f87-adc2-2f620048df70" (UID: "79a28ad5-05cc-4f87-adc2-2f620048df70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.735683 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79a28ad5-05cc-4f87-adc2-2f620048df70" (UID: "79a28ad5-05cc-4f87-adc2-2f620048df70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.765526 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "79a28ad5-05cc-4f87-adc2-2f620048df70" (UID: "79a28ad5-05cc-4f87-adc2-2f620048df70"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.769163 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.769219 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz5hw\" (UniqueName: \"kubernetes.io/projected/79a28ad5-05cc-4f87-adc2-2f620048df70-kube-api-access-cz5hw\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.769231 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.769240 4755 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79a28ad5-05cc-4f87-adc2-2f620048df70-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.769252 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79a28ad5-05cc-4f87-adc2-2f620048df70-logs\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.791142 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.870055 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7974\" (UniqueName: \"kubernetes.io/projected/01629a09-9d7a-410c-b4b1-789ee46439b2-kube-api-access-g7974\") pod \"01629a09-9d7a-410c-b4b1-789ee46439b2\" (UID: \"01629a09-9d7a-410c-b4b1-789ee46439b2\") " Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.870114 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01629a09-9d7a-410c-b4b1-789ee46439b2-combined-ca-bundle\") pod \"01629a09-9d7a-410c-b4b1-789ee46439b2\" (UID: \"01629a09-9d7a-410c-b4b1-789ee46439b2\") " Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.870256 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01629a09-9d7a-410c-b4b1-789ee46439b2-config-data\") pod \"01629a09-9d7a-410c-b4b1-789ee46439b2\" (UID: \"01629a09-9d7a-410c-b4b1-789ee46439b2\") " Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.873350 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01629a09-9d7a-410c-b4b1-789ee46439b2-kube-api-access-g7974" (OuterVolumeSpecName: "kube-api-access-g7974") pod "01629a09-9d7a-410c-b4b1-789ee46439b2" (UID: "01629a09-9d7a-410c-b4b1-789ee46439b2"). InnerVolumeSpecName "kube-api-access-g7974". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.894822 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01629a09-9d7a-410c-b4b1-789ee46439b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01629a09-9d7a-410c-b4b1-789ee46439b2" (UID: "01629a09-9d7a-410c-b4b1-789ee46439b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.902990 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01629a09-9d7a-410c-b4b1-789ee46439b2-config-data" (OuterVolumeSpecName: "config-data") pod "01629a09-9d7a-410c-b4b1-789ee46439b2" (UID: "01629a09-9d7a-410c-b4b1-789ee46439b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.960039 4755 generic.go:334] "Generic (PLEG): container finished" podID="01629a09-9d7a-410c-b4b1-789ee46439b2" containerID="3f5922ba5ffc43befb51b48a7283394b5dce7fde31fc6dd763ef15dd8c21668e" exitCode=0 Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.960105 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"01629a09-9d7a-410c-b4b1-789ee46439b2","Type":"ContainerDied","Data":"3f5922ba5ffc43befb51b48a7283394b5dce7fde31fc6dd763ef15dd8c21668e"} Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.960134 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"01629a09-9d7a-410c-b4b1-789ee46439b2","Type":"ContainerDied","Data":"18657712fc4bca0f0508622d6aef51ea22355049d3546a8c4eaa924e1579bf1c"} Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.960154 4755 scope.go:117] "RemoveContainer" containerID="3f5922ba5ffc43befb51b48a7283394b5dce7fde31fc6dd763ef15dd8c21668e" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.960248 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.965803 4755 generic.go:334] "Generic (PLEG): container finished" podID="79a28ad5-05cc-4f87-adc2-2f620048df70" containerID="91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230" exitCode=0 Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.965862 4755 generic.go:334] "Generic (PLEG): container finished" podID="79a28ad5-05cc-4f87-adc2-2f620048df70" containerID="f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a" exitCode=143 Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.965854 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79a28ad5-05cc-4f87-adc2-2f620048df70","Type":"ContainerDied","Data":"91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230"} Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.965889 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.965897 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79a28ad5-05cc-4f87-adc2-2f620048df70","Type":"ContainerDied","Data":"f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a"} Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.966189 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79a28ad5-05cc-4f87-adc2-2f620048df70","Type":"ContainerDied","Data":"44880b25eaf4dbf25bc2d897f82661444fd09659f4e3cc31223d910d9340cb7a"} Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.973851 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7974\" (UniqueName: \"kubernetes.io/projected/01629a09-9d7a-410c-b4b1-789ee46439b2-kube-api-access-g7974\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.974007 4755 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01629a09-9d7a-410c-b4b1-789ee46439b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.974088 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01629a09-9d7a-410c-b4b1-789ee46439b2-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.996009 4755 scope.go:117] "RemoveContainer" containerID="3f5922ba5ffc43befb51b48a7283394b5dce7fde31fc6dd763ef15dd8c21668e" Mar 17 00:46:51 crc kubenswrapper[4755]: E0317 00:46:51.996447 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f5922ba5ffc43befb51b48a7283394b5dce7fde31fc6dd763ef15dd8c21668e\": container with ID starting with 3f5922ba5ffc43befb51b48a7283394b5dce7fde31fc6dd763ef15dd8c21668e not found: ID does not exist" containerID="3f5922ba5ffc43befb51b48a7283394b5dce7fde31fc6dd763ef15dd8c21668e" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.996491 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f5922ba5ffc43befb51b48a7283394b5dce7fde31fc6dd763ef15dd8c21668e"} err="failed to get container status \"3f5922ba5ffc43befb51b48a7283394b5dce7fde31fc6dd763ef15dd8c21668e\": rpc error: code = NotFound desc = could not find container \"3f5922ba5ffc43befb51b48a7283394b5dce7fde31fc6dd763ef15dd8c21668e\": container with ID starting with 3f5922ba5ffc43befb51b48a7283394b5dce7fde31fc6dd763ef15dd8c21668e not found: ID does not exist" Mar 17 00:46:51 crc kubenswrapper[4755]: I0317 00:46:51.996530 4755 scope.go:117] "RemoveContainer" containerID="91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230" Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.035328 4755 scope.go:117] "RemoveContainer" 
containerID="f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a" Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.039424 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.049922 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.061190 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.076502 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.087726 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 00:46:52 crc kubenswrapper[4755]: E0317 00:46:52.089029 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a28ad5-05cc-4f87-adc2-2f620048df70" containerName="nova-metadata-metadata" Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.089053 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a28ad5-05cc-4f87-adc2-2f620048df70" containerName="nova-metadata-metadata" Mar 17 00:46:52 crc kubenswrapper[4755]: E0317 00:46:52.089070 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f0edaa-22b4-4862-b0f1-c6dfef316566" containerName="init" Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.089077 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f0edaa-22b4-4862-b0f1-c6dfef316566" containerName="init" Mar 17 00:46:52 crc kubenswrapper[4755]: E0317 00:46:52.089090 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f0edaa-22b4-4862-b0f1-c6dfef316566" containerName="dnsmasq-dns" Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.089096 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f0edaa-22b4-4862-b0f1-c6dfef316566" 
containerName="dnsmasq-dns" Mar 17 00:46:52 crc kubenswrapper[4755]: E0317 00:46:52.089121 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01629a09-9d7a-410c-b4b1-789ee46439b2" containerName="nova-scheduler-scheduler" Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.089127 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="01629a09-9d7a-410c-b4b1-789ee46439b2" containerName="nova-scheduler-scheduler" Mar 17 00:46:52 crc kubenswrapper[4755]: E0317 00:46:52.089139 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db40418-904e-4974-b8d3-f23a2cb94080" containerName="nova-manage" Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.089145 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db40418-904e-4974-b8d3-f23a2cb94080" containerName="nova-manage" Mar 17 00:46:52 crc kubenswrapper[4755]: E0317 00:46:52.089164 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a28ad5-05cc-4f87-adc2-2f620048df70" containerName="nova-metadata-log" Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.089170 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a28ad5-05cc-4f87-adc2-2f620048df70" containerName="nova-metadata-log" Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.089364 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="01629a09-9d7a-410c-b4b1-789ee46439b2" containerName="nova-scheduler-scheduler" Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.089383 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a28ad5-05cc-4f87-adc2-2f620048df70" containerName="nova-metadata-log" Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.089396 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a28ad5-05cc-4f87-adc2-2f620048df70" containerName="nova-metadata-metadata" Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.089407 4755 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d5f0edaa-22b4-4862-b0f1-c6dfef316566" containerName="dnsmasq-dns"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.089417 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db40418-904e-4974-b8d3-f23a2cb94080" containerName="nova-manage"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.095654 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.097770 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.100735 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.114090 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.118015 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.122074 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.122401 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.128998 4755 scope.go:117] "RemoveContainer" containerID="91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.129142 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 17 00:46:52 crc kubenswrapper[4755]: E0317 00:46:52.129869 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230\": container with ID starting with 91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230 not found: ID does not exist" containerID="91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.129895 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230"} err="failed to get container status \"91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230\": rpc error: code = NotFound desc = could not find container \"91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230\": container with ID starting with 91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230 not found: ID does not exist"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.129916 4755 scope.go:117] "RemoveContainer" containerID="f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a"
Mar 17 00:46:52 crc kubenswrapper[4755]: E0317 00:46:52.130211 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a\": container with ID starting with f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a not found: ID does not exist" containerID="f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.130258 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a"} err="failed to get container status \"f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a\": rpc error: code = NotFound desc = could not find container \"f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a\": container with ID starting with f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a not found: ID does not exist"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.130290 4755 scope.go:117] "RemoveContainer" containerID="91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.131154 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230"} err="failed to get container status \"91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230\": rpc error: code = NotFound desc = could not find container \"91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230\": container with ID starting with 91b964222b2a59a8e1399cea3f7d678d3317d580a3dc53461f5d2c19b873e230 not found: ID does not exist"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.131177 4755 scope.go:117] "RemoveContainer" containerID="f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.131423 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a"} err="failed to get container status \"f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a\": rpc error: code = NotFound desc = could not find container \"f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a\": container with ID starting with f25ed9bfbf133790582fa30bfa49f37590fa23bcbf582196252462e014de821a not found: ID does not exist"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.267809 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01629a09-9d7a-410c-b4b1-789ee46439b2" path="/var/lib/kubelet/pods/01629a09-9d7a-410c-b4b1-789ee46439b2/volumes"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.268971 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79a28ad5-05cc-4f87-adc2-2f620048df70" path="/var/lib/kubelet/pods/79a28ad5-05cc-4f87-adc2-2f620048df70/volumes"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.283189 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " pod="openstack/nova-metadata-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.283232 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d88d97d-8539-47c3-a55f-deb71af2acbc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9d88d97d-8539-47c3-a55f-deb71af2acbc\") " pod="openstack/nova-scheduler-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.283260 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d88d97d-8539-47c3-a55f-deb71af2acbc-config-data\") pod \"nova-scheduler-0\" (UID: \"9d88d97d-8539-47c3-a55f-deb71af2acbc\") " pod="openstack/nova-scheduler-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.283298 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk7f8\" (UniqueName: \"kubernetes.io/projected/9d88d97d-8539-47c3-a55f-deb71af2acbc-kube-api-access-hk7f8\") pod \"nova-scheduler-0\" (UID: \"9d88d97d-8539-47c3-a55f-deb71af2acbc\") " pod="openstack/nova-scheduler-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.283332 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11fa943d-490d-47c8-b14d-c250dca5c388-logs\") pod \"nova-metadata-0\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " pod="openstack/nova-metadata-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.283530 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-config-data\") pod \"nova-metadata-0\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " pod="openstack/nova-metadata-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.283723 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " pod="openstack/nova-metadata-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.283969 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5dgk\" (UniqueName: \"kubernetes.io/projected/11fa943d-490d-47c8-b14d-c250dca5c388-kube-api-access-f5dgk\") pod \"nova-metadata-0\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " pod="openstack/nova-metadata-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.372544 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-48hw5"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.385032 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-scripts\") pod \"65a98bfe-6430-4b2c-9cc4-4287439401b5\" (UID: \"65a98bfe-6430-4b2c-9cc4-4287439401b5\") "
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.385098 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc7vt\" (UniqueName: \"kubernetes.io/projected/65a98bfe-6430-4b2c-9cc4-4287439401b5-kube-api-access-mc7vt\") pod \"65a98bfe-6430-4b2c-9cc4-4287439401b5\" (UID: \"65a98bfe-6430-4b2c-9cc4-4287439401b5\") "
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.385168 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-config-data\") pod \"65a98bfe-6430-4b2c-9cc4-4287439401b5\" (UID: \"65a98bfe-6430-4b2c-9cc4-4287439401b5\") "
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.385196 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-combined-ca-bundle\") pod \"65a98bfe-6430-4b2c-9cc4-4287439401b5\" (UID: \"65a98bfe-6430-4b2c-9cc4-4287439401b5\") "
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.385462 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " pod="openstack/nova-metadata-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.385490 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d88d97d-8539-47c3-a55f-deb71af2acbc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9d88d97d-8539-47c3-a55f-deb71af2acbc\") " pod="openstack/nova-scheduler-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.385513 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d88d97d-8539-47c3-a55f-deb71af2acbc-config-data\") pod \"nova-scheduler-0\" (UID: \"9d88d97d-8539-47c3-a55f-deb71af2acbc\") " pod="openstack/nova-scheduler-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.385554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk7f8\" (UniqueName: \"kubernetes.io/projected/9d88d97d-8539-47c3-a55f-deb71af2acbc-kube-api-access-hk7f8\") pod \"nova-scheduler-0\" (UID: \"9d88d97d-8539-47c3-a55f-deb71af2acbc\") " pod="openstack/nova-scheduler-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.385581 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11fa943d-490d-47c8-b14d-c250dca5c388-logs\") pod \"nova-metadata-0\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " pod="openstack/nova-metadata-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.385630 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-config-data\") pod \"nova-metadata-0\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " pod="openstack/nova-metadata-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.385698 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " pod="openstack/nova-metadata-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.385771 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5dgk\" (UniqueName: \"kubernetes.io/projected/11fa943d-490d-47c8-b14d-c250dca5c388-kube-api-access-f5dgk\") pod \"nova-metadata-0\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " pod="openstack/nova-metadata-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.386101 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11fa943d-490d-47c8-b14d-c250dca5c388-logs\") pod \"nova-metadata-0\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " pod="openstack/nova-metadata-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.391627 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-scripts" (OuterVolumeSpecName: "scripts") pod "65a98bfe-6430-4b2c-9cc4-4287439401b5" (UID: "65a98bfe-6430-4b2c-9cc4-4287439401b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.393657 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a98bfe-6430-4b2c-9cc4-4287439401b5-kube-api-access-mc7vt" (OuterVolumeSpecName: "kube-api-access-mc7vt") pod "65a98bfe-6430-4b2c-9cc4-4287439401b5" (UID: "65a98bfe-6430-4b2c-9cc4-4287439401b5"). InnerVolumeSpecName "kube-api-access-mc7vt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.394003 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d88d97d-8539-47c3-a55f-deb71af2acbc-config-data\") pod \"nova-scheduler-0\" (UID: \"9d88d97d-8539-47c3-a55f-deb71af2acbc\") " pod="openstack/nova-scheduler-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.394717 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " pod="openstack/nova-metadata-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.396336 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " pod="openstack/nova-metadata-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.400582 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d88d97d-8539-47c3-a55f-deb71af2acbc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9d88d97d-8539-47c3-a55f-deb71af2acbc\") " pod="openstack/nova-scheduler-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.401426 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-config-data\") pod \"nova-metadata-0\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " pod="openstack/nova-metadata-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.403372 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk7f8\" (UniqueName: \"kubernetes.io/projected/9d88d97d-8539-47c3-a55f-deb71af2acbc-kube-api-access-hk7f8\") pod \"nova-scheduler-0\" (UID: \"9d88d97d-8539-47c3-a55f-deb71af2acbc\") " pod="openstack/nova-scheduler-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.414050 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5dgk\" (UniqueName: \"kubernetes.io/projected/11fa943d-490d-47c8-b14d-c250dca5c388-kube-api-access-f5dgk\") pod \"nova-metadata-0\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " pod="openstack/nova-metadata-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.421933 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.429593 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65a98bfe-6430-4b2c-9cc4-4287439401b5" (UID: "65a98bfe-6430-4b2c-9cc4-4287439401b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.447470 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-config-data" (OuterVolumeSpecName: "config-data") pod "65a98bfe-6430-4b2c-9cc4-4287439401b5" (UID: "65a98bfe-6430-4b2c-9cc4-4287439401b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.447647 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.486857 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.486890 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.486899 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc7vt\" (UniqueName: \"kubernetes.io/projected/65a98bfe-6430-4b2c-9cc4-4287439401b5-kube-api-access-mc7vt\") on node \"crc\" DevicePath \"\""
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.486909 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a98bfe-6430-4b2c-9cc4-4287439401b5-config-data\") on node \"crc\" DevicePath \"\""
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.918653 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 17 00:46:52 crc kubenswrapper[4755]: W0317 00:46:52.921086 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d88d97d_8539_47c3_a55f_deb71af2acbc.slice/crio-3f0a3e9524103dfe0f04b0524ad32d44ffff7040529775fca8e0f3668f959268 WatchSource:0}: Error finding container 3f0a3e9524103dfe0f04b0524ad32d44ffff7040529775fca8e0f3668f959268: Status 404 returned error can't find the container with id 3f0a3e9524103dfe0f04b0524ad32d44ffff7040529775fca8e0f3668f959268
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.987061 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-48hw5"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.989365 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-48hw5" event={"ID":"65a98bfe-6430-4b2c-9cc4-4287439401b5","Type":"ContainerDied","Data":"ee44872f74ffa61ff73ded04ba5640a18fb1887a39c035e8bc63df0b819c7ea9"}
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.989466 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee44872f74ffa61ff73ded04ba5640a18fb1887a39c035e8bc63df0b819c7ea9"
Mar 17 00:46:52 crc kubenswrapper[4755]: I0317 00:46:52.995165 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9d88d97d-8539-47c3-a55f-deb71af2acbc","Type":"ContainerStarted","Data":"3f0a3e9524103dfe0f04b0524ad32d44ffff7040529775fca8e0f3668f959268"}
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.036050 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 17 00:46:53 crc kubenswrapper[4755]: E0317 00:46:53.037212 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a98bfe-6430-4b2c-9cc4-4287439401b5" containerName="nova-cell1-conductor-db-sync"
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.037233 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a98bfe-6430-4b2c-9cc4-4287439401b5" containerName="nova-cell1-conductor-db-sync"
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.037427 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a98bfe-6430-4b2c-9cc4-4287439401b5" containerName="nova-cell1-conductor-db-sync"
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.038840 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.045971 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.066195 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.097929 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rskp\" (UniqueName: \"kubernetes.io/projected/e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8-kube-api-access-9rskp\") pod \"nova-cell1-conductor-0\" (UID: \"e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8\") " pod="openstack/nova-cell1-conductor-0"
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.097993 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8\") " pod="openstack/nova-cell1-conductor-0"
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.098093 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8\") " pod="openstack/nova-cell1-conductor-0"
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.100456 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.199829 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rskp\" (UniqueName: \"kubernetes.io/projected/e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8-kube-api-access-9rskp\") pod \"nova-cell1-conductor-0\" (UID: \"e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8\") " pod="openstack/nova-cell1-conductor-0"
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.200150 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8\") " pod="openstack/nova-cell1-conductor-0"
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.200874 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8\") " pod="openstack/nova-cell1-conductor-0"
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.206462 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8\") " pod="openstack/nova-cell1-conductor-0"
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.206710 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8\") " pod="openstack/nova-cell1-conductor-0"
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.216123 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rskp\" (UniqueName: \"kubernetes.io/projected/e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8-kube-api-access-9rskp\") pod \"nova-cell1-conductor-0\" (UID: \"e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8\") " pod="openstack/nova-cell1-conductor-0"
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.429882 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.581061 4755 scope.go:117] "RemoveContainer" containerID="4fd19af9daad9393a95e7201d9b24a025c08fa86d11bf77a7e9605b0cb2566a7"
Mar 17 00:46:53 crc kubenswrapper[4755]: I0317 00:46:53.971523 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 17 00:46:54 crc kubenswrapper[4755]: I0317 00:46:54.022028 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8","Type":"ContainerStarted","Data":"565816f1c71f6c4ad4f2a557d9a228b0c1c3140e9acf2830b7475c92f52227d7"}
Mar 17 00:46:54 crc kubenswrapper[4755]: I0317 00:46:54.024356 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9d88d97d-8539-47c3-a55f-deb71af2acbc","Type":"ContainerStarted","Data":"f3d6a392fb267cafd5af728538159d874d2301c62ecd7f1c80dcfc9de9eab371"}
Mar 17 00:46:54 crc kubenswrapper[4755]: I0317 00:46:54.029195 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11fa943d-490d-47c8-b14d-c250dca5c388","Type":"ContainerStarted","Data":"e5d26b5554756150e39f0def907dffd1f964d42e31c0a7c82c5e70fca120a939"}
Mar 17 00:46:54 crc kubenswrapper[4755]: I0317 00:46:54.029234 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11fa943d-490d-47c8-b14d-c250dca5c388","Type":"ContainerStarted","Data":"7f735076080a40139ff39b79a7ff6674f810f2aa6e25a9afa9972ce7b18f9525"}
Mar 17 00:46:54 crc kubenswrapper[4755]: I0317 00:46:54.029245 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11fa943d-490d-47c8-b14d-c250dca5c388","Type":"ContainerStarted","Data":"a7b0f80a000eb9188ffedf9f759c2181585bed93dcf9cee8053fa8a052462552"}
Mar 17 00:46:54 crc kubenswrapper[4755]: I0317 00:46:54.048389 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.048368566 podStartE2EDuration="2.048368566s" podCreationTimestamp="2026-03-17 00:46:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:46:54.038542003 +0000 UTC m=+1488.797994296" watchObservedRunningTime="2026-03-17 00:46:54.048368566 +0000 UTC m=+1488.807820849"
Mar 17 00:46:54 crc kubenswrapper[4755]: I0317 00:46:54.067855 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.067833215 podStartE2EDuration="2.067833215s" podCreationTimestamp="2026-03-17 00:46:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:46:54.05866313 +0000 UTC m=+1488.818115413" watchObservedRunningTime="2026-03-17 00:46:54.067833215 +0000 UTC m=+1488.827285518"
Mar 17 00:46:54 crc kubenswrapper[4755]: I0317 00:46:54.267481 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-f6bc4c6c9-xbllf" podUID="d5f0edaa-22b4-4862-b0f1-c6dfef316566" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.203:5353: i/o timeout"
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.032633 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.038626 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8","Type":"ContainerStarted","Data":"d3968af29d0fff300d0b84407906464b2d60bbf9ebc3037da7ca10a2a4a530d5"}
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.038722 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.041089 4755 generic.go:334] "Generic (PLEG): container finished" podID="ad2f6737-e26f-401b-b6b0-1b6b37d3ee04" containerID="29d5d885d3f7233cf9b756798126b975090e2702869f6c10f23a74c20c535e2c" exitCode=0
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.041145 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04","Type":"ContainerDied","Data":"29d5d885d3f7233cf9b756798126b975090e2702869f6c10f23a74c20c535e2c"}
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.041179 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04","Type":"ContainerDied","Data":"68da523a083b9401b4cb26354d244d3059e50f081bcb0ddf8f79c76453b6855b"}
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.041203 4755 scope.go:117] "RemoveContainer" containerID="29d5d885d3f7233cf9b756798126b975090e2702869f6c10f23a74c20c535e2c"
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.041345 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.063498 4755 scope.go:117] "RemoveContainer" containerID="4d07707f23c7f4f178eba4c56adbb239fa98e5c55483b2f90f324555e890cc8e"
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.068333 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.068316648 podStartE2EDuration="2.068316648s" podCreationTimestamp="2026-03-17 00:46:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:46:55.066396537 +0000 UTC m=+1489.825848830" watchObservedRunningTime="2026-03-17 00:46:55.068316648 +0000 UTC m=+1489.827768931"
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.102923 4755 scope.go:117] "RemoveContainer" containerID="29d5d885d3f7233cf9b756798126b975090e2702869f6c10f23a74c20c535e2c"
Mar 17 00:46:55 crc kubenswrapper[4755]: E0317 00:46:55.103633 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29d5d885d3f7233cf9b756798126b975090e2702869f6c10f23a74c20c535e2c\": container with ID starting with 29d5d885d3f7233cf9b756798126b975090e2702869f6c10f23a74c20c535e2c not found: ID does not exist" containerID="29d5d885d3f7233cf9b756798126b975090e2702869f6c10f23a74c20c535e2c"
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.103678 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29d5d885d3f7233cf9b756798126b975090e2702869f6c10f23a74c20c535e2c"} err="failed to get container status \"29d5d885d3f7233cf9b756798126b975090e2702869f6c10f23a74c20c535e2c\": rpc error: code = NotFound desc = could not find container \"29d5d885d3f7233cf9b756798126b975090e2702869f6c10f23a74c20c535e2c\": container with ID starting with 29d5d885d3f7233cf9b756798126b975090e2702869f6c10f23a74c20c535e2c not found: ID does not exist"
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.103703 4755 scope.go:117] "RemoveContainer" containerID="4d07707f23c7f4f178eba4c56adbb239fa98e5c55483b2f90f324555e890cc8e"
Mar 17 00:46:55 crc kubenswrapper[4755]: E0317 00:46:55.104058 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d07707f23c7f4f178eba4c56adbb239fa98e5c55483b2f90f324555e890cc8e\": container with ID starting with 4d07707f23c7f4f178eba4c56adbb239fa98e5c55483b2f90f324555e890cc8e not found: ID does not exist" containerID="4d07707f23c7f4f178eba4c56adbb239fa98e5c55483b2f90f324555e890cc8e"
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.104094 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d07707f23c7f4f178eba4c56adbb239fa98e5c55483b2f90f324555e890cc8e"} err="failed to get container status \"4d07707f23c7f4f178eba4c56adbb239fa98e5c55483b2f90f324555e890cc8e\": rpc error: code = NotFound desc = could not find container \"4d07707f23c7f4f178eba4c56adbb239fa98e5c55483b2f90f324555e890cc8e\": container with ID starting with 4d07707f23c7f4f178eba4c56adbb239fa98e5c55483b2f90f324555e890cc8e not found: ID does not exist"
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.148962 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-logs\") pod \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\" (UID: \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\") "
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.149109 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlf4g\" (UniqueName: \"kubernetes.io/projected/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-kube-api-access-rlf4g\") pod \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\" (UID: \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\") "
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.149236 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-combined-ca-bundle\") pod \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\" (UID: \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\") "
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.149338 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-config-data\") pod \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\" (UID: \"ad2f6737-e26f-401b-b6b0-1b6b37d3ee04\") "
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.150292 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-logs" (OuterVolumeSpecName: "logs") pod "ad2f6737-e26f-401b-b6b0-1b6b37d3ee04" (UID: "ad2f6737-e26f-401b-b6b0-1b6b37d3ee04"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.181758 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-kube-api-access-rlf4g" (OuterVolumeSpecName: "kube-api-access-rlf4g") pod "ad2f6737-e26f-401b-b6b0-1b6b37d3ee04" (UID: "ad2f6737-e26f-401b-b6b0-1b6b37d3ee04"). InnerVolumeSpecName "kube-api-access-rlf4g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.190545 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-config-data" (OuterVolumeSpecName: "config-data") pod "ad2f6737-e26f-401b-b6b0-1b6b37d3ee04" (UID: "ad2f6737-e26f-401b-b6b0-1b6b37d3ee04"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.213839 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad2f6737-e26f-401b-b6b0-1b6b37d3ee04" (UID: "ad2f6737-e26f-401b-b6b0-1b6b37d3ee04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.252483 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-config-data\") on node \"crc\" DevicePath \"\""
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.252513 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-logs\") on node \"crc\" DevicePath \"\""
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.252525 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlf4g\" (UniqueName: \"kubernetes.io/projected/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-kube-api-access-rlf4g\") on node \"crc\" DevicePath \"\""
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.252537 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.375068 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.398648 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.417017 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 17 00:46:55 crc kubenswrapper[4755]: E0317 00:46:55.417735 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2f6737-e26f-401b-b6b0-1b6b37d3ee04" containerName="nova-api-log" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.417767 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2f6737-e26f-401b-b6b0-1b6b37d3ee04" containerName="nova-api-log" Mar 17 00:46:55 crc kubenswrapper[4755]: E0317 00:46:55.417794 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2f6737-e26f-401b-b6b0-1b6b37d3ee04" containerName="nova-api-api" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.417807 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2f6737-e26f-401b-b6b0-1b6b37d3ee04" containerName="nova-api-api" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.418148 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad2f6737-e26f-401b-b6b0-1b6b37d3ee04" containerName="nova-api-api" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.418207 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad2f6737-e26f-401b-b6b0-1b6b37d3ee04" containerName="nova-api-log" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.419939 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.423231 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.428899 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.557877 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401f4ae0-abf1-4303-8f02-965826666105-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"401f4ae0-abf1-4303-8f02-965826666105\") " pod="openstack/nova-api-0" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.557943 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401f4ae0-abf1-4303-8f02-965826666105-config-data\") pod \"nova-api-0\" (UID: \"401f4ae0-abf1-4303-8f02-965826666105\") " pod="openstack/nova-api-0" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.558022 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/401f4ae0-abf1-4303-8f02-965826666105-logs\") pod \"nova-api-0\" (UID: \"401f4ae0-abf1-4303-8f02-965826666105\") " pod="openstack/nova-api-0" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.558117 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62pfb\" (UniqueName: \"kubernetes.io/projected/401f4ae0-abf1-4303-8f02-965826666105-kube-api-access-62pfb\") pod \"nova-api-0\" (UID: \"401f4ae0-abf1-4303-8f02-965826666105\") " pod="openstack/nova-api-0" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.652111 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-b6rgr"] Mar 
17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.653356 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-b6rgr" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.660168 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f389a9bf-dbdd-4a73-ab7c-dc25609792a2-operator-scripts\") pod \"aodh-db-create-b6rgr\" (UID: \"f389a9bf-dbdd-4a73-ab7c-dc25609792a2\") " pod="openstack/aodh-db-create-b6rgr" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.660400 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401f4ae0-abf1-4303-8f02-965826666105-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"401f4ae0-abf1-4303-8f02-965826666105\") " pod="openstack/nova-api-0" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.660514 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401f4ae0-abf1-4303-8f02-965826666105-config-data\") pod \"nova-api-0\" (UID: \"401f4ae0-abf1-4303-8f02-965826666105\") " pod="openstack/nova-api-0" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.660602 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/401f4ae0-abf1-4303-8f02-965826666105-logs\") pod \"nova-api-0\" (UID: \"401f4ae0-abf1-4303-8f02-965826666105\") " pod="openstack/nova-api-0" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.660687 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wv2v\" (UniqueName: \"kubernetes.io/projected/f389a9bf-dbdd-4a73-ab7c-dc25609792a2-kube-api-access-2wv2v\") pod \"aodh-db-create-b6rgr\" (UID: \"f389a9bf-dbdd-4a73-ab7c-dc25609792a2\") " 
pod="openstack/aodh-db-create-b6rgr" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.660805 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62pfb\" (UniqueName: \"kubernetes.io/projected/401f4ae0-abf1-4303-8f02-965826666105-kube-api-access-62pfb\") pod \"nova-api-0\" (UID: \"401f4ae0-abf1-4303-8f02-965826666105\") " pod="openstack/nova-api-0" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.661117 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/401f4ae0-abf1-4303-8f02-965826666105-logs\") pod \"nova-api-0\" (UID: \"401f4ae0-abf1-4303-8f02-965826666105\") " pod="openstack/nova-api-0" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.667859 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401f4ae0-abf1-4303-8f02-965826666105-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"401f4ae0-abf1-4303-8f02-965826666105\") " pod="openstack/nova-api-0" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.668166 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401f4ae0-abf1-4303-8f02-965826666105-config-data\") pod \"nova-api-0\" (UID: \"401f4ae0-abf1-4303-8f02-965826666105\") " pod="openstack/nova-api-0" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.677389 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-bfd0-account-create-update-85zvr"] Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.678649 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-bfd0-account-create-update-85zvr" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.681040 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.685778 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-b6rgr"] Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.697004 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62pfb\" (UniqueName: \"kubernetes.io/projected/401f4ae0-abf1-4303-8f02-965826666105-kube-api-access-62pfb\") pod \"nova-api-0\" (UID: \"401f4ae0-abf1-4303-8f02-965826666105\") " pod="openstack/nova-api-0" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.700052 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-bfd0-account-create-update-85zvr"] Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.762593 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f389a9bf-dbdd-4a73-ab7c-dc25609792a2-operator-scripts\") pod \"aodh-db-create-b6rgr\" (UID: \"f389a9bf-dbdd-4a73-ab7c-dc25609792a2\") " pod="openstack/aodh-db-create-b6rgr" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.762755 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wv2v\" (UniqueName: \"kubernetes.io/projected/f389a9bf-dbdd-4a73-ab7c-dc25609792a2-kube-api-access-2wv2v\") pod \"aodh-db-create-b6rgr\" (UID: \"f389a9bf-dbdd-4a73-ab7c-dc25609792a2\") " pod="openstack/aodh-db-create-b6rgr" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.762842 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecff6c47-8752-4ea4-9f9e-a6c4c2723181-operator-scripts\") pod 
\"aodh-bfd0-account-create-update-85zvr\" (UID: \"ecff6c47-8752-4ea4-9f9e-a6c4c2723181\") " pod="openstack/aodh-bfd0-account-create-update-85zvr" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.762886 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vls5n\" (UniqueName: \"kubernetes.io/projected/ecff6c47-8752-4ea4-9f9e-a6c4c2723181-kube-api-access-vls5n\") pod \"aodh-bfd0-account-create-update-85zvr\" (UID: \"ecff6c47-8752-4ea4-9f9e-a6c4c2723181\") " pod="openstack/aodh-bfd0-account-create-update-85zvr" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.763656 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f389a9bf-dbdd-4a73-ab7c-dc25609792a2-operator-scripts\") pod \"aodh-db-create-b6rgr\" (UID: \"f389a9bf-dbdd-4a73-ab7c-dc25609792a2\") " pod="openstack/aodh-db-create-b6rgr" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.764330 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.783271 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wv2v\" (UniqueName: \"kubernetes.io/projected/f389a9bf-dbdd-4a73-ab7c-dc25609792a2-kube-api-access-2wv2v\") pod \"aodh-db-create-b6rgr\" (UID: \"f389a9bf-dbdd-4a73-ab7c-dc25609792a2\") " pod="openstack/aodh-db-create-b6rgr" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.866244 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecff6c47-8752-4ea4-9f9e-a6c4c2723181-operator-scripts\") pod \"aodh-bfd0-account-create-update-85zvr\" (UID: \"ecff6c47-8752-4ea4-9f9e-a6c4c2723181\") " pod="openstack/aodh-bfd0-account-create-update-85zvr" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.866327 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vls5n\" (UniqueName: \"kubernetes.io/projected/ecff6c47-8752-4ea4-9f9e-a6c4c2723181-kube-api-access-vls5n\") pod \"aodh-bfd0-account-create-update-85zvr\" (UID: \"ecff6c47-8752-4ea4-9f9e-a6c4c2723181\") " pod="openstack/aodh-bfd0-account-create-update-85zvr" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.867752 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecff6c47-8752-4ea4-9f9e-a6c4c2723181-operator-scripts\") pod \"aodh-bfd0-account-create-update-85zvr\" (UID: \"ecff6c47-8752-4ea4-9f9e-a6c4c2723181\") " pod="openstack/aodh-bfd0-account-create-update-85zvr" Mar 17 00:46:55 crc kubenswrapper[4755]: I0317 00:46:55.883125 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vls5n\" (UniqueName: \"kubernetes.io/projected/ecff6c47-8752-4ea4-9f9e-a6c4c2723181-kube-api-access-vls5n\") pod \"aodh-bfd0-account-create-update-85zvr\" (UID: 
\"ecff6c47-8752-4ea4-9f9e-a6c4c2723181\") " pod="openstack/aodh-bfd0-account-create-update-85zvr" Mar 17 00:46:56 crc kubenswrapper[4755]: I0317 00:46:56.063026 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-b6rgr" Mar 17 00:46:56 crc kubenswrapper[4755]: I0317 00:46:56.070565 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-bfd0-account-create-update-85zvr" Mar 17 00:46:56 crc kubenswrapper[4755]: I0317 00:46:56.227421 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 00:46:56 crc kubenswrapper[4755]: I0317 00:46:56.269034 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad2f6737-e26f-401b-b6b0-1b6b37d3ee04" path="/var/lib/kubelet/pods/ad2f6737-e26f-401b-b6b0-1b6b37d3ee04/volumes" Mar 17 00:46:56 crc kubenswrapper[4755]: I0317 00:46:56.593283 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-b6rgr"] Mar 17 00:46:56 crc kubenswrapper[4755]: W0317 00:46:56.662778 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecff6c47_8752_4ea4_9f9e_a6c4c2723181.slice/crio-f5bbdc164b42b1fb7eab6950f8b25d28d2d1858f18819f33d007bdeb809355ab WatchSource:0}: Error finding container f5bbdc164b42b1fb7eab6950f8b25d28d2d1858f18819f33d007bdeb809355ab: Status 404 returned error can't find the container with id f5bbdc164b42b1fb7eab6950f8b25d28d2d1858f18819f33d007bdeb809355ab Mar 17 00:46:56 crc kubenswrapper[4755]: I0317 00:46:56.673679 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-bfd0-account-create-update-85zvr"] Mar 17 00:46:57 crc kubenswrapper[4755]: I0317 00:46:57.073430 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"401f4ae0-abf1-4303-8f02-965826666105","Type":"ContainerStarted","Data":"c2711f1830278f18151dcf972b04eff00d4147e39fb8fc88ab04045e1b4f3e53"} Mar 17 00:46:57 crc kubenswrapper[4755]: I0317 00:46:57.073492 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"401f4ae0-abf1-4303-8f02-965826666105","Type":"ContainerStarted","Data":"6201013dccb2192de443a73d8b09da2cee86976b26dbd98901fbab2bf95b5729"} Mar 17 00:46:57 crc kubenswrapper[4755]: I0317 00:46:57.073502 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"401f4ae0-abf1-4303-8f02-965826666105","Type":"ContainerStarted","Data":"3706303bf15a17713ad17bdc14b6979b0cd5cf94fd57609644599cec8e332856"} Mar 17 00:46:57 crc kubenswrapper[4755]: I0317 00:46:57.078813 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-bfd0-account-create-update-85zvr" event={"ID":"ecff6c47-8752-4ea4-9f9e-a6c4c2723181","Type":"ContainerStarted","Data":"f710b331bc4538c477848fbbdaf31b75703cf2cc9bea9ddd89c8af91ca14b0f1"} Mar 17 00:46:57 crc kubenswrapper[4755]: I0317 00:46:57.078877 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-bfd0-account-create-update-85zvr" event={"ID":"ecff6c47-8752-4ea4-9f9e-a6c4c2723181","Type":"ContainerStarted","Data":"f5bbdc164b42b1fb7eab6950f8b25d28d2d1858f18819f33d007bdeb809355ab"} Mar 17 00:46:57 crc kubenswrapper[4755]: I0317 00:46:57.081701 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-b6rgr" event={"ID":"f389a9bf-dbdd-4a73-ab7c-dc25609792a2","Type":"ContainerStarted","Data":"6fbab48bac90b935818c345d316cbcf4b4684392c5d17f98a0c16de25a6b47fb"} Mar 17 00:46:57 crc kubenswrapper[4755]: I0317 00:46:57.081731 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-b6rgr" 
event={"ID":"f389a9bf-dbdd-4a73-ab7c-dc25609792a2","Type":"ContainerStarted","Data":"80fa90f48f9c0092be2c2cff2e97049f76fc9452343fe26ccacc954373ad33be"} Mar 17 00:46:57 crc kubenswrapper[4755]: I0317 00:46:57.115873 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-bfd0-account-create-update-85zvr" podStartSLOduration=2.115857087 podStartE2EDuration="2.115857087s" podCreationTimestamp="2026-03-17 00:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:46:57.115541559 +0000 UTC m=+1491.874993892" watchObservedRunningTime="2026-03-17 00:46:57.115857087 +0000 UTC m=+1491.875309370" Mar 17 00:46:57 crc kubenswrapper[4755]: I0317 00:46:57.130968 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.130946451 podStartE2EDuration="2.130946451s" podCreationTimestamp="2026-03-17 00:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:46:57.099798808 +0000 UTC m=+1491.859251141" watchObservedRunningTime="2026-03-17 00:46:57.130946451 +0000 UTC m=+1491.890398754" Mar 17 00:46:57 crc kubenswrapper[4755]: I0317 00:46:57.422722 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 17 00:46:58 crc kubenswrapper[4755]: I0317 00:46:58.099597 4755 generic.go:334] "Generic (PLEG): container finished" podID="ecff6c47-8752-4ea4-9f9e-a6c4c2723181" containerID="f710b331bc4538c477848fbbdaf31b75703cf2cc9bea9ddd89c8af91ca14b0f1" exitCode=0 Mar 17 00:46:58 crc kubenswrapper[4755]: I0317 00:46:58.099730 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-bfd0-account-create-update-85zvr" 
event={"ID":"ecff6c47-8752-4ea4-9f9e-a6c4c2723181","Type":"ContainerDied","Data":"f710b331bc4538c477848fbbdaf31b75703cf2cc9bea9ddd89c8af91ca14b0f1"} Mar 17 00:46:58 crc kubenswrapper[4755]: I0317 00:46:58.108569 4755 generic.go:334] "Generic (PLEG): container finished" podID="f389a9bf-dbdd-4a73-ab7c-dc25609792a2" containerID="6fbab48bac90b935818c345d316cbcf4b4684392c5d17f98a0c16de25a6b47fb" exitCode=0 Mar 17 00:46:58 crc kubenswrapper[4755]: I0317 00:46:58.108617 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-b6rgr" event={"ID":"f389a9bf-dbdd-4a73-ab7c-dc25609792a2","Type":"ContainerDied","Data":"6fbab48bac90b935818c345d316cbcf4b4684392c5d17f98a0c16de25a6b47fb"} Mar 17 00:46:58 crc kubenswrapper[4755]: I0317 00:46:58.639350 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-b6rgr" Mar 17 00:46:58 crc kubenswrapper[4755]: I0317 00:46:58.744668 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f389a9bf-dbdd-4a73-ab7c-dc25609792a2-operator-scripts\") pod \"f389a9bf-dbdd-4a73-ab7c-dc25609792a2\" (UID: \"f389a9bf-dbdd-4a73-ab7c-dc25609792a2\") " Mar 17 00:46:58 crc kubenswrapper[4755]: I0317 00:46:58.744747 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wv2v\" (UniqueName: \"kubernetes.io/projected/f389a9bf-dbdd-4a73-ab7c-dc25609792a2-kube-api-access-2wv2v\") pod \"f389a9bf-dbdd-4a73-ab7c-dc25609792a2\" (UID: \"f389a9bf-dbdd-4a73-ab7c-dc25609792a2\") " Mar 17 00:46:58 crc kubenswrapper[4755]: I0317 00:46:58.745542 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f389a9bf-dbdd-4a73-ab7c-dc25609792a2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f389a9bf-dbdd-4a73-ab7c-dc25609792a2" (UID: "f389a9bf-dbdd-4a73-ab7c-dc25609792a2"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:46:58 crc kubenswrapper[4755]: I0317 00:46:58.752697 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f389a9bf-dbdd-4a73-ab7c-dc25609792a2-kube-api-access-2wv2v" (OuterVolumeSpecName: "kube-api-access-2wv2v") pod "f389a9bf-dbdd-4a73-ab7c-dc25609792a2" (UID: "f389a9bf-dbdd-4a73-ab7c-dc25609792a2"). InnerVolumeSpecName "kube-api-access-2wv2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:46:58 crc kubenswrapper[4755]: I0317 00:46:58.847343 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f389a9bf-dbdd-4a73-ab7c-dc25609792a2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:58 crc kubenswrapper[4755]: I0317 00:46:58.847407 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wv2v\" (UniqueName: \"kubernetes.io/projected/f389a9bf-dbdd-4a73-ab7c-dc25609792a2-kube-api-access-2wv2v\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:59 crc kubenswrapper[4755]: I0317 00:46:59.124294 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-b6rgr" Mar 17 00:46:59 crc kubenswrapper[4755]: I0317 00:46:59.124297 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-b6rgr" event={"ID":"f389a9bf-dbdd-4a73-ab7c-dc25609792a2","Type":"ContainerDied","Data":"80fa90f48f9c0092be2c2cff2e97049f76fc9452343fe26ccacc954373ad33be"} Mar 17 00:46:59 crc kubenswrapper[4755]: I0317 00:46:59.124365 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80fa90f48f9c0092be2c2cff2e97049f76fc9452343fe26ccacc954373ad33be" Mar 17 00:46:59 crc kubenswrapper[4755]: I0317 00:46:59.571949 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-bfd0-account-create-update-85zvr" Mar 17 00:46:59 crc kubenswrapper[4755]: I0317 00:46:59.667696 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vls5n\" (UniqueName: \"kubernetes.io/projected/ecff6c47-8752-4ea4-9f9e-a6c4c2723181-kube-api-access-vls5n\") pod \"ecff6c47-8752-4ea4-9f9e-a6c4c2723181\" (UID: \"ecff6c47-8752-4ea4-9f9e-a6c4c2723181\") " Mar 17 00:46:59 crc kubenswrapper[4755]: I0317 00:46:59.667767 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecff6c47-8752-4ea4-9f9e-a6c4c2723181-operator-scripts\") pod \"ecff6c47-8752-4ea4-9f9e-a6c4c2723181\" (UID: \"ecff6c47-8752-4ea4-9f9e-a6c4c2723181\") " Mar 17 00:46:59 crc kubenswrapper[4755]: I0317 00:46:59.668711 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecff6c47-8752-4ea4-9f9e-a6c4c2723181-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ecff6c47-8752-4ea4-9f9e-a6c4c2723181" (UID: "ecff6c47-8752-4ea4-9f9e-a6c4c2723181"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:46:59 crc kubenswrapper[4755]: I0317 00:46:59.670549 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecff6c47-8752-4ea4-9f9e-a6c4c2723181-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:46:59 crc kubenswrapper[4755]: I0317 00:46:59.672956 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecff6c47-8752-4ea4-9f9e-a6c4c2723181-kube-api-access-vls5n" (OuterVolumeSpecName: "kube-api-access-vls5n") pod "ecff6c47-8752-4ea4-9f9e-a6c4c2723181" (UID: "ecff6c47-8752-4ea4-9f9e-a6c4c2723181"). InnerVolumeSpecName "kube-api-access-vls5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:46:59 crc kubenswrapper[4755]: I0317 00:46:59.772957 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vls5n\" (UniqueName: \"kubernetes.io/projected/ecff6c47-8752-4ea4-9f9e-a6c4c2723181-kube-api-access-vls5n\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:00 crc kubenswrapper[4755]: I0317 00:47:00.144985 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-bfd0-account-create-update-85zvr" event={"ID":"ecff6c47-8752-4ea4-9f9e-a6c4c2723181","Type":"ContainerDied","Data":"f5bbdc164b42b1fb7eab6950f8b25d28d2d1858f18819f33d007bdeb809355ab"} Mar 17 00:47:00 crc kubenswrapper[4755]: I0317 00:47:00.145120 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5bbdc164b42b1fb7eab6950f8b25d28d2d1858f18819f33d007bdeb809355ab" Mar 17 00:47:00 crc kubenswrapper[4755]: I0317 00:47:00.145139 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-bfd0-account-create-update-85zvr" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.022249 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-9fw58"] Mar 17 00:47:01 crc kubenswrapper[4755]: E0317 00:47:01.023214 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecff6c47-8752-4ea4-9f9e-a6c4c2723181" containerName="mariadb-account-create-update" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.023306 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecff6c47-8752-4ea4-9f9e-a6c4c2723181" containerName="mariadb-account-create-update" Mar 17 00:47:01 crc kubenswrapper[4755]: E0317 00:47:01.023409 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f389a9bf-dbdd-4a73-ab7c-dc25609792a2" containerName="mariadb-database-create" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.023427 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f389a9bf-dbdd-4a73-ab7c-dc25609792a2" containerName="mariadb-database-create" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.023920 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecff6c47-8752-4ea4-9f9e-a6c4c2723181" containerName="mariadb-account-create-update" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.023990 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f389a9bf-dbdd-4a73-ab7c-dc25609792a2" containerName="mariadb-database-create" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.025881 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-9fw58" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.038749 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-9fw58"] Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.042196 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.042609 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.042196 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-5g9jb" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.043005 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.212068 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-scripts\") pod \"aodh-db-sync-9fw58\" (UID: \"da1f9758-4126-425e-863f-23dbf247cc32\") " pod="openstack/aodh-db-sync-9fw58" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.212528 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kn7z\" (UniqueName: \"kubernetes.io/projected/da1f9758-4126-425e-863f-23dbf247cc32-kube-api-access-2kn7z\") pod \"aodh-db-sync-9fw58\" (UID: \"da1f9758-4126-425e-863f-23dbf247cc32\") " pod="openstack/aodh-db-sync-9fw58" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.212641 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-config-data\") pod \"aodh-db-sync-9fw58\" (UID: \"da1f9758-4126-425e-863f-23dbf247cc32\") " pod="openstack/aodh-db-sync-9fw58" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.212739 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-combined-ca-bundle\") pod \"aodh-db-sync-9fw58\" (UID: \"da1f9758-4126-425e-863f-23dbf247cc32\") " pod="openstack/aodh-db-sync-9fw58" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.314733 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-scripts\") pod \"aodh-db-sync-9fw58\" (UID: \"da1f9758-4126-425e-863f-23dbf247cc32\") " pod="openstack/aodh-db-sync-9fw58" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.314869 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kn7z\" (UniqueName: \"kubernetes.io/projected/da1f9758-4126-425e-863f-23dbf247cc32-kube-api-access-2kn7z\") pod \"aodh-db-sync-9fw58\" (UID: \"da1f9758-4126-425e-863f-23dbf247cc32\") " pod="openstack/aodh-db-sync-9fw58" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.314963 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-config-data\") pod \"aodh-db-sync-9fw58\" (UID: \"da1f9758-4126-425e-863f-23dbf247cc32\") " pod="openstack/aodh-db-sync-9fw58" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.315018 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-combined-ca-bundle\") pod \"aodh-db-sync-9fw58\" (UID: \"da1f9758-4126-425e-863f-23dbf247cc32\") " pod="openstack/aodh-db-sync-9fw58" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.319158 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-combined-ca-bundle\") pod \"aodh-db-sync-9fw58\" (UID: \"da1f9758-4126-425e-863f-23dbf247cc32\") " pod="openstack/aodh-db-sync-9fw58" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.319675 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-config-data\") pod \"aodh-db-sync-9fw58\" (UID: \"da1f9758-4126-425e-863f-23dbf247cc32\") " pod="openstack/aodh-db-sync-9fw58" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.321294 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-scripts\") pod \"aodh-db-sync-9fw58\" (UID: \"da1f9758-4126-425e-863f-23dbf247cc32\") " pod="openstack/aodh-db-sync-9fw58" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.334086 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kn7z\" (UniqueName: \"kubernetes.io/projected/da1f9758-4126-425e-863f-23dbf247cc32-kube-api-access-2kn7z\") pod \"aodh-db-sync-9fw58\" (UID: \"da1f9758-4126-425e-863f-23dbf247cc32\") " pod="openstack/aodh-db-sync-9fw58" 
Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.363311 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-9fw58" Mar 17 00:47:01 crc kubenswrapper[4755]: I0317 00:47:01.825547 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-9fw58"] Mar 17 00:47:02 crc kubenswrapper[4755]: I0317 00:47:02.167572 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-9fw58" event={"ID":"da1f9758-4126-425e-863f-23dbf247cc32","Type":"ContainerStarted","Data":"74b68f0118132499b59d7774037882a33b822737e72b9be477ad6dbe726ce539"} Mar 17 00:47:02 crc kubenswrapper[4755]: I0317 00:47:02.422267 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 17 00:47:02 crc kubenswrapper[4755]: I0317 00:47:02.449034 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 17 00:47:02 crc kubenswrapper[4755]: I0317 00:47:02.449080 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 17 00:47:02 crc kubenswrapper[4755]: I0317 00:47:02.468289 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 17 00:47:03 crc kubenswrapper[4755]: I0317 00:47:03.211407 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 17 00:47:03 crc kubenswrapper[4755]: I0317 00:47:03.461509 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 17 00:47:03 crc kubenswrapper[4755]: I0317 00:47:03.478615 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="11fa943d-490d-47c8-b14d-c250dca5c388" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.236:8775/\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" Mar 17 00:47:03 crc kubenswrapper[4755]: I0317 00:47:03.478675 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="11fa943d-490d-47c8-b14d-c250dca5c388" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.236:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 00:47:05 crc kubenswrapper[4755]: I0317 00:47:05.326747 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 17 00:47:05 crc kubenswrapper[4755]: I0317 00:47:05.765461 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 00:47:05 crc kubenswrapper[4755]: I0317 00:47:05.765544 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 00:47:06 crc kubenswrapper[4755]: I0317 00:47:06.731887 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 17 00:47:06 crc kubenswrapper[4755]: I0317 00:47:06.847657 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="401f4ae0-abf1-4303-8f02-965826666105" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.238:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 00:47:06 crc kubenswrapper[4755]: I0317 00:47:06.847743 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="401f4ae0-abf1-4303-8f02-965826666105" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.238:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 00:47:07 crc kubenswrapper[4755]: I0317 00:47:07.223992 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-9fw58" event={"ID":"da1f9758-4126-425e-863f-23dbf247cc32","Type":"ContainerStarted","Data":"1515bcf36bcff06f3fff0db52c2d153113ff3217ef838de25508ab97cf56f9e0"} Mar 17 00:47:07 crc kubenswrapper[4755]: I0317 00:47:07.248813 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-9fw58" podStartSLOduration=2.349257514 podStartE2EDuration="7.248794954s" podCreationTimestamp="2026-03-17 00:47:00 +0000 UTC" firstStartedPulling="2026-03-17 00:47:01.829349353 +0000 UTC m=+1496.588801626" lastFinishedPulling="2026-03-17 00:47:06.728886783 +0000 UTC m=+1501.488339066" observedRunningTime="2026-03-17 00:47:07.23814568 +0000 UTC m=+1501.997598003" watchObservedRunningTime="2026-03-17 00:47:07.248794954 +0000 UTC m=+1502.008247237" Mar 17 00:47:09 crc kubenswrapper[4755]: I0317 00:47:09.255533 4755 generic.go:334] "Generic (PLEG): container finished" podID="da1f9758-4126-425e-863f-23dbf247cc32" containerID="1515bcf36bcff06f3fff0db52c2d153113ff3217ef838de25508ab97cf56f9e0" exitCode=0 Mar 17 00:47:09 crc kubenswrapper[4755]: I0317 00:47:09.255635 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-9fw58" event={"ID":"da1f9758-4126-425e-863f-23dbf247cc32","Type":"ContainerDied","Data":"1515bcf36bcff06f3fff0db52c2d153113ff3217ef838de25508ab97cf56f9e0"} Mar 17 00:47:10 crc kubenswrapper[4755]: I0317 00:47:10.448557 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 17 00:47:10 crc kubenswrapper[4755]: I0317 00:47:10.448606 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 17 00:47:10 crc kubenswrapper[4755]: I0317 00:47:10.748425 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-9fw58" Mar 17 00:47:10 crc kubenswrapper[4755]: I0317 00:47:10.853896 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kn7z\" (UniqueName: \"kubernetes.io/projected/da1f9758-4126-425e-863f-23dbf247cc32-kube-api-access-2kn7z\") pod \"da1f9758-4126-425e-863f-23dbf247cc32\" (UID: \"da1f9758-4126-425e-863f-23dbf247cc32\") " Mar 17 00:47:10 crc kubenswrapper[4755]: I0317 00:47:10.854044 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-combined-ca-bundle\") pod \"da1f9758-4126-425e-863f-23dbf247cc32\" (UID: \"da1f9758-4126-425e-863f-23dbf247cc32\") " Mar 17 00:47:10 crc kubenswrapper[4755]: I0317 00:47:10.854155 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-scripts\") pod \"da1f9758-4126-425e-863f-23dbf247cc32\" (UID: \"da1f9758-4126-425e-863f-23dbf247cc32\") " Mar 17 00:47:10 crc kubenswrapper[4755]: I0317 00:47:10.854225 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-config-data\") pod \"da1f9758-4126-425e-863f-23dbf247cc32\" (UID: \"da1f9758-4126-425e-863f-23dbf247cc32\") " Mar 17 00:47:10 crc kubenswrapper[4755]: I0317 00:47:10.863672 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-scripts" (OuterVolumeSpecName: "scripts") pod "da1f9758-4126-425e-863f-23dbf247cc32" (UID: "da1f9758-4126-425e-863f-23dbf247cc32"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:10 crc kubenswrapper[4755]: I0317 00:47:10.863681 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da1f9758-4126-425e-863f-23dbf247cc32-kube-api-access-2kn7z" (OuterVolumeSpecName: "kube-api-access-2kn7z") pod "da1f9758-4126-425e-863f-23dbf247cc32" (UID: "da1f9758-4126-425e-863f-23dbf247cc32"). InnerVolumeSpecName "kube-api-access-2kn7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:47:10 crc kubenswrapper[4755]: I0317 00:47:10.908569 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da1f9758-4126-425e-863f-23dbf247cc32" (UID: "da1f9758-4126-425e-863f-23dbf247cc32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:10 crc kubenswrapper[4755]: I0317 00:47:10.911054 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-config-data" (OuterVolumeSpecName: "config-data") pod "da1f9758-4126-425e-863f-23dbf247cc32" (UID: "da1f9758-4126-425e-863f-23dbf247cc32"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:10 crc kubenswrapper[4755]: I0317 00:47:10.956783 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:10 crc kubenswrapper[4755]: I0317 00:47:10.956824 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:10 crc kubenswrapper[4755]: I0317 00:47:10.956836 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kn7z\" (UniqueName: \"kubernetes.io/projected/da1f9758-4126-425e-863f-23dbf247cc32-kube-api-access-2kn7z\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:10 crc kubenswrapper[4755]: I0317 00:47:10.956854 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1f9758-4126-425e-863f-23dbf247cc32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:11 crc kubenswrapper[4755]: I0317 00:47:11.287171 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-9fw58" Mar 17 00:47:11 crc kubenswrapper[4755]: I0317 00:47:11.287074 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-9fw58" event={"ID":"da1f9758-4126-425e-863f-23dbf247cc32","Type":"ContainerDied","Data":"74b68f0118132499b59d7774037882a33b822737e72b9be477ad6dbe726ce539"} Mar 17 00:47:11 crc kubenswrapper[4755]: I0317 00:47:11.287377 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74b68f0118132499b59d7774037882a33b822737e72b9be477ad6dbe726ce539" Mar 17 00:47:12 crc kubenswrapper[4755]: I0317 00:47:12.460373 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 17 00:47:12 crc kubenswrapper[4755]: I0317 00:47:12.461205 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 17 00:47:12 crc kubenswrapper[4755]: I0317 00:47:12.468612 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 17 00:47:12 crc kubenswrapper[4755]: I0317 00:47:12.471938 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 17 00:47:13 crc kubenswrapper[4755]: I0317 00:47:13.765517 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 17 00:47:13 crc kubenswrapper[4755]: I0317 00:47:13.765576 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 17 00:47:14 crc kubenswrapper[4755]: E0317 00:47:14.136659 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3d9e5fe_d749_4e3f_b058_13fda9b051ef.slice/crio-d21d7b76a7dfecf9494c4daf129fe0efe105351497e68db989487e56779ff6ac.scope\": RecentStats: unable to find data in memory cache]" Mar 17 
00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.330178 4755 generic.go:334] "Generic (PLEG): container finished" podID="26745d92-a8cf-4130-bcb4-16746023aee3" containerID="c51a8662f7f952dd2bd18d497e3f4cf596be3c90d7089354b558373238542cc9" exitCode=137 Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.330301 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26745d92-a8cf-4130-bcb4-16746023aee3","Type":"ContainerDied","Data":"c51a8662f7f952dd2bd18d497e3f4cf596be3c90d7089354b558373238542cc9"} Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.330486 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26745d92-a8cf-4130-bcb4-16746023aee3","Type":"ContainerDied","Data":"47e54d167ac6bf322b395a286a5a0d6fdcce324f1cfbba446303362c2bbc60ec"} Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.330502 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47e54d167ac6bf322b395a286a5a0d6fdcce324f1cfbba446303362c2bbc60ec" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.332094 4755 generic.go:334] "Generic (PLEG): container finished" podID="e3d9e5fe-d749-4e3f-b058-13fda9b051ef" containerID="d21d7b76a7dfecf9494c4daf129fe0efe105351497e68db989487e56779ff6ac" exitCode=137 Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.332144 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e3d9e5fe-d749-4e3f-b058-13fda9b051ef","Type":"ContainerDied","Data":"d21d7b76a7dfecf9494c4daf129fe0efe105351497e68db989487e56779ff6ac"} Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.332203 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e3d9e5fe-d749-4e3f-b058-13fda9b051ef","Type":"ContainerDied","Data":"5537f214b54b7ca0de553d32a7ee067263faa9400c2e1edcb3c16da6873eacce"} Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.332230 
4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5537f214b54b7ca0de553d32a7ee067263faa9400c2e1edcb3c16da6873eacce" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.403869 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.408887 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.577048 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-config-data\") pod \"26745d92-a8cf-4130-bcb4-16746023aee3\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.577176 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-combined-ca-bundle\") pod \"e3d9e5fe-d749-4e3f-b058-13fda9b051ef\" (UID: \"e3d9e5fe-d749-4e3f-b058-13fda9b051ef\") " Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.577259 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw9d6\" (UniqueName: \"kubernetes.io/projected/26745d92-a8cf-4130-bcb4-16746023aee3-kube-api-access-bw9d6\") pod \"26745d92-a8cf-4130-bcb4-16746023aee3\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.577366 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lklvp\" (UniqueName: \"kubernetes.io/projected/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-kube-api-access-lklvp\") pod \"e3d9e5fe-d749-4e3f-b058-13fda9b051ef\" (UID: \"e3d9e5fe-d749-4e3f-b058-13fda9b051ef\") " Mar 17 00:47:14 crc kubenswrapper[4755]: 
I0317 00:47:14.577471 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26745d92-a8cf-4130-bcb4-16746023aee3-run-httpd\") pod \"26745d92-a8cf-4130-bcb4-16746023aee3\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.577570 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-sg-core-conf-yaml\") pod \"26745d92-a8cf-4130-bcb4-16746023aee3\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.578432 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26745d92-a8cf-4130-bcb4-16746023aee3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "26745d92-a8cf-4130-bcb4-16746023aee3" (UID: "26745d92-a8cf-4130-bcb4-16746023aee3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.578653 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-scripts\") pod \"26745d92-a8cf-4130-bcb4-16746023aee3\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.578712 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-combined-ca-bundle\") pod \"26745d92-a8cf-4130-bcb4-16746023aee3\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.578750 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-config-data\") pod \"e3d9e5fe-d749-4e3f-b058-13fda9b051ef\" (UID: \"e3d9e5fe-d749-4e3f-b058-13fda9b051ef\") " Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.578788 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26745d92-a8cf-4130-bcb4-16746023aee3-log-httpd\") pod \"26745d92-a8cf-4130-bcb4-16746023aee3\" (UID: \"26745d92-a8cf-4130-bcb4-16746023aee3\") " Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.579777 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26745d92-a8cf-4130-bcb4-16746023aee3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.580051 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26745d92-a8cf-4130-bcb4-16746023aee3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "26745d92-a8cf-4130-bcb4-16746023aee3" (UID: 
"26745d92-a8cf-4130-bcb4-16746023aee3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.583037 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-kube-api-access-lklvp" (OuterVolumeSpecName: "kube-api-access-lklvp") pod "e3d9e5fe-d749-4e3f-b058-13fda9b051ef" (UID: "e3d9e5fe-d749-4e3f-b058-13fda9b051ef"). InnerVolumeSpecName "kube-api-access-lklvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.583115 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26745d92-a8cf-4130-bcb4-16746023aee3-kube-api-access-bw9d6" (OuterVolumeSpecName: "kube-api-access-bw9d6") pod "26745d92-a8cf-4130-bcb4-16746023aee3" (UID: "26745d92-a8cf-4130-bcb4-16746023aee3"). InnerVolumeSpecName "kube-api-access-bw9d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.584758 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-scripts" (OuterVolumeSpecName: "scripts") pod "26745d92-a8cf-4130-bcb4-16746023aee3" (UID: "26745d92-a8cf-4130-bcb4-16746023aee3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.613292 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3d9e5fe-d749-4e3f-b058-13fda9b051ef" (UID: "e3d9e5fe-d749-4e3f-b058-13fda9b051ef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.616735 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "26745d92-a8cf-4130-bcb4-16746023aee3" (UID: "26745d92-a8cf-4130-bcb4-16746023aee3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.622227 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-config-data" (OuterVolumeSpecName: "config-data") pod "e3d9e5fe-d749-4e3f-b058-13fda9b051ef" (UID: "e3d9e5fe-d749-4e3f-b058-13fda9b051ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.681945 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.682369 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.682490 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26745d92-a8cf-4130-bcb4-16746023aee3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.682578 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:14 crc 
kubenswrapper[4755]: I0317 00:47:14.682656 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw9d6\" (UniqueName: \"kubernetes.io/projected/26745d92-a8cf-4130-bcb4-16746023aee3-kube-api-access-bw9d6\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.682731 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lklvp\" (UniqueName: \"kubernetes.io/projected/e3d9e5fe-d749-4e3f-b058-13fda9b051ef-kube-api-access-lklvp\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.682830 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.688120 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-config-data" (OuterVolumeSpecName: "config-data") pod "26745d92-a8cf-4130-bcb4-16746023aee3" (UID: "26745d92-a8cf-4130-bcb4-16746023aee3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.715087 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26745d92-a8cf-4130-bcb4-16746023aee3" (UID: "26745d92-a8cf-4130-bcb4-16746023aee3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.784714 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:14 crc kubenswrapper[4755]: I0317 00:47:14.784749 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26745d92-a8cf-4130-bcb4-16746023aee3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.344908 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.344946 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.401787 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.422211 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.435402 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.467078 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.485903 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 00:47:15 crc kubenswrapper[4755]: E0317 00:47:15.486353 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3d9e5fe-d749-4e3f-b058-13fda9b051ef" containerName="nova-cell1-novncproxy-novncproxy" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.486374 
4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3d9e5fe-d749-4e3f-b058-13fda9b051ef" containerName="nova-cell1-novncproxy-novncproxy" Mar 17 00:47:15 crc kubenswrapper[4755]: E0317 00:47:15.486393 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" containerName="ceilometer-central-agent" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.486401 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" containerName="ceilometer-central-agent" Mar 17 00:47:15 crc kubenswrapper[4755]: E0317 00:47:15.486420 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" containerName="sg-core" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.486426 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" containerName="sg-core" Mar 17 00:47:15 crc kubenswrapper[4755]: E0317 00:47:15.486457 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" containerName="proxy-httpd" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.486466 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" containerName="proxy-httpd" Mar 17 00:47:15 crc kubenswrapper[4755]: E0317 00:47:15.486488 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1f9758-4126-425e-863f-23dbf247cc32" containerName="aodh-db-sync" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.486494 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1f9758-4126-425e-863f-23dbf247cc32" containerName="aodh-db-sync" Mar 17 00:47:15 crc kubenswrapper[4755]: E0317 00:47:15.486505 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" containerName="ceilometer-notification-agent" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 
00:47:15.486511 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" containerName="ceilometer-notification-agent" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.486709 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" containerName="proxy-httpd" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.486732 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3d9e5fe-d749-4e3f-b058-13fda9b051ef" containerName="nova-cell1-novncproxy-novncproxy" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.486741 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" containerName="ceilometer-central-agent" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.486760 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" containerName="ceilometer-notification-agent" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.486776 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" containerName="sg-core" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.486791 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1f9758-4126-425e-863f-23dbf247cc32" containerName="aodh-db-sync" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.487588 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.489730 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.490139 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.490592 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.503092 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.519868 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.524009 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.526112 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.526909 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.533523 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.605404 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f6f5b8-c52e-4fa6-be5b-12510ca9348d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"56f6f5b8-c52e-4fa6-be5b-12510ca9348d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.605470 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7llg\" (UniqueName: \"kubernetes.io/projected/56f6f5b8-c52e-4fa6-be5b-12510ca9348d-kube-api-access-w7llg\") pod \"nova-cell1-novncproxy-0\" (UID: \"56f6f5b8-c52e-4fa6-be5b-12510ca9348d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.605544 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/56f6f5b8-c52e-4fa6-be5b-12510ca9348d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"56f6f5b8-c52e-4fa6-be5b-12510ca9348d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.605589 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/56f6f5b8-c52e-4fa6-be5b-12510ca9348d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"56f6f5b8-c52e-4fa6-be5b-12510ca9348d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.605679 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/56f6f5b8-c52e-4fa6-be5b-12510ca9348d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"56f6f5b8-c52e-4fa6-be5b-12510ca9348d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.637396 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.640800 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.642696 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.643156 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.643384 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-5g9jb" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.646333 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.707532 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f6f5b8-c52e-4fa6-be5b-12510ca9348d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"56f6f5b8-c52e-4fa6-be5b-12510ca9348d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.707598 
4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-scripts\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.707632 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.707904 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.708021 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/56f6f5b8-c52e-4fa6-be5b-12510ca9348d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"56f6f5b8-c52e-4fa6-be5b-12510ca9348d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.708139 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-config-data\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.708184 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a6e63acd-f16a-4542-9d06-c6b3d2baceba-run-httpd\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.708798 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p74wl\" (UniqueName: \"kubernetes.io/projected/a6e63acd-f16a-4542-9d06-c6b3d2baceba-kube-api-access-p74wl\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.708860 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f6f5b8-c52e-4fa6-be5b-12510ca9348d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"56f6f5b8-c52e-4fa6-be5b-12510ca9348d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.708937 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7llg\" (UniqueName: \"kubernetes.io/projected/56f6f5b8-c52e-4fa6-be5b-12510ca9348d-kube-api-access-w7llg\") pod \"nova-cell1-novncproxy-0\" (UID: \"56f6f5b8-c52e-4fa6-be5b-12510ca9348d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.709024 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e63acd-f16a-4542-9d06-c6b3d2baceba-log-httpd\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.709098 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/56f6f5b8-c52e-4fa6-be5b-12510ca9348d-nova-novncproxy-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"56f6f5b8-c52e-4fa6-be5b-12510ca9348d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.720466 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f6f5b8-c52e-4fa6-be5b-12510ca9348d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"56f6f5b8-c52e-4fa6-be5b-12510ca9348d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.720497 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/56f6f5b8-c52e-4fa6-be5b-12510ca9348d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"56f6f5b8-c52e-4fa6-be5b-12510ca9348d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.720527 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f6f5b8-c52e-4fa6-be5b-12510ca9348d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"56f6f5b8-c52e-4fa6-be5b-12510ca9348d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.720682 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/56f6f5b8-c52e-4fa6-be5b-12510ca9348d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"56f6f5b8-c52e-4fa6-be5b-12510ca9348d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.735544 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7llg\" (UniqueName: \"kubernetes.io/projected/56f6f5b8-c52e-4fa6-be5b-12510ca9348d-kube-api-access-w7llg\") pod \"nova-cell1-novncproxy-0\" (UID: \"56f6f5b8-c52e-4fa6-be5b-12510ca9348d\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.778845 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.785418 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.788372 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.810207 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.811841 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e63acd-f16a-4542-9d06-c6b3d2baceba-log-httpd\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.811893 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-combined-ca-bundle\") pod \"aodh-0\" (UID: \"72a40ce1-242d-4422-936b-9b867c51ee69\") " pod="openstack/aodh-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.811970 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-scripts\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.812010 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.812039 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-config-data\") pod \"aodh-0\" (UID: \"72a40ce1-242d-4422-936b-9b867c51ee69\") " pod="openstack/aodh-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.812094 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.812129 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2wdv\" (UniqueName: \"kubernetes.io/projected/72a40ce1-242d-4422-936b-9b867c51ee69-kube-api-access-j2wdv\") pod \"aodh-0\" (UID: \"72a40ce1-242d-4422-936b-9b867c51ee69\") " pod="openstack/aodh-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.812152 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-scripts\") pod \"aodh-0\" (UID: \"72a40ce1-242d-4422-936b-9b867c51ee69\") " pod="openstack/aodh-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.812183 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-config-data\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc 
kubenswrapper[4755]: I0317 00:47:15.812211 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e63acd-f16a-4542-9d06-c6b3d2baceba-run-httpd\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.812269 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p74wl\" (UniqueName: \"kubernetes.io/projected/a6e63acd-f16a-4542-9d06-c6b3d2baceba-kube-api-access-p74wl\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.813030 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e63acd-f16a-4542-9d06-c6b3d2baceba-log-httpd\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.816769 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e63acd-f16a-4542-9d06-c6b3d2baceba-run-httpd\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.821367 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.825466 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-scripts\") pod \"ceilometer-0\" (UID: 
\"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.826756 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-config-data\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.828289 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.839347 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p74wl\" (UniqueName: \"kubernetes.io/projected/a6e63acd-f16a-4542-9d06-c6b3d2baceba-kube-api-access-p74wl\") pod \"ceilometer-0\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.840266 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.914043 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-combined-ca-bundle\") pod \"aodh-0\" (UID: \"72a40ce1-242d-4422-936b-9b867c51ee69\") " pod="openstack/aodh-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.914282 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-config-data\") pod \"aodh-0\" (UID: \"72a40ce1-242d-4422-936b-9b867c51ee69\") " pod="openstack/aodh-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.914397 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2wdv\" (UniqueName: \"kubernetes.io/projected/72a40ce1-242d-4422-936b-9b867c51ee69-kube-api-access-j2wdv\") pod \"aodh-0\" (UID: \"72a40ce1-242d-4422-936b-9b867c51ee69\") " pod="openstack/aodh-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.914496 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-scripts\") pod \"aodh-0\" (UID: \"72a40ce1-242d-4422-936b-9b867c51ee69\") " pod="openstack/aodh-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.917881 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-combined-ca-bundle\") pod \"aodh-0\" (UID: \"72a40ce1-242d-4422-936b-9b867c51ee69\") " pod="openstack/aodh-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.917935 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-config-data\") pod \"aodh-0\" (UID: \"72a40ce1-242d-4422-936b-9b867c51ee69\") " pod="openstack/aodh-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.919502 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-scripts\") pod \"aodh-0\" (UID: \"72a40ce1-242d-4422-936b-9b867c51ee69\") " pod="openstack/aodh-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.932715 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2wdv\" (UniqueName: \"kubernetes.io/projected/72a40ce1-242d-4422-936b-9b867c51ee69-kube-api-access-j2wdv\") pod \"aodh-0\" (UID: \"72a40ce1-242d-4422-936b-9b867c51ee69\") " pod="openstack/aodh-0" Mar 17 00:47:15 crc kubenswrapper[4755]: I0317 00:47:15.962563 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.258856 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26745d92-a8cf-4130-bcb4-16746023aee3" path="/var/lib/kubelet/pods/26745d92-a8cf-4130-bcb4-16746023aee3/volumes" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.259835 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3d9e5fe-d749-4e3f-b058-13fda9b051ef" path="/var/lib/kubelet/pods/e3d9e5fe-d749-4e3f-b058-13fda9b051ef/volumes" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.364776 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.406504 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.514564 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-2vqzm"] Mar 17 00:47:16 crc 
kubenswrapper[4755]: I0317 00:47:16.536181 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.585069 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-2vqzm"] Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.599378 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.645602 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwkdq\" (UniqueName: \"kubernetes.io/projected/66de11c6-3dcd-45ac-adee-be59ac746a73-kube-api-access-rwkdq\") pod \"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.645862 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.645951 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-config\") pod \"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.646080 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-ovsdbserver-nb\") pod 
\"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.646173 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-dns-swift-storage-0\") pod \"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.646256 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-dns-svc\") pod \"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.649516 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.749517 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-config\") pod \"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.749605 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-ovsdbserver-nb\") pod \"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.749661 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-dns-swift-storage-0\") pod \"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.749702 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-dns-svc\") pod \"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.749740 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwkdq\" (UniqueName: \"kubernetes.io/projected/66de11c6-3dcd-45ac-adee-be59ac746a73-kube-api-access-rwkdq\") pod \"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.749809 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.750408 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-config\") pod \"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.750710 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.750955 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-dns-swift-storage-0\") pod \"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.751476 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-ovsdbserver-nb\") pod \"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.751603 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-dns-svc\") pod \"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.771786 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwkdq\" (UniqueName: \"kubernetes.io/projected/66de11c6-3dcd-45ac-adee-be59ac746a73-kube-api-access-rwkdq\") pod \"dnsmasq-dns-79b5d74c8c-2vqzm\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:16 crc kubenswrapper[4755]: I0317 00:47:16.879681 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:17 crc kubenswrapper[4755]: I0317 00:47:17.371580 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e63acd-f16a-4542-9d06-c6b3d2baceba","Type":"ContainerStarted","Data":"a50538ba23be15362fdb0c792d03412df27994164544f5ea14528fecdb9d82f2"} Mar 17 00:47:17 crc kubenswrapper[4755]: I0317 00:47:17.371837 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e63acd-f16a-4542-9d06-c6b3d2baceba","Type":"ContainerStarted","Data":"f2996cd0cf6d5b8001c34552bd877cd6105faa11ad5c4965d25bf351fb1a21cd"} Mar 17 00:47:17 crc kubenswrapper[4755]: I0317 00:47:17.373503 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"56f6f5b8-c52e-4fa6-be5b-12510ca9348d","Type":"ContainerStarted","Data":"9c7090c5936a68a41e5789bc5a0a79c91d49a2b884317dff41195b6b6a3f65d0"} Mar 17 00:47:17 crc kubenswrapper[4755]: I0317 00:47:17.373531 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"56f6f5b8-c52e-4fa6-be5b-12510ca9348d","Type":"ContainerStarted","Data":"4f377c7d71f02fa6c2a0701bea56ebcfd51731a6203fc6007b74a97a35399212"} Mar 17 00:47:17 crc kubenswrapper[4755]: I0317 00:47:17.376712 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-2vqzm"] Mar 17 00:47:17 crc kubenswrapper[4755]: I0317 00:47:17.377341 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"72a40ce1-242d-4422-936b-9b867c51ee69","Type":"ContainerStarted","Data":"b182061aca0aa0d8b49a809739b0906c20bb7461a84b8cb997cdb448123583d5"} Mar 17 00:47:17 crc kubenswrapper[4755]: I0317 00:47:17.377371 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"72a40ce1-242d-4422-936b-9b867c51ee69","Type":"ContainerStarted","Data":"56eee3693547c074ab712daeca0600e0cbd5e9387732a1f55506a59635e103b9"} Mar 17 00:47:17 crc kubenswrapper[4755]: W0317 00:47:17.385055 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66de11c6_3dcd_45ac_adee_be59ac746a73.slice/crio-76d6b6ee8e802786561c000bc57323ae7ee1fd952b9055fac0a70c928e0f92d2 WatchSource:0}: Error finding container 76d6b6ee8e802786561c000bc57323ae7ee1fd952b9055fac0a70c928e0f92d2: Status 404 returned error can't find the container with id 76d6b6ee8e802786561c000bc57323ae7ee1fd952b9055fac0a70c928e0f92d2 Mar 17 00:47:17 crc kubenswrapper[4755]: I0317 00:47:17.403030 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.403012341 podStartE2EDuration="2.403012341s" podCreationTimestamp="2026-03-17 00:47:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:47:17.398320425 +0000 UTC m=+1512.157772698" watchObservedRunningTime="2026-03-17 00:47:17.403012341 +0000 UTC m=+1512.162464624" Mar 17 00:47:17 crc kubenswrapper[4755]: I0317 00:47:17.830543 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:47:18 crc kubenswrapper[4755]: I0317 00:47:18.389187 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e63acd-f16a-4542-9d06-c6b3d2baceba","Type":"ContainerStarted","Data":"9e0ea101a117afe5e29b152033f9d70cf4c279a705d1e56686be2db137fe8c82"} Mar 17 00:47:18 crc kubenswrapper[4755]: I0317 00:47:18.391601 4755 generic.go:334] "Generic (PLEG): container finished" podID="66de11c6-3dcd-45ac-adee-be59ac746a73" containerID="36d9237d5d2dbc4490b926a9f71030e61d6fe0fcc197084ad1dfb4952763de35" exitCode=0 Mar 17 00:47:18 crc 
kubenswrapper[4755]: I0317 00:47:18.392648 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" event={"ID":"66de11c6-3dcd-45ac-adee-be59ac746a73","Type":"ContainerDied","Data":"36d9237d5d2dbc4490b926a9f71030e61d6fe0fcc197084ad1dfb4952763de35"} Mar 17 00:47:18 crc kubenswrapper[4755]: I0317 00:47:18.392671 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" event={"ID":"66de11c6-3dcd-45ac-adee-be59ac746a73","Type":"ContainerStarted","Data":"76d6b6ee8e802786561c000bc57323ae7ee1fd952b9055fac0a70c928e0f92d2"} Mar 17 00:47:19 crc kubenswrapper[4755]: I0317 00:47:19.240979 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 17 00:47:19 crc kubenswrapper[4755]: I0317 00:47:19.403920 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"72a40ce1-242d-4422-936b-9b867c51ee69","Type":"ContainerStarted","Data":"6b581c8d4169dd453c0fc861dae478b6bd95bcb61d9efc606ed88f70587aeb3c"} Mar 17 00:47:19 crc kubenswrapper[4755]: I0317 00:47:19.407160 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e63acd-f16a-4542-9d06-c6b3d2baceba","Type":"ContainerStarted","Data":"2a64bf3e585b0f16c219d7a07b3d1a2b958837a977971a553bd8ffd37691b629"} Mar 17 00:47:19 crc kubenswrapper[4755]: I0317 00:47:19.409534 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" event={"ID":"66de11c6-3dcd-45ac-adee-be59ac746a73","Type":"ContainerStarted","Data":"da849ac3d31a5d1f5811336fcc0095cad0747145a02d41554c60191ad5b972ad"} Mar 17 00:47:19 crc kubenswrapper[4755]: I0317 00:47:19.409668 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:19 crc kubenswrapper[4755]: I0317 00:47:19.410021 4755 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-api-0" podUID="401f4ae0-abf1-4303-8f02-965826666105" containerName="nova-api-log" containerID="cri-o://6201013dccb2192de443a73d8b09da2cee86976b26dbd98901fbab2bf95b5729" gracePeriod=30 Mar 17 00:47:19 crc kubenswrapper[4755]: I0317 00:47:19.410078 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="401f4ae0-abf1-4303-8f02-965826666105" containerName="nova-api-api" containerID="cri-o://c2711f1830278f18151dcf972b04eff00d4147e39fb8fc88ab04045e1b4f3e53" gracePeriod=30 Mar 17 00:47:21 crc kubenswrapper[4755]: I0317 00:47:21.060800 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:21 crc kubenswrapper[4755]: I0317 00:47:21.080569 4755 generic.go:334] "Generic (PLEG): container finished" podID="401f4ae0-abf1-4303-8f02-965826666105" containerID="6201013dccb2192de443a73d8b09da2cee86976b26dbd98901fbab2bf95b5729" exitCode=143 Mar 17 00:47:21 crc kubenswrapper[4755]: I0317 00:47:21.080661 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"401f4ae0-abf1-4303-8f02-965826666105","Type":"ContainerDied","Data":"6201013dccb2192de443a73d8b09da2cee86976b26dbd98901fbab2bf95b5729"} Mar 17 00:47:22 crc kubenswrapper[4755]: I0317 00:47:22.100170 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"72a40ce1-242d-4422-936b-9b867c51ee69","Type":"ContainerStarted","Data":"6d2f54b6949067d32222d9ad3ad0482fde08dc5f6ce7705b3555da19ad58f3bd"} Mar 17 00:47:22 crc kubenswrapper[4755]: I0317 00:47:22.324727 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" podStartSLOduration=6.324712373 podStartE2EDuration="6.324712373s" podCreationTimestamp="2026-03-17 00:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 
00:47:19.449989086 +0000 UTC m=+1514.209441369" watchObservedRunningTime="2026-03-17 00:47:22.324712373 +0000 UTC m=+1517.084164656" Mar 17 00:47:22 crc kubenswrapper[4755]: I0317 00:47:22.337417 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.112093 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e63acd-f16a-4542-9d06-c6b3d2baceba","Type":"ContainerStarted","Data":"e78e941a0df60eba941d0f7887580648d68ea0918a352bb24f5bba460adc2757"} Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.112550 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerName="ceilometer-central-agent" containerID="cri-o://a50538ba23be15362fdb0c792d03412df27994164544f5ea14528fecdb9d82f2" gracePeriod=30 Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.112654 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.113463 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerName="ceilometer-notification-agent" containerID="cri-o://9e0ea101a117afe5e29b152033f9d70cf4c279a705d1e56686be2db137fe8c82" gracePeriod=30 Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.113515 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerName="sg-core" containerID="cri-o://2a64bf3e585b0f16c219d7a07b3d1a2b958837a977971a553bd8ffd37691b629" gracePeriod=30 Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.113443 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerName="proxy-httpd" containerID="cri-o://e78e941a0df60eba941d0f7887580648d68ea0918a352bb24f5bba460adc2757" gracePeriod=30 Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.116820 4755 generic.go:334] "Generic (PLEG): container finished" podID="401f4ae0-abf1-4303-8f02-965826666105" containerID="c2711f1830278f18151dcf972b04eff00d4147e39fb8fc88ab04045e1b4f3e53" exitCode=0 Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.116848 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"401f4ae0-abf1-4303-8f02-965826666105","Type":"ContainerDied","Data":"c2711f1830278f18151dcf972b04eff00d4147e39fb8fc88ab04045e1b4f3e53"} Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.116867 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"401f4ae0-abf1-4303-8f02-965826666105","Type":"ContainerDied","Data":"3706303bf15a17713ad17bdc14b6979b0cd5cf94fd57609644599cec8e332856"} Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.116876 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3706303bf15a17713ad17bdc14b6979b0cd5cf94fd57609644599cec8e332856" Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.125411 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.131760 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.084648077 podStartE2EDuration="8.131744692s" podCreationTimestamp="2026-03-17 00:47:15 +0000 UTC" firstStartedPulling="2026-03-17 00:47:16.408229021 +0000 UTC m=+1511.167681304" lastFinishedPulling="2026-03-17 00:47:22.455325646 +0000 UTC m=+1517.214777919" observedRunningTime="2026-03-17 00:47:23.131333022 +0000 UTC m=+1517.890785315" watchObservedRunningTime="2026-03-17 00:47:23.131744692 +0000 UTC m=+1517.891196975" Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.272042 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62pfb\" (UniqueName: \"kubernetes.io/projected/401f4ae0-abf1-4303-8f02-965826666105-kube-api-access-62pfb\") pod \"401f4ae0-abf1-4303-8f02-965826666105\" (UID: \"401f4ae0-abf1-4303-8f02-965826666105\") " Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.272408 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401f4ae0-abf1-4303-8f02-965826666105-combined-ca-bundle\") pod \"401f4ae0-abf1-4303-8f02-965826666105\" (UID: \"401f4ae0-abf1-4303-8f02-965826666105\") " Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.272580 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/401f4ae0-abf1-4303-8f02-965826666105-logs\") pod \"401f4ae0-abf1-4303-8f02-965826666105\" (UID: \"401f4ae0-abf1-4303-8f02-965826666105\") " Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.272617 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401f4ae0-abf1-4303-8f02-965826666105-config-data\") pod 
\"401f4ae0-abf1-4303-8f02-965826666105\" (UID: \"401f4ae0-abf1-4303-8f02-965826666105\") " Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.273087 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/401f4ae0-abf1-4303-8f02-965826666105-logs" (OuterVolumeSpecName: "logs") pod "401f4ae0-abf1-4303-8f02-965826666105" (UID: "401f4ae0-abf1-4303-8f02-965826666105"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.277141 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/401f4ae0-abf1-4303-8f02-965826666105-kube-api-access-62pfb" (OuterVolumeSpecName: "kube-api-access-62pfb") pod "401f4ae0-abf1-4303-8f02-965826666105" (UID: "401f4ae0-abf1-4303-8f02-965826666105"). InnerVolumeSpecName "kube-api-access-62pfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.320607 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/401f4ae0-abf1-4303-8f02-965826666105-config-data" (OuterVolumeSpecName: "config-data") pod "401f4ae0-abf1-4303-8f02-965826666105" (UID: "401f4ae0-abf1-4303-8f02-965826666105"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.343238 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/401f4ae0-abf1-4303-8f02-965826666105-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "401f4ae0-abf1-4303-8f02-965826666105" (UID: "401f4ae0-abf1-4303-8f02-965826666105"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.375739 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62pfb\" (UniqueName: \"kubernetes.io/projected/401f4ae0-abf1-4303-8f02-965826666105-kube-api-access-62pfb\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.375774 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401f4ae0-abf1-4303-8f02-965826666105-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.375784 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/401f4ae0-abf1-4303-8f02-965826666105-logs\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:23 crc kubenswrapper[4755]: I0317 00:47:23.375792 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401f4ae0-abf1-4303-8f02-965826666105-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.152684 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"72a40ce1-242d-4422-936b-9b867c51ee69","Type":"ContainerStarted","Data":"2ced09580f45a693b419eeafb685780d8f4153cc4b84ea27eef328853d9ee1f0"} Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.153028 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="72a40ce1-242d-4422-936b-9b867c51ee69" containerName="aodh-api" containerID="cri-o://b182061aca0aa0d8b49a809739b0906c20bb7461a84b8cb997cdb448123583d5" gracePeriod=30 Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.153500 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="72a40ce1-242d-4422-936b-9b867c51ee69" containerName="aodh-listener" 
containerID="cri-o://2ced09580f45a693b419eeafb685780d8f4153cc4b84ea27eef328853d9ee1f0" gracePeriod=30 Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.153542 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="72a40ce1-242d-4422-936b-9b867c51ee69" containerName="aodh-notifier" containerID="cri-o://6d2f54b6949067d32222d9ad3ad0482fde08dc5f6ce7705b3555da19ad58f3bd" gracePeriod=30 Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.153594 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="72a40ce1-242d-4422-936b-9b867c51ee69" containerName="aodh-evaluator" containerID="cri-o://6b581c8d4169dd453c0fc861dae478b6bd95bcb61d9efc606ed88f70587aeb3c" gracePeriod=30 Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.166487 4755 generic.go:334] "Generic (PLEG): container finished" podID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerID="e78e941a0df60eba941d0f7887580648d68ea0918a352bb24f5bba460adc2757" exitCode=0 Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.166528 4755 generic.go:334] "Generic (PLEG): container finished" podID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerID="2a64bf3e585b0f16c219d7a07b3d1a2b958837a977971a553bd8ffd37691b629" exitCode=2 Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.166538 4755 generic.go:334] "Generic (PLEG): container finished" podID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerID="9e0ea101a117afe5e29b152033f9d70cf4c279a705d1e56686be2db137fe8c82" exitCode=0 Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.166546 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e63acd-f16a-4542-9d06-c6b3d2baceba","Type":"ContainerDied","Data":"e78e941a0df60eba941d0f7887580648d68ea0918a352bb24f5bba460adc2757"} Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.166605 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a6e63acd-f16a-4542-9d06-c6b3d2baceba","Type":"ContainerDied","Data":"2a64bf3e585b0f16c219d7a07b3d1a2b958837a977971a553bd8ffd37691b629"} Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.166625 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e63acd-f16a-4542-9d06-c6b3d2baceba","Type":"ContainerDied","Data":"9e0ea101a117afe5e29b152033f9d70cf4c279a705d1e56686be2db137fe8c82"} Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.166613 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.179539 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.01928931 podStartE2EDuration="9.179504149s" podCreationTimestamp="2026-03-17 00:47:15 +0000 UTC" firstStartedPulling="2026-03-17 00:47:16.655199495 +0000 UTC m=+1511.414651778" lastFinishedPulling="2026-03-17 00:47:23.815414334 +0000 UTC m=+1518.574866617" observedRunningTime="2026-03-17 00:47:24.174950167 +0000 UTC m=+1518.934402460" watchObservedRunningTime="2026-03-17 00:47:24.179504149 +0000 UTC m=+1518.938956442" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.242231 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.260282 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.284819 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 17 00:47:24 crc kubenswrapper[4755]: E0317 00:47:24.285262 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401f4ae0-abf1-4303-8f02-965826666105" containerName="nova-api-api" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.285280 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="401f4ae0-abf1-4303-8f02-965826666105" containerName="nova-api-api" Mar 17 00:47:24 crc kubenswrapper[4755]: E0317 00:47:24.285307 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401f4ae0-abf1-4303-8f02-965826666105" containerName="nova-api-log" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.285313 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="401f4ae0-abf1-4303-8f02-965826666105" containerName="nova-api-log" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.285505 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="401f4ae0-abf1-4303-8f02-965826666105" containerName="nova-api-log" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.285526 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="401f4ae0-abf1-4303-8f02-965826666105" containerName="nova-api-api" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.287080 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.290150 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.290324 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.290474 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.307875 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.409182 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e152b96-8219-4f65-bc13-81007beb33b6-logs\") pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " 
pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.409278 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-config-data\") pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.409316 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.409361 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-public-tls-certs\") pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.409398 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.409414 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs2sk\" (UniqueName: \"kubernetes.io/projected/8e152b96-8219-4f65-bc13-81007beb33b6-kube-api-access-vs2sk\") pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.511030 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e152b96-8219-4f65-bc13-81007beb33b6-logs\") pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.511128 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-config-data\") pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.511158 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.511205 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-public-tls-certs\") pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.511242 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.511260 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs2sk\" (UniqueName: \"kubernetes.io/projected/8e152b96-8219-4f65-bc13-81007beb33b6-kube-api-access-vs2sk\") pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " 
pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.512244 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e152b96-8219-4f65-bc13-81007beb33b6-logs\") pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.515939 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.517607 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-public-tls-certs\") pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.523369 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-config-data\") pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.525030 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.528723 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs2sk\" (UniqueName: \"kubernetes.io/projected/8e152b96-8219-4f65-bc13-81007beb33b6-kube-api-access-vs2sk\") 
pod \"nova-api-0\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " pod="openstack/nova-api-0" Mar 17 00:47:24 crc kubenswrapper[4755]: E0317 00:47:24.558093 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod401f4ae0_abf1_4303_8f02_965826666105.slice/crio-3706303bf15a17713ad17bdc14b6979b0cd5cf94fd57609644599cec8e332856\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72a40ce1_242d_4422_936b_9b867c51ee69.slice/crio-b182061aca0aa0d8b49a809739b0906c20bb7461a84b8cb997cdb448123583d5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod401f4ae0_abf1_4303_8f02_965826666105.slice\": RecentStats: unable to find data in memory cache]" Mar 17 00:47:24 crc kubenswrapper[4755]: I0317 00:47:24.633023 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.193150 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.205494 4755 generic.go:334] "Generic (PLEG): container finished" podID="72a40ce1-242d-4422-936b-9b867c51ee69" containerID="6d2f54b6949067d32222d9ad3ad0482fde08dc5f6ce7705b3555da19ad58f3bd" exitCode=0 Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.205518 4755 generic.go:334] "Generic (PLEG): container finished" podID="72a40ce1-242d-4422-936b-9b867c51ee69" containerID="6b581c8d4169dd453c0fc861dae478b6bd95bcb61d9efc606ed88f70587aeb3c" exitCode=0 Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.205526 4755 generic.go:334] "Generic (PLEG): container finished" podID="72a40ce1-242d-4422-936b-9b867c51ee69" containerID="b182061aca0aa0d8b49a809739b0906c20bb7461a84b8cb997cdb448123583d5" exitCode=0 Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.205540 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"72a40ce1-242d-4422-936b-9b867c51ee69","Type":"ContainerDied","Data":"6d2f54b6949067d32222d9ad3ad0482fde08dc5f6ce7705b3555da19ad58f3bd"} Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.205558 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"72a40ce1-242d-4422-936b-9b867c51ee69","Type":"ContainerDied","Data":"6b581c8d4169dd453c0fc861dae478b6bd95bcb61d9efc606ed88f70587aeb3c"} Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.205566 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"72a40ce1-242d-4422-936b-9b867c51ee69","Type":"ContainerDied","Data":"b182061aca0aa0d8b49a809739b0906c20bb7461a84b8cb997cdb448123583d5"} Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.770932 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.811680 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.921262 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.939265 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-sg-core-conf-yaml\") pod \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.939326 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e63acd-f16a-4542-9d06-c6b3d2baceba-log-httpd\") pod \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.939364 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-scripts\") pod \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.939428 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e63acd-f16a-4542-9d06-c6b3d2baceba-run-httpd\") pod \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.939448 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-config-data\") pod \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.939492 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p74wl\" (UniqueName: \"kubernetes.io/projected/a6e63acd-f16a-4542-9d06-c6b3d2baceba-kube-api-access-p74wl\") pod \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.939569 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-combined-ca-bundle\") pod \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\" (UID: \"a6e63acd-f16a-4542-9d06-c6b3d2baceba\") " Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.940344 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e63acd-f16a-4542-9d06-c6b3d2baceba-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a6e63acd-f16a-4542-9d06-c6b3d2baceba" (UID: "a6e63acd-f16a-4542-9d06-c6b3d2baceba"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.943190 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e63acd-f16a-4542-9d06-c6b3d2baceba-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a6e63acd-f16a-4542-9d06-c6b3d2baceba" (UID: "a6e63acd-f16a-4542-9d06-c6b3d2baceba"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:47:25 crc kubenswrapper[4755]: I0317 00:47:25.963726 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6e63acd-f16a-4542-9d06-c6b3d2baceba-kube-api-access-p74wl" (OuterVolumeSpecName: "kube-api-access-p74wl") pod "a6e63acd-f16a-4542-9d06-c6b3d2baceba" (UID: "a6e63acd-f16a-4542-9d06-c6b3d2baceba"). InnerVolumeSpecName "kube-api-access-p74wl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.021619 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-scripts" (OuterVolumeSpecName: "scripts") pod "a6e63acd-f16a-4542-9d06-c6b3d2baceba" (UID: "a6e63acd-f16a-4542-9d06-c6b3d2baceba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.042103 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e63acd-f16a-4542-9d06-c6b3d2baceba-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.042129 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.042137 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e63acd-f16a-4542-9d06-c6b3d2baceba-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.042146 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p74wl\" (UniqueName: \"kubernetes.io/projected/a6e63acd-f16a-4542-9d06-c6b3d2baceba-kube-api-access-p74wl\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:26 
crc kubenswrapper[4755]: I0317 00:47:26.099682 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a6e63acd-f16a-4542-9d06-c6b3d2baceba" (UID: "a6e63acd-f16a-4542-9d06-c6b3d2baceba"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.143918 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.157540 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6e63acd-f16a-4542-9d06-c6b3d2baceba" (UID: "a6e63acd-f16a-4542-9d06-c6b3d2baceba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.189746 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-config-data" (OuterVolumeSpecName: "config-data") pod "a6e63acd-f16a-4542-9d06-c6b3d2baceba" (UID: "a6e63acd-f16a-4542-9d06-c6b3d2baceba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.245324 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8e152b96-8219-4f65-bc13-81007beb33b6","Type":"ContainerStarted","Data":"dc909fdd1f89b47bfee30d2136a956e1ee4c7bd759734cce0e61bd43ba0e9435"} Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.245390 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8e152b96-8219-4f65-bc13-81007beb33b6","Type":"ContainerStarted","Data":"2899d12604d5dd6b36c02d7eabf25455060e2bde1f210e49aa1f0f2ad2d15699"} Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.245413 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8e152b96-8219-4f65-bc13-81007beb33b6","Type":"ContainerStarted","Data":"6da56eeb1502e52abb3f1972c14e5f4959597fe391a67ede9d0a21f67eb8dca1"} Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.250250 4755 generic.go:334] "Generic (PLEG): container finished" podID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerID="a50538ba23be15362fdb0c792d03412df27994164544f5ea14528fecdb9d82f2" exitCode=0 Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.250341 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.250517 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.250542 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e63acd-f16a-4542-9d06-c6b3d2baceba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.262255 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="401f4ae0-abf1-4303-8f02-965826666105" path="/var/lib/kubelet/pods/401f4ae0-abf1-4303-8f02-965826666105/volumes" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.263361 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e63acd-f16a-4542-9d06-c6b3d2baceba","Type":"ContainerDied","Data":"a50538ba23be15362fdb0c792d03412df27994164544f5ea14528fecdb9d82f2"} Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.263392 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e63acd-f16a-4542-9d06-c6b3d2baceba","Type":"ContainerDied","Data":"f2996cd0cf6d5b8001c34552bd877cd6105faa11ad5c4965d25bf351fb1a21cd"} Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.263411 4755 scope.go:117] "RemoveContainer" containerID="e78e941a0df60eba941d0f7887580648d68ea0918a352bb24f5bba460adc2757" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.274599 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.27458063 podStartE2EDuration="2.27458063s" podCreationTimestamp="2026-03-17 00:47:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-17 00:47:26.26486405 +0000 UTC m=+1521.024316404" watchObservedRunningTime="2026-03-17 00:47:26.27458063 +0000 UTC m=+1521.034032913" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.286326 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.300103 4755 scope.go:117] "RemoveContainer" containerID="2a64bf3e585b0f16c219d7a07b3d1a2b958837a977971a553bd8ffd37691b629" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.344633 4755 scope.go:117] "RemoveContainer" containerID="9e0ea101a117afe5e29b152033f9d70cf4c279a705d1e56686be2db137fe8c82" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.377216 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.392447 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.411341 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:47:26 crc kubenswrapper[4755]: E0317 00:47:26.411922 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerName="sg-core" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.411952 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerName="sg-core" Mar 17 00:47:26 crc kubenswrapper[4755]: E0317 00:47:26.411980 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerName="proxy-httpd" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.411986 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerName="proxy-httpd" Mar 17 00:47:26 crc kubenswrapper[4755]: E0317 00:47:26.412002 4755 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerName="ceilometer-central-agent" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.412009 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerName="ceilometer-central-agent" Mar 17 00:47:26 crc kubenswrapper[4755]: E0317 00:47:26.412053 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerName="ceilometer-notification-agent" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.412059 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerName="ceilometer-notification-agent" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.412310 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerName="ceilometer-notification-agent" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.412350 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerName="proxy-httpd" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.412368 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerName="sg-core" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.412379 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" containerName="ceilometer-central-agent" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.414414 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.416854 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.417173 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.421591 4755 scope.go:117] "RemoveContainer" containerID="a50538ba23be15362fdb0c792d03412df27994164544f5ea14528fecdb9d82f2" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.429100 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.448559 4755 scope.go:117] "RemoveContainer" containerID="e78e941a0df60eba941d0f7887580648d68ea0918a352bb24f5bba460adc2757" Mar 17 00:47:26 crc kubenswrapper[4755]: E0317 00:47:26.449175 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e78e941a0df60eba941d0f7887580648d68ea0918a352bb24f5bba460adc2757\": container with ID starting with e78e941a0df60eba941d0f7887580648d68ea0918a352bb24f5bba460adc2757 not found: ID does not exist" containerID="e78e941a0df60eba941d0f7887580648d68ea0918a352bb24f5bba460adc2757" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.449219 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e78e941a0df60eba941d0f7887580648d68ea0918a352bb24f5bba460adc2757"} err="failed to get container status \"e78e941a0df60eba941d0f7887580648d68ea0918a352bb24f5bba460adc2757\": rpc error: code = NotFound desc = could not find container \"e78e941a0df60eba941d0f7887580648d68ea0918a352bb24f5bba460adc2757\": container with ID starting with e78e941a0df60eba941d0f7887580648d68ea0918a352bb24f5bba460adc2757 not found: ID does not exist" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 
00:47:26.449244 4755 scope.go:117] "RemoveContainer" containerID="2a64bf3e585b0f16c219d7a07b3d1a2b958837a977971a553bd8ffd37691b629" Mar 17 00:47:26 crc kubenswrapper[4755]: E0317 00:47:26.449617 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a64bf3e585b0f16c219d7a07b3d1a2b958837a977971a553bd8ffd37691b629\": container with ID starting with 2a64bf3e585b0f16c219d7a07b3d1a2b958837a977971a553bd8ffd37691b629 not found: ID does not exist" containerID="2a64bf3e585b0f16c219d7a07b3d1a2b958837a977971a553bd8ffd37691b629" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.449662 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a64bf3e585b0f16c219d7a07b3d1a2b958837a977971a553bd8ffd37691b629"} err="failed to get container status \"2a64bf3e585b0f16c219d7a07b3d1a2b958837a977971a553bd8ffd37691b629\": rpc error: code = NotFound desc = could not find container \"2a64bf3e585b0f16c219d7a07b3d1a2b958837a977971a553bd8ffd37691b629\": container with ID starting with 2a64bf3e585b0f16c219d7a07b3d1a2b958837a977971a553bd8ffd37691b629 not found: ID does not exist" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.449691 4755 scope.go:117] "RemoveContainer" containerID="9e0ea101a117afe5e29b152033f9d70cf4c279a705d1e56686be2db137fe8c82" Mar 17 00:47:26 crc kubenswrapper[4755]: E0317 00:47:26.455040 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e0ea101a117afe5e29b152033f9d70cf4c279a705d1e56686be2db137fe8c82\": container with ID starting with 9e0ea101a117afe5e29b152033f9d70cf4c279a705d1e56686be2db137fe8c82 not found: ID does not exist" containerID="9e0ea101a117afe5e29b152033f9d70cf4c279a705d1e56686be2db137fe8c82" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.455070 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9e0ea101a117afe5e29b152033f9d70cf4c279a705d1e56686be2db137fe8c82"} err="failed to get container status \"9e0ea101a117afe5e29b152033f9d70cf4c279a705d1e56686be2db137fe8c82\": rpc error: code = NotFound desc = could not find container \"9e0ea101a117afe5e29b152033f9d70cf4c279a705d1e56686be2db137fe8c82\": container with ID starting with 9e0ea101a117afe5e29b152033f9d70cf4c279a705d1e56686be2db137fe8c82 not found: ID does not exist" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.455090 4755 scope.go:117] "RemoveContainer" containerID="a50538ba23be15362fdb0c792d03412df27994164544f5ea14528fecdb9d82f2" Mar 17 00:47:26 crc kubenswrapper[4755]: E0317 00:47:26.455330 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a50538ba23be15362fdb0c792d03412df27994164544f5ea14528fecdb9d82f2\": container with ID starting with a50538ba23be15362fdb0c792d03412df27994164544f5ea14528fecdb9d82f2 not found: ID does not exist" containerID="a50538ba23be15362fdb0c792d03412df27994164544f5ea14528fecdb9d82f2" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.455372 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a50538ba23be15362fdb0c792d03412df27994164544f5ea14528fecdb9d82f2"} err="failed to get container status \"a50538ba23be15362fdb0c792d03412df27994164544f5ea14528fecdb9d82f2\": rpc error: code = NotFound desc = could not find container \"a50538ba23be15362fdb0c792d03412df27994164544f5ea14528fecdb9d82f2\": container with ID starting with a50538ba23be15362fdb0c792d03412df27994164544f5ea14528fecdb9d82f2 not found: ID does not exist" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.556792 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5db525fb-ff43-48c4-8f43-1d9eb37c440b-log-httpd\") pod \"ceilometer-0\" (UID: 
\"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.556874 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phqmr\" (UniqueName: \"kubernetes.io/projected/5db525fb-ff43-48c4-8f43-1d9eb37c440b-kube-api-access-phqmr\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.556904 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.556945 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.557151 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-scripts\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.557212 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5db525fb-ff43-48c4-8f43-1d9eb37c440b-run-httpd\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 
00:47:26.557245 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-config-data\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.652837 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-znj9l"] Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.659519 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-scripts\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.669150 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5db525fb-ff43-48c4-8f43-1d9eb37c440b-run-httpd\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.669489 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-config-data\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.670579 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5db525fb-ff43-48c4-8f43-1d9eb37c440b-log-httpd\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.670629 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-phqmr\" (UniqueName: \"kubernetes.io/projected/5db525fb-ff43-48c4-8f43-1d9eb37c440b-kube-api-access-phqmr\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.670883 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5db525fb-ff43-48c4-8f43-1d9eb37c440b-run-httpd\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.671129 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5db525fb-ff43-48c4-8f43-1d9eb37c440b-log-httpd\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.671209 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-scripts\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.671831 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.671901 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc 
kubenswrapper[4755]: I0317 00:47:26.671827 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-znj9l" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.674500 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.676446 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-znj9l"] Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.677964 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.678498 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-config-data\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.689718 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.697609 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.701455 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phqmr\" (UniqueName: \"kubernetes.io/projected/5db525fb-ff43-48c4-8f43-1d9eb37c440b-kube-api-access-phqmr\") pod 
\"ceilometer-0\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.729208 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.777850 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-scripts\") pod \"nova-cell1-cell-mapping-znj9l\" (UID: \"47ccacd0-c90a-4280-a907-a5b43b82744d\") " pod="openstack/nova-cell1-cell-mapping-znj9l" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.777909 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-config-data\") pod \"nova-cell1-cell-mapping-znj9l\" (UID: \"47ccacd0-c90a-4280-a907-a5b43b82744d\") " pod="openstack/nova-cell1-cell-mapping-znj9l" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.777981 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-znj9l\" (UID: \"47ccacd0-c90a-4280-a907-a5b43b82744d\") " pod="openstack/nova-cell1-cell-mapping-znj9l" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.778039 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-426r2\" (UniqueName: \"kubernetes.io/projected/47ccacd0-c90a-4280-a907-a5b43b82744d-kube-api-access-426r2\") pod \"nova-cell1-cell-mapping-znj9l\" (UID: \"47ccacd0-c90a-4280-a907-a5b43b82744d\") " pod="openstack/nova-cell1-cell-mapping-znj9l" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.880081 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-znj9l\" (UID: \"47ccacd0-c90a-4280-a907-a5b43b82744d\") " pod="openstack/nova-cell1-cell-mapping-znj9l" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.880151 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-426r2\" (UniqueName: \"kubernetes.io/projected/47ccacd0-c90a-4280-a907-a5b43b82744d-kube-api-access-426r2\") pod \"nova-cell1-cell-mapping-znj9l\" (UID: \"47ccacd0-c90a-4280-a907-a5b43b82744d\") " pod="openstack/nova-cell1-cell-mapping-znj9l" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.880229 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-scripts\") pod \"nova-cell1-cell-mapping-znj9l\" (UID: \"47ccacd0-c90a-4280-a907-a5b43b82744d\") " pod="openstack/nova-cell1-cell-mapping-znj9l" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.880259 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-config-data\") pod \"nova-cell1-cell-mapping-znj9l\" (UID: \"47ccacd0-c90a-4280-a907-a5b43b82744d\") " pod="openstack/nova-cell1-cell-mapping-znj9l" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.881886 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.883597 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-znj9l\" (UID: \"47ccacd0-c90a-4280-a907-a5b43b82744d\") " 
pod="openstack/nova-cell1-cell-mapping-znj9l" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.885250 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-scripts\") pod \"nova-cell1-cell-mapping-znj9l\" (UID: \"47ccacd0-c90a-4280-a907-a5b43b82744d\") " pod="openstack/nova-cell1-cell-mapping-znj9l" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.885643 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-config-data\") pod \"nova-cell1-cell-mapping-znj9l\" (UID: \"47ccacd0-c90a-4280-a907-a5b43b82744d\") " pod="openstack/nova-cell1-cell-mapping-znj9l" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.910977 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-426r2\" (UniqueName: \"kubernetes.io/projected/47ccacd0-c90a-4280-a907-a5b43b82744d-kube-api-access-426r2\") pod \"nova-cell1-cell-mapping-znj9l\" (UID: \"47ccacd0-c90a-4280-a907-a5b43b82744d\") " pod="openstack/nova-cell1-cell-mapping-znj9l" Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.950369 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-tph5m"] Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.950626 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" podUID="0292185b-3c12-4b25-b900-f8c7c5d4346f" containerName="dnsmasq-dns" containerID="cri-o://22b67742a91760d5f53eec407271a0f7e5782ab8a0ec3c43229070ad9d4004c4" gracePeriod=10 Mar 17 00:47:26 crc kubenswrapper[4755]: I0317 00:47:26.997380 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-znj9l" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.253455 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.265780 4755 generic.go:334] "Generic (PLEG): container finished" podID="0292185b-3c12-4b25-b900-f8c7c5d4346f" containerID="22b67742a91760d5f53eec407271a0f7e5782ab8a0ec3c43229070ad9d4004c4" exitCode=0 Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.265878 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" event={"ID":"0292185b-3c12-4b25-b900-f8c7c5d4346f","Type":"ContainerDied","Data":"22b67742a91760d5f53eec407271a0f7e5782ab8a0ec3c43229070ad9d4004c4"} Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.556751 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.646057 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-znj9l"] Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.729265 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-config\") pod \"0292185b-3c12-4b25-b900-f8c7c5d4346f\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.729324 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-ovsdbserver-nb\") pod \"0292185b-3c12-4b25-b900-f8c7c5d4346f\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.729416 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-dns-swift-storage-0\") pod \"0292185b-3c12-4b25-b900-f8c7c5d4346f\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.729450 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-dns-svc\") pod \"0292185b-3c12-4b25-b900-f8c7c5d4346f\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.729605 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwrfp\" (UniqueName: \"kubernetes.io/projected/0292185b-3c12-4b25-b900-f8c7c5d4346f-kube-api-access-cwrfp\") pod \"0292185b-3c12-4b25-b900-f8c7c5d4346f\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.729673 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-ovsdbserver-sb\") pod \"0292185b-3c12-4b25-b900-f8c7c5d4346f\" (UID: \"0292185b-3c12-4b25-b900-f8c7c5d4346f\") " Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.735724 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0292185b-3c12-4b25-b900-f8c7c5d4346f-kube-api-access-cwrfp" (OuterVolumeSpecName: "kube-api-access-cwrfp") pod "0292185b-3c12-4b25-b900-f8c7c5d4346f" (UID: "0292185b-3c12-4b25-b900-f8c7c5d4346f"). InnerVolumeSpecName "kube-api-access-cwrfp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.824733 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0292185b-3c12-4b25-b900-f8c7c5d4346f" (UID: "0292185b-3c12-4b25-b900-f8c7c5d4346f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.825845 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0292185b-3c12-4b25-b900-f8c7c5d4346f" (UID: "0292185b-3c12-4b25-b900-f8c7c5d4346f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.826380 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0292185b-3c12-4b25-b900-f8c7c5d4346f" (UID: "0292185b-3c12-4b25-b900-f8c7c5d4346f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.827727 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0292185b-3c12-4b25-b900-f8c7c5d4346f" (UID: "0292185b-3c12-4b25-b900-f8c7c5d4346f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.831887 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.832190 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.832204 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.832215 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.832224 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwrfp\" (UniqueName: \"kubernetes.io/projected/0292185b-3c12-4b25-b900-f8c7c5d4346f-kube-api-access-cwrfp\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.832440 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-config" (OuterVolumeSpecName: "config") pod "0292185b-3c12-4b25-b900-f8c7c5d4346f" (UID: "0292185b-3c12-4b25-b900-f8c7c5d4346f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.933824 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0292185b-3c12-4b25-b900-f8c7c5d4346f-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.996915 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6rj6j"] Mar 17 00:47:27 crc kubenswrapper[4755]: E0317 00:47:27.997327 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0292185b-3c12-4b25-b900-f8c7c5d4346f" containerName="init" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.997344 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0292185b-3c12-4b25-b900-f8c7c5d4346f" containerName="init" Mar 17 00:47:27 crc kubenswrapper[4755]: E0317 00:47:27.997363 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0292185b-3c12-4b25-b900-f8c7c5d4346f" containerName="dnsmasq-dns" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.997371 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0292185b-3c12-4b25-b900-f8c7c5d4346f" containerName="dnsmasq-dns" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.997627 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="0292185b-3c12-4b25-b900-f8c7c5d4346f" containerName="dnsmasq-dns" Mar 17 00:47:27 crc kubenswrapper[4755]: I0317 00:47:27.999013 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6rj6j" Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.030380 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6rj6j"] Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.137483 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsz7q\" (UniqueName: \"kubernetes.io/projected/ef714ae3-d651-4583-871a-05c45a82aff7-kube-api-access-bsz7q\") pod \"certified-operators-6rj6j\" (UID: \"ef714ae3-d651-4583-871a-05c45a82aff7\") " pod="openshift-marketplace/certified-operators-6rj6j" Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.137572 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef714ae3-d651-4583-871a-05c45a82aff7-catalog-content\") pod \"certified-operators-6rj6j\" (UID: \"ef714ae3-d651-4583-871a-05c45a82aff7\") " pod="openshift-marketplace/certified-operators-6rj6j" Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.137640 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef714ae3-d651-4583-871a-05c45a82aff7-utilities\") pod \"certified-operators-6rj6j\" (UID: \"ef714ae3-d651-4583-871a-05c45a82aff7\") " pod="openshift-marketplace/certified-operators-6rj6j" Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.239936 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef714ae3-d651-4583-871a-05c45a82aff7-utilities\") pod \"certified-operators-6rj6j\" (UID: \"ef714ae3-d651-4583-871a-05c45a82aff7\") " pod="openshift-marketplace/certified-operators-6rj6j" Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.239427 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef714ae3-d651-4583-871a-05c45a82aff7-utilities\") pod \"certified-operators-6rj6j\" (UID: \"ef714ae3-d651-4583-871a-05c45a82aff7\") " pod="openshift-marketplace/certified-operators-6rj6j" Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.240084 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsz7q\" (UniqueName: \"kubernetes.io/projected/ef714ae3-d651-4583-871a-05c45a82aff7-kube-api-access-bsz7q\") pod \"certified-operators-6rj6j\" (UID: \"ef714ae3-d651-4583-871a-05c45a82aff7\") " pod="openshift-marketplace/certified-operators-6rj6j" Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.240409 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef714ae3-d651-4583-871a-05c45a82aff7-catalog-content\") pod \"certified-operators-6rj6j\" (UID: \"ef714ae3-d651-4583-871a-05c45a82aff7\") " pod="openshift-marketplace/certified-operators-6rj6j" Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.240653 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef714ae3-d651-4583-871a-05c45a82aff7-catalog-content\") pod \"certified-operators-6rj6j\" (UID: \"ef714ae3-d651-4583-871a-05c45a82aff7\") " pod="openshift-marketplace/certified-operators-6rj6j" Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.264692 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsz7q\" (UniqueName: \"kubernetes.io/projected/ef714ae3-d651-4583-871a-05c45a82aff7-kube-api-access-bsz7q\") pod \"certified-operators-6rj6j\" (UID: \"ef714ae3-d651-4583-871a-05c45a82aff7\") " pod="openshift-marketplace/certified-operators-6rj6j" Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.265353 4755 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="a6e63acd-f16a-4542-9d06-c6b3d2baceba" path="/var/lib/kubelet/pods/a6e63acd-f16a-4542-9d06-c6b3d2baceba/volumes" Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.280137 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-znj9l" event={"ID":"47ccacd0-c90a-4280-a907-a5b43b82744d","Type":"ContainerStarted","Data":"200d1dc69bb32849dcf63b582c5b869b6a870c2e32711dffc06f46007f10389d"} Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.280208 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-znj9l" event={"ID":"47ccacd0-c90a-4280-a907-a5b43b82744d","Type":"ContainerStarted","Data":"7bbfc0f7431c0ab4b0bc0f34d81a13567415e72c4323b4a6d6c4ebe7b59a0032"} Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.282571 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" event={"ID":"0292185b-3c12-4b25-b900-f8c7c5d4346f","Type":"ContainerDied","Data":"c953355d800301de4419c9e477aea700dc06d5aba66f2925f8305decb2997cfc"} Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.282610 4755 scope.go:117] "RemoveContainer" containerID="22b67742a91760d5f53eec407271a0f7e5782ab8a0ec3c43229070ad9d4004c4" Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.282725 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-tph5m" Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.289175 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5db525fb-ff43-48c4-8f43-1d9eb37c440b","Type":"ContainerStarted","Data":"674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216"} Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.289210 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5db525fb-ff43-48c4-8f43-1d9eb37c440b","Type":"ContainerStarted","Data":"ffb1f9e4e405a8a8bf18925810e40d9f03d20f437cd1e2e5f2000c2a92fc116b"} Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.316254 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-znj9l" podStartSLOduration=2.316234723 podStartE2EDuration="2.316234723s" podCreationTimestamp="2026-03-17 00:47:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:47:28.300963824 +0000 UTC m=+1523.060416107" watchObservedRunningTime="2026-03-17 00:47:28.316234723 +0000 UTC m=+1523.075687026" Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.325182 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6rj6j" Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.357398 4755 scope.go:117] "RemoveContainer" containerID="11d05b57f53c1144b26c312d310e4f0ef80460c64d2b220b256bf7ab01fa5adc" Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.374406 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-tph5m"] Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.383830 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-tph5m"] Mar 17 00:47:28 crc kubenswrapper[4755]: I0317 00:47:28.844553 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6rj6j"] Mar 17 00:47:28 crc kubenswrapper[4755]: W0317 00:47:28.847810 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef714ae3_d651_4583_871a_05c45a82aff7.slice/crio-0931e69c22e5ae5e0692713703f71d4051fd68ddea40c94f33d543653aaa11c5 WatchSource:0}: Error finding container 0931e69c22e5ae5e0692713703f71d4051fd68ddea40c94f33d543653aaa11c5: Status 404 returned error can't find the container with id 0931e69c22e5ae5e0692713703f71d4051fd68ddea40c94f33d543653aaa11c5 Mar 17 00:47:29 crc kubenswrapper[4755]: I0317 00:47:29.300915 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5db525fb-ff43-48c4-8f43-1d9eb37c440b","Type":"ContainerStarted","Data":"35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5"} Mar 17 00:47:29 crc kubenswrapper[4755]: I0317 00:47:29.302479 4755 generic.go:334] "Generic (PLEG): container finished" podID="ef714ae3-d651-4583-871a-05c45a82aff7" containerID="c6cb5c070406023f635d2fb949a6889a2b0a77caecb0d46f91a58ab61ae6f2e8" exitCode=0 Mar 17 00:47:29 crc kubenswrapper[4755]: I0317 00:47:29.302996 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-6rj6j" event={"ID":"ef714ae3-d651-4583-871a-05c45a82aff7","Type":"ContainerDied","Data":"c6cb5c070406023f635d2fb949a6889a2b0a77caecb0d46f91a58ab61ae6f2e8"} Mar 17 00:47:29 crc kubenswrapper[4755]: I0317 00:47:29.303057 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rj6j" event={"ID":"ef714ae3-d651-4583-871a-05c45a82aff7","Type":"ContainerStarted","Data":"0931e69c22e5ae5e0692713703f71d4051fd68ddea40c94f33d543653aaa11c5"} Mar 17 00:47:30 crc kubenswrapper[4755]: I0317 00:47:30.259517 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0292185b-3c12-4b25-b900-f8c7c5d4346f" path="/var/lib/kubelet/pods/0292185b-3c12-4b25-b900-f8c7c5d4346f/volumes" Mar 17 00:47:30 crc kubenswrapper[4755]: I0317 00:47:30.313554 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rj6j" event={"ID":"ef714ae3-d651-4583-871a-05c45a82aff7","Type":"ContainerStarted","Data":"1f1a0d750acf78dec898f1bb81cbeacf0882d419a7801ca10975d9fee6aeedfd"} Mar 17 00:47:30 crc kubenswrapper[4755]: I0317 00:47:30.315687 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5db525fb-ff43-48c4-8f43-1d9eb37c440b","Type":"ContainerStarted","Data":"5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b"} Mar 17 00:47:31 crc kubenswrapper[4755]: I0317 00:47:31.331507 4755 generic.go:334] "Generic (PLEG): container finished" podID="ef714ae3-d651-4583-871a-05c45a82aff7" containerID="1f1a0d750acf78dec898f1bb81cbeacf0882d419a7801ca10975d9fee6aeedfd" exitCode=0 Mar 17 00:47:31 crc kubenswrapper[4755]: I0317 00:47:31.331807 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rj6j" event={"ID":"ef714ae3-d651-4583-871a-05c45a82aff7","Type":"ContainerDied","Data":"1f1a0d750acf78dec898f1bb81cbeacf0882d419a7801ca10975d9fee6aeedfd"} Mar 17 00:47:32 crc 
kubenswrapper[4755]: I0317 00:47:32.349809 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5db525fb-ff43-48c4-8f43-1d9eb37c440b","Type":"ContainerStarted","Data":"9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c"} Mar 17 00:47:32 crc kubenswrapper[4755]: I0317 00:47:32.350178 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 00:47:32 crc kubenswrapper[4755]: I0317 00:47:32.355144 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rj6j" event={"ID":"ef714ae3-d651-4583-871a-05c45a82aff7","Type":"ContainerStarted","Data":"0c43f2b1169e21e8920e9c8f3fabc942ebd399e369ddb7a1964183c60c98ee12"} Mar 17 00:47:32 crc kubenswrapper[4755]: I0317 00:47:32.376238 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.654200251 podStartE2EDuration="6.376218614s" podCreationTimestamp="2026-03-17 00:47:26 +0000 UTC" firstStartedPulling="2026-03-17 00:47:27.337227345 +0000 UTC m=+1522.096679628" lastFinishedPulling="2026-03-17 00:47:31.059245668 +0000 UTC m=+1525.818697991" observedRunningTime="2026-03-17 00:47:32.373232814 +0000 UTC m=+1527.132685107" watchObservedRunningTime="2026-03-17 00:47:32.376218614 +0000 UTC m=+1527.135670897" Mar 17 00:47:32 crc kubenswrapper[4755]: I0317 00:47:32.406459 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6rj6j" podStartSLOduration=2.813443928 podStartE2EDuration="5.406422111s" podCreationTimestamp="2026-03-17 00:47:27 +0000 UTC" firstStartedPulling="2026-03-17 00:47:29.305360992 +0000 UTC m=+1524.064813305" lastFinishedPulling="2026-03-17 00:47:31.898339205 +0000 UTC m=+1526.657791488" observedRunningTime="2026-03-17 00:47:32.390129716 +0000 UTC m=+1527.149581999" watchObservedRunningTime="2026-03-17 00:47:32.406422111 +0000 UTC 
m=+1527.165874424" Mar 17 00:47:33 crc kubenswrapper[4755]: I0317 00:47:33.407197 4755 generic.go:334] "Generic (PLEG): container finished" podID="47ccacd0-c90a-4280-a907-a5b43b82744d" containerID="200d1dc69bb32849dcf63b582c5b869b6a870c2e32711dffc06f46007f10389d" exitCode=0 Mar 17 00:47:33 crc kubenswrapper[4755]: I0317 00:47:33.407622 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-znj9l" event={"ID":"47ccacd0-c90a-4280-a907-a5b43b82744d","Type":"ContainerDied","Data":"200d1dc69bb32849dcf63b582c5b869b6a870c2e32711dffc06f46007f10389d"} Mar 17 00:47:34 crc kubenswrapper[4755]: I0317 00:47:34.633928 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 00:47:34 crc kubenswrapper[4755]: I0317 00:47:34.634344 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 00:47:34 crc kubenswrapper[4755]: I0317 00:47:34.902808 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-znj9l" Mar 17 00:47:34 crc kubenswrapper[4755]: I0317 00:47:34.992692 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-scripts\") pod \"47ccacd0-c90a-4280-a907-a5b43b82744d\" (UID: \"47ccacd0-c90a-4280-a907-a5b43b82744d\") " Mar 17 00:47:34 crc kubenswrapper[4755]: I0317 00:47:34.992743 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-config-data\") pod \"47ccacd0-c90a-4280-a907-a5b43b82744d\" (UID: \"47ccacd0-c90a-4280-a907-a5b43b82744d\") " Mar 17 00:47:34 crc kubenswrapper[4755]: I0317 00:47:34.992771 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-combined-ca-bundle\") pod \"47ccacd0-c90a-4280-a907-a5b43b82744d\" (UID: \"47ccacd0-c90a-4280-a907-a5b43b82744d\") " Mar 17 00:47:34 crc kubenswrapper[4755]: I0317 00:47:34.992830 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-426r2\" (UniqueName: \"kubernetes.io/projected/47ccacd0-c90a-4280-a907-a5b43b82744d-kube-api-access-426r2\") pod \"47ccacd0-c90a-4280-a907-a5b43b82744d\" (UID: \"47ccacd0-c90a-4280-a907-a5b43b82744d\") " Mar 17 00:47:34 crc kubenswrapper[4755]: I0317 00:47:34.998484 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47ccacd0-c90a-4280-a907-a5b43b82744d-kube-api-access-426r2" (OuterVolumeSpecName: "kube-api-access-426r2") pod "47ccacd0-c90a-4280-a907-a5b43b82744d" (UID: "47ccacd0-c90a-4280-a907-a5b43b82744d"). InnerVolumeSpecName "kube-api-access-426r2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.000547 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-scripts" (OuterVolumeSpecName: "scripts") pod "47ccacd0-c90a-4280-a907-a5b43b82744d" (UID: "47ccacd0-c90a-4280-a907-a5b43b82744d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.034934 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-config-data" (OuterVolumeSpecName: "config-data") pod "47ccacd0-c90a-4280-a907-a5b43b82744d" (UID: "47ccacd0-c90a-4280-a907-a5b43b82744d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.047064 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47ccacd0-c90a-4280-a907-a5b43b82744d" (UID: "47ccacd0-c90a-4280-a907-a5b43b82744d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.094896 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.094958 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.094971 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-426r2\" (UniqueName: \"kubernetes.io/projected/47ccacd0-c90a-4280-a907-a5b43b82744d-kube-api-access-426r2\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.094980 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ccacd0-c90a-4280-a907-a5b43b82744d-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.429827 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-znj9l" event={"ID":"47ccacd0-c90a-4280-a907-a5b43b82744d","Type":"ContainerDied","Data":"7bbfc0f7431c0ab4b0bc0f34d81a13567415e72c4323b4a6d6c4ebe7b59a0032"} Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.429879 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bbfc0f7431c0ab4b0bc0f34d81a13567415e72c4323b4a6d6c4ebe7b59a0032" Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.429944 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-znj9l" Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.632510 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.632750 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8e152b96-8219-4f65-bc13-81007beb33b6" containerName="nova-api-log" containerID="cri-o://2899d12604d5dd6b36c02d7eabf25455060e2bde1f210e49aa1f0f2ad2d15699" gracePeriod=30 Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.633013 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8e152b96-8219-4f65-bc13-81007beb33b6" containerName="nova-api-api" containerID="cri-o://dc909fdd1f89b47bfee30d2136a956e1ee4c7bd759734cce0e61bd43ba0e9435" gracePeriod=30 Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.640707 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8e152b96-8219-4f65-bc13-81007beb33b6" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.246:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.641632 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8e152b96-8219-4f65-bc13-81007beb33b6" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.246:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.652310 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.652582 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9d88d97d-8539-47c3-a55f-deb71af2acbc" 
containerName="nova-scheduler-scheduler" containerID="cri-o://f3d6a392fb267cafd5af728538159d874d2301c62ecd7f1c80dcfc9de9eab371" gracePeriod=30 Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.673390 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.673910 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="11fa943d-490d-47c8-b14d-c250dca5c388" containerName="nova-metadata-log" containerID="cri-o://7f735076080a40139ff39b79a7ff6674f810f2aa6e25a9afa9972ce7b18f9525" gracePeriod=30 Mar 17 00:47:35 crc kubenswrapper[4755]: I0317 00:47:35.673967 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="11fa943d-490d-47c8-b14d-c250dca5c388" containerName="nova-metadata-metadata" containerID="cri-o://e5d26b5554756150e39f0def907dffd1f964d42e31c0a7c82c5e70fca120a939" gracePeriod=30 Mar 17 00:47:36 crc kubenswrapper[4755]: I0317 00:47:36.441753 4755 generic.go:334] "Generic (PLEG): container finished" podID="11fa943d-490d-47c8-b14d-c250dca5c388" containerID="7f735076080a40139ff39b79a7ff6674f810f2aa6e25a9afa9972ce7b18f9525" exitCode=143 Mar 17 00:47:36 crc kubenswrapper[4755]: I0317 00:47:36.441846 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11fa943d-490d-47c8-b14d-c250dca5c388","Type":"ContainerDied","Data":"7f735076080a40139ff39b79a7ff6674f810f2aa6e25a9afa9972ce7b18f9525"} Mar 17 00:47:36 crc kubenswrapper[4755]: I0317 00:47:36.443772 4755 generic.go:334] "Generic (PLEG): container finished" podID="8e152b96-8219-4f65-bc13-81007beb33b6" containerID="2899d12604d5dd6b36c02d7eabf25455060e2bde1f210e49aa1f0f2ad2d15699" exitCode=143 Mar 17 00:47:36 crc kubenswrapper[4755]: I0317 00:47:36.443827 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"8e152b96-8219-4f65-bc13-81007beb33b6","Type":"ContainerDied","Data":"2899d12604d5dd6b36c02d7eabf25455060e2bde1f210e49aa1f0f2ad2d15699"} Mar 17 00:47:36 crc kubenswrapper[4755]: I0317 00:47:36.867806 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.034209 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d88d97d-8539-47c3-a55f-deb71af2acbc-combined-ca-bundle\") pod \"9d88d97d-8539-47c3-a55f-deb71af2acbc\" (UID: \"9d88d97d-8539-47c3-a55f-deb71af2acbc\") " Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.034462 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk7f8\" (UniqueName: \"kubernetes.io/projected/9d88d97d-8539-47c3-a55f-deb71af2acbc-kube-api-access-hk7f8\") pod \"9d88d97d-8539-47c3-a55f-deb71af2acbc\" (UID: \"9d88d97d-8539-47c3-a55f-deb71af2acbc\") " Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.034492 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d88d97d-8539-47c3-a55f-deb71af2acbc-config-data\") pod \"9d88d97d-8539-47c3-a55f-deb71af2acbc\" (UID: \"9d88d97d-8539-47c3-a55f-deb71af2acbc\") " Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.039712 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d88d97d-8539-47c3-a55f-deb71af2acbc-kube-api-access-hk7f8" (OuterVolumeSpecName: "kube-api-access-hk7f8") pod "9d88d97d-8539-47c3-a55f-deb71af2acbc" (UID: "9d88d97d-8539-47c3-a55f-deb71af2acbc"). InnerVolumeSpecName "kube-api-access-hk7f8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.067937 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d88d97d-8539-47c3-a55f-deb71af2acbc-config-data" (OuterVolumeSpecName: "config-data") pod "9d88d97d-8539-47c3-a55f-deb71af2acbc" (UID: "9d88d97d-8539-47c3-a55f-deb71af2acbc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.076613 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d88d97d-8539-47c3-a55f-deb71af2acbc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d88d97d-8539-47c3-a55f-deb71af2acbc" (UID: "9d88d97d-8539-47c3-a55f-deb71af2acbc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.136667 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d88d97d-8539-47c3-a55f-deb71af2acbc-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.136710 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk7f8\" (UniqueName: \"kubernetes.io/projected/9d88d97d-8539-47c3-a55f-deb71af2acbc-kube-api-access-hk7f8\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.136728 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d88d97d-8539-47c3-a55f-deb71af2acbc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.457704 4755 generic.go:334] "Generic (PLEG): container finished" podID="9d88d97d-8539-47c3-a55f-deb71af2acbc" containerID="f3d6a392fb267cafd5af728538159d874d2301c62ecd7f1c80dcfc9de9eab371" 
exitCode=0 Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.457818 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9d88d97d-8539-47c3-a55f-deb71af2acbc","Type":"ContainerDied","Data":"f3d6a392fb267cafd5af728538159d874d2301c62ecd7f1c80dcfc9de9eab371"} Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.458208 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9d88d97d-8539-47c3-a55f-deb71af2acbc","Type":"ContainerDied","Data":"3f0a3e9524103dfe0f04b0524ad32d44ffff7040529775fca8e0f3668f959268"} Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.458274 4755 scope.go:117] "RemoveContainer" containerID="f3d6a392fb267cafd5af728538159d874d2301c62ecd7f1c80dcfc9de9eab371" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.457839 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.495165 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.497505 4755 scope.go:117] "RemoveContainer" containerID="f3d6a392fb267cafd5af728538159d874d2301c62ecd7f1c80dcfc9de9eab371" Mar 17 00:47:37 crc kubenswrapper[4755]: E0317 00:47:37.497965 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3d6a392fb267cafd5af728538159d874d2301c62ecd7f1c80dcfc9de9eab371\": container with ID starting with f3d6a392fb267cafd5af728538159d874d2301c62ecd7f1c80dcfc9de9eab371 not found: ID does not exist" containerID="f3d6a392fb267cafd5af728538159d874d2301c62ecd7f1c80dcfc9de9eab371" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.498007 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3d6a392fb267cafd5af728538159d874d2301c62ecd7f1c80dcfc9de9eab371"} 
err="failed to get container status \"f3d6a392fb267cafd5af728538159d874d2301c62ecd7f1c80dcfc9de9eab371\": rpc error: code = NotFound desc = could not find container \"f3d6a392fb267cafd5af728538159d874d2301c62ecd7f1c80dcfc9de9eab371\": container with ID starting with f3d6a392fb267cafd5af728538159d874d2301c62ecd7f1c80dcfc9de9eab371 not found: ID does not exist" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.509969 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.525581 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 00:47:37 crc kubenswrapper[4755]: E0317 00:47:37.526103 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ccacd0-c90a-4280-a907-a5b43b82744d" containerName="nova-manage" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.526120 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ccacd0-c90a-4280-a907-a5b43b82744d" containerName="nova-manage" Mar 17 00:47:37 crc kubenswrapper[4755]: E0317 00:47:37.526159 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d88d97d-8539-47c3-a55f-deb71af2acbc" containerName="nova-scheduler-scheduler" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.526166 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d88d97d-8539-47c3-a55f-deb71af2acbc" containerName="nova-scheduler-scheduler" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.526429 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="47ccacd0-c90a-4280-a907-a5b43b82744d" containerName="nova-manage" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.526482 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d88d97d-8539-47c3-a55f-deb71af2acbc" containerName="nova-scheduler-scheduler" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.528492 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.531103 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.552181 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.647385 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b40ded55-f4e9-48b7-b8a6-16cda16d1c09-config-data\") pod \"nova-scheduler-0\" (UID: \"b40ded55-f4e9-48b7-b8a6-16cda16d1c09\") " pod="openstack/nova-scheduler-0" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.647603 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r6q8\" (UniqueName: \"kubernetes.io/projected/b40ded55-f4e9-48b7-b8a6-16cda16d1c09-kube-api-access-9r6q8\") pod \"nova-scheduler-0\" (UID: \"b40ded55-f4e9-48b7-b8a6-16cda16d1c09\") " pod="openstack/nova-scheduler-0" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.647755 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b40ded55-f4e9-48b7-b8a6-16cda16d1c09-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b40ded55-f4e9-48b7-b8a6-16cda16d1c09\") " pod="openstack/nova-scheduler-0" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.749978 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b40ded55-f4e9-48b7-b8a6-16cda16d1c09-config-data\") pod \"nova-scheduler-0\" (UID: \"b40ded55-f4e9-48b7-b8a6-16cda16d1c09\") " pod="openstack/nova-scheduler-0" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.750269 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9r6q8\" (UniqueName: \"kubernetes.io/projected/b40ded55-f4e9-48b7-b8a6-16cda16d1c09-kube-api-access-9r6q8\") pod \"nova-scheduler-0\" (UID: \"b40ded55-f4e9-48b7-b8a6-16cda16d1c09\") " pod="openstack/nova-scheduler-0" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.750421 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b40ded55-f4e9-48b7-b8a6-16cda16d1c09-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b40ded55-f4e9-48b7-b8a6-16cda16d1c09\") " pod="openstack/nova-scheduler-0" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.756998 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b40ded55-f4e9-48b7-b8a6-16cda16d1c09-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b40ded55-f4e9-48b7-b8a6-16cda16d1c09\") " pod="openstack/nova-scheduler-0" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.764477 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b40ded55-f4e9-48b7-b8a6-16cda16d1c09-config-data\") pod \"nova-scheduler-0\" (UID: \"b40ded55-f4e9-48b7-b8a6-16cda16d1c09\") " pod="openstack/nova-scheduler-0" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.777504 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r6q8\" (UniqueName: \"kubernetes.io/projected/b40ded55-f4e9-48b7-b8a6-16cda16d1c09-kube-api-access-9r6q8\") pod \"nova-scheduler-0\" (UID: \"b40ded55-f4e9-48b7-b8a6-16cda16d1c09\") " pod="openstack/nova-scheduler-0" Mar 17 00:47:37 crc kubenswrapper[4755]: I0317 00:47:37.852995 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 00:47:38 crc kubenswrapper[4755]: I0317 00:47:38.265092 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d88d97d-8539-47c3-a55f-deb71af2acbc" path="/var/lib/kubelet/pods/9d88d97d-8539-47c3-a55f-deb71af2acbc/volumes" Mar 17 00:47:38 crc kubenswrapper[4755]: I0317 00:47:38.326216 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6rj6j" Mar 17 00:47:38 crc kubenswrapper[4755]: I0317 00:47:38.326268 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6rj6j" Mar 17 00:47:38 crc kubenswrapper[4755]: I0317 00:47:38.355685 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 00:47:38 crc kubenswrapper[4755]: I0317 00:47:38.401104 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6rj6j" Mar 17 00:47:38 crc kubenswrapper[4755]: I0317 00:47:38.472531 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b40ded55-f4e9-48b7-b8a6-16cda16d1c09","Type":"ContainerStarted","Data":"b14cb8af32f10a641034f9a03db1f7fc8b04a94dda347b54439abfb7eac2d816"} Mar 17 00:47:38 crc kubenswrapper[4755]: I0317 00:47:38.527547 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6rj6j" Mar 17 00:47:38 crc kubenswrapper[4755]: I0317 00:47:38.648622 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6rj6j"] Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.370722 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.485031 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11fa943d-490d-47c8-b14d-c250dca5c388-logs\") pod \"11fa943d-490d-47c8-b14d-c250dca5c388\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.485073 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-combined-ca-bundle\") pod \"11fa943d-490d-47c8-b14d-c250dca5c388\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.485129 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-config-data\") pod \"11fa943d-490d-47c8-b14d-c250dca5c388\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.485236 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5dgk\" (UniqueName: \"kubernetes.io/projected/11fa943d-490d-47c8-b14d-c250dca5c388-kube-api-access-f5dgk\") pod \"11fa943d-490d-47c8-b14d-c250dca5c388\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.485272 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-nova-metadata-tls-certs\") pod \"11fa943d-490d-47c8-b14d-c250dca5c388\" (UID: \"11fa943d-490d-47c8-b14d-c250dca5c388\") " Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.487914 4755 generic.go:334] "Generic (PLEG): container finished" podID="11fa943d-490d-47c8-b14d-c250dca5c388" 
containerID="e5d26b5554756150e39f0def907dffd1f964d42e31c0a7c82c5e70fca120a939" exitCode=0 Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.487930 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11fa943d-490d-47c8-b14d-c250dca5c388-logs" (OuterVolumeSpecName: "logs") pod "11fa943d-490d-47c8-b14d-c250dca5c388" (UID: "11fa943d-490d-47c8-b14d-c250dca5c388"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.487970 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11fa943d-490d-47c8-b14d-c250dca5c388","Type":"ContainerDied","Data":"e5d26b5554756150e39f0def907dffd1f964d42e31c0a7c82c5e70fca120a939"} Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.488014 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11fa943d-490d-47c8-b14d-c250dca5c388","Type":"ContainerDied","Data":"a7b0f80a000eb9188ffedf9f759c2181585bed93dcf9cee8053fa8a052462552"} Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.488032 4755 scope.go:117] "RemoveContainer" containerID="e5d26b5554756150e39f0def907dffd1f964d42e31c0a7c82c5e70fca120a939" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.488167 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.500023 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11fa943d-490d-47c8-b14d-c250dca5c388-kube-api-access-f5dgk" (OuterVolumeSpecName: "kube-api-access-f5dgk") pod "11fa943d-490d-47c8-b14d-c250dca5c388" (UID: "11fa943d-490d-47c8-b14d-c250dca5c388"). InnerVolumeSpecName "kube-api-access-f5dgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.502177 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b40ded55-f4e9-48b7-b8a6-16cda16d1c09","Type":"ContainerStarted","Data":"4bbef093036ed2843e3bfc0f4448341e3afbc53b0b3259d111067571b832e08d"} Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.521155 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.521137573 podStartE2EDuration="2.521137573s" podCreationTimestamp="2026-03-17 00:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:47:39.514753272 +0000 UTC m=+1534.274205555" watchObservedRunningTime="2026-03-17 00:47:39.521137573 +0000 UTC m=+1534.280589856" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.535536 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-config-data" (OuterVolumeSpecName: "config-data") pod "11fa943d-490d-47c8-b14d-c250dca5c388" (UID: "11fa943d-490d-47c8-b14d-c250dca5c388"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.551917 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11fa943d-490d-47c8-b14d-c250dca5c388" (UID: "11fa943d-490d-47c8-b14d-c250dca5c388"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.572506 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "11fa943d-490d-47c8-b14d-c250dca5c388" (UID: "11fa943d-490d-47c8-b14d-c250dca5c388"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.578556 4755 scope.go:117] "RemoveContainer" containerID="7f735076080a40139ff39b79a7ff6674f810f2aa6e25a9afa9972ce7b18f9525" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.593405 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5dgk\" (UniqueName: \"kubernetes.io/projected/11fa943d-490d-47c8-b14d-c250dca5c388-kube-api-access-f5dgk\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.593431 4755 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.593456 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11fa943d-490d-47c8-b14d-c250dca5c388-logs\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.593465 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.593492 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/11fa943d-490d-47c8-b14d-c250dca5c388-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.595810 4755 scope.go:117] "RemoveContainer" containerID="e5d26b5554756150e39f0def907dffd1f964d42e31c0a7c82c5e70fca120a939" Mar 17 00:47:39 crc kubenswrapper[4755]: E0317 00:47:39.596248 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5d26b5554756150e39f0def907dffd1f964d42e31c0a7c82c5e70fca120a939\": container with ID starting with e5d26b5554756150e39f0def907dffd1f964d42e31c0a7c82c5e70fca120a939 not found: ID does not exist" containerID="e5d26b5554756150e39f0def907dffd1f964d42e31c0a7c82c5e70fca120a939" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.596274 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5d26b5554756150e39f0def907dffd1f964d42e31c0a7c82c5e70fca120a939"} err="failed to get container status \"e5d26b5554756150e39f0def907dffd1f964d42e31c0a7c82c5e70fca120a939\": rpc error: code = NotFound desc = could not find container \"e5d26b5554756150e39f0def907dffd1f964d42e31c0a7c82c5e70fca120a939\": container with ID starting with e5d26b5554756150e39f0def907dffd1f964d42e31c0a7c82c5e70fca120a939 not found: ID does not exist" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.596293 4755 scope.go:117] "RemoveContainer" containerID="7f735076080a40139ff39b79a7ff6674f810f2aa6e25a9afa9972ce7b18f9525" Mar 17 00:47:39 crc kubenswrapper[4755]: E0317 00:47:39.596654 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f735076080a40139ff39b79a7ff6674f810f2aa6e25a9afa9972ce7b18f9525\": container with ID starting with 7f735076080a40139ff39b79a7ff6674f810f2aa6e25a9afa9972ce7b18f9525 not found: ID does not exist" containerID="7f735076080a40139ff39b79a7ff6674f810f2aa6e25a9afa9972ce7b18f9525" Mar 17 
00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.596691 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f735076080a40139ff39b79a7ff6674f810f2aa6e25a9afa9972ce7b18f9525"} err="failed to get container status \"7f735076080a40139ff39b79a7ff6674f810f2aa6e25a9afa9972ce7b18f9525\": rpc error: code = NotFound desc = could not find container \"7f735076080a40139ff39b79a7ff6674f810f2aa6e25a9afa9972ce7b18f9525\": container with ID starting with 7f735076080a40139ff39b79a7ff6674f810f2aa6e25a9afa9972ce7b18f9525 not found: ID does not exist" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.828773 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.846972 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.861854 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:47:39 crc kubenswrapper[4755]: E0317 00:47:39.862329 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11fa943d-490d-47c8-b14d-c250dca5c388" containerName="nova-metadata-log" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.862345 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="11fa943d-490d-47c8-b14d-c250dca5c388" containerName="nova-metadata-log" Mar 17 00:47:39 crc kubenswrapper[4755]: E0317 00:47:39.862370 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11fa943d-490d-47c8-b14d-c250dca5c388" containerName="nova-metadata-metadata" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.862375 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="11fa943d-490d-47c8-b14d-c250dca5c388" containerName="nova-metadata-metadata" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.862600 4755 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="11fa943d-490d-47c8-b14d-c250dca5c388" containerName="nova-metadata-metadata" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.862628 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="11fa943d-490d-47c8-b14d-c250dca5c388" containerName="nova-metadata-log" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.863720 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.867024 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.867607 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 17 00:47:39 crc kubenswrapper[4755]: I0317 00:47:39.870944 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.001125 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2b187e-bb8e-4934-a004-532ea37d2cf2-logs\") pod \"nova-metadata-0\" (UID: \"6b2b187e-bb8e-4934-a004-532ea37d2cf2\") " pod="openstack/nova-metadata-0" Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.001181 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2b187e-bb8e-4934-a004-532ea37d2cf2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6b2b187e-bb8e-4934-a004-532ea37d2cf2\") " pod="openstack/nova-metadata-0" Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.001206 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrhxd\" (UniqueName: \"kubernetes.io/projected/6b2b187e-bb8e-4934-a004-532ea37d2cf2-kube-api-access-mrhxd\") 
pod \"nova-metadata-0\" (UID: \"6b2b187e-bb8e-4934-a004-532ea37d2cf2\") " pod="openstack/nova-metadata-0" Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.001235 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2b187e-bb8e-4934-a004-532ea37d2cf2-config-data\") pod \"nova-metadata-0\" (UID: \"6b2b187e-bb8e-4934-a004-532ea37d2cf2\") " pod="openstack/nova-metadata-0" Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.001460 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2b187e-bb8e-4934-a004-532ea37d2cf2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6b2b187e-bb8e-4934-a004-532ea37d2cf2\") " pod="openstack/nova-metadata-0" Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.103720 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2b187e-bb8e-4934-a004-532ea37d2cf2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6b2b187e-bb8e-4934-a004-532ea37d2cf2\") " pod="openstack/nova-metadata-0" Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.103800 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrhxd\" (UniqueName: \"kubernetes.io/projected/6b2b187e-bb8e-4934-a004-532ea37d2cf2-kube-api-access-mrhxd\") pod \"nova-metadata-0\" (UID: \"6b2b187e-bb8e-4934-a004-532ea37d2cf2\") " pod="openstack/nova-metadata-0" Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.103859 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2b187e-bb8e-4934-a004-532ea37d2cf2-config-data\") pod \"nova-metadata-0\" (UID: \"6b2b187e-bb8e-4934-a004-532ea37d2cf2\") " pod="openstack/nova-metadata-0" Mar 17 00:47:40 crc 
kubenswrapper[4755]: I0317 00:47:40.103975 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2b187e-bb8e-4934-a004-532ea37d2cf2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6b2b187e-bb8e-4934-a004-532ea37d2cf2\") " pod="openstack/nova-metadata-0" Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.104224 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2b187e-bb8e-4934-a004-532ea37d2cf2-logs\") pod \"nova-metadata-0\" (UID: \"6b2b187e-bb8e-4934-a004-532ea37d2cf2\") " pod="openstack/nova-metadata-0" Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.104633 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2b187e-bb8e-4934-a004-532ea37d2cf2-logs\") pod \"nova-metadata-0\" (UID: \"6b2b187e-bb8e-4934-a004-532ea37d2cf2\") " pod="openstack/nova-metadata-0" Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.108784 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2b187e-bb8e-4934-a004-532ea37d2cf2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6b2b187e-bb8e-4934-a004-532ea37d2cf2\") " pod="openstack/nova-metadata-0" Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.110381 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2b187e-bb8e-4934-a004-532ea37d2cf2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6b2b187e-bb8e-4934-a004-532ea37d2cf2\") " pod="openstack/nova-metadata-0" Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.112071 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6b2b187e-bb8e-4934-a004-532ea37d2cf2-config-data\") pod \"nova-metadata-0\" (UID: \"6b2b187e-bb8e-4934-a004-532ea37d2cf2\") " pod="openstack/nova-metadata-0" Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.127506 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrhxd\" (UniqueName: \"kubernetes.io/projected/6b2b187e-bb8e-4934-a004-532ea37d2cf2-kube-api-access-mrhxd\") pod \"nova-metadata-0\" (UID: \"6b2b187e-bb8e-4934-a004-532ea37d2cf2\") " pod="openstack/nova-metadata-0" Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.243275 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.273071 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11fa943d-490d-47c8-b14d-c250dca5c388" path="/var/lib/kubelet/pods/11fa943d-490d-47c8-b14d-c250dca5c388/volumes" Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.519849 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6rj6j" podUID="ef714ae3-d651-4583-871a-05c45a82aff7" containerName="registry-server" containerID="cri-o://0c43f2b1169e21e8920e9c8f3fabc942ebd399e369ddb7a1964183c60c98ee12" gracePeriod=2 Mar 17 00:47:40 crc kubenswrapper[4755]: I0317 00:47:40.772113 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 00:47:40 crc kubenswrapper[4755]: W0317 00:47:40.776938 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b2b187e_bb8e_4934_a004_532ea37d2cf2.slice/crio-6317a1c6ef777689181bec7e26a3d6319da0bdfe7fb683b9d25e29be1fd56718 WatchSource:0}: Error finding container 6317a1c6ef777689181bec7e26a3d6319da0bdfe7fb683b9d25e29be1fd56718: Status 404 returned error can't find the container with id 
6317a1c6ef777689181bec7e26a3d6319da0bdfe7fb683b9d25e29be1fd56718 Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.333137 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6rj6j" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.450355 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsz7q\" (UniqueName: \"kubernetes.io/projected/ef714ae3-d651-4583-871a-05c45a82aff7-kube-api-access-bsz7q\") pod \"ef714ae3-d651-4583-871a-05c45a82aff7\" (UID: \"ef714ae3-d651-4583-871a-05c45a82aff7\") " Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.450489 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef714ae3-d651-4583-871a-05c45a82aff7-utilities\") pod \"ef714ae3-d651-4583-871a-05c45a82aff7\" (UID: \"ef714ae3-d651-4583-871a-05c45a82aff7\") " Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.450571 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef714ae3-d651-4583-871a-05c45a82aff7-catalog-content\") pod \"ef714ae3-d651-4583-871a-05c45a82aff7\" (UID: \"ef714ae3-d651-4583-871a-05c45a82aff7\") " Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.453966 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef714ae3-d651-4583-871a-05c45a82aff7-utilities" (OuterVolumeSpecName: "utilities") pod "ef714ae3-d651-4583-871a-05c45a82aff7" (UID: "ef714ae3-d651-4583-871a-05c45a82aff7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.457789 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef714ae3-d651-4583-871a-05c45a82aff7-kube-api-access-bsz7q" (OuterVolumeSpecName: "kube-api-access-bsz7q") pod "ef714ae3-d651-4583-871a-05c45a82aff7" (UID: "ef714ae3-d651-4583-871a-05c45a82aff7"). InnerVolumeSpecName "kube-api-access-bsz7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.512340 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef714ae3-d651-4583-871a-05c45a82aff7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef714ae3-d651-4583-871a-05c45a82aff7" (UID: "ef714ae3-d651-4583-871a-05c45a82aff7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.522612 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.533712 4755 generic.go:334] "Generic (PLEG): container finished" podID="ef714ae3-d651-4583-871a-05c45a82aff7" containerID="0c43f2b1169e21e8920e9c8f3fabc942ebd399e369ddb7a1964183c60c98ee12" exitCode=0 Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.533762 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rj6j" event={"ID":"ef714ae3-d651-4583-871a-05c45a82aff7","Type":"ContainerDied","Data":"0c43f2b1169e21e8920e9c8f3fabc942ebd399e369ddb7a1964183c60c98ee12"} Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.533788 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rj6j" event={"ID":"ef714ae3-d651-4583-871a-05c45a82aff7","Type":"ContainerDied","Data":"0931e69c22e5ae5e0692713703f71d4051fd68ddea40c94f33d543653aaa11c5"} Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.533803 4755 scope.go:117] "RemoveContainer" containerID="0c43f2b1169e21e8920e9c8f3fabc942ebd399e369ddb7a1964183c60c98ee12" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.533903 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6rj6j" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.553493 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef714ae3-d651-4583-871a-05c45a82aff7-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.553548 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef714ae3-d651-4583-871a-05c45a82aff7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.553564 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsz7q\" (UniqueName: \"kubernetes.io/projected/ef714ae3-d651-4583-871a-05c45a82aff7-kube-api-access-bsz7q\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.556978 4755 generic.go:334] "Generic (PLEG): container finished" podID="8e152b96-8219-4f65-bc13-81007beb33b6" containerID="dc909fdd1f89b47bfee30d2136a956e1ee4c7bd759734cce0e61bd43ba0e9435" exitCode=0 Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.557083 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8e152b96-8219-4f65-bc13-81007beb33b6","Type":"ContainerDied","Data":"dc909fdd1f89b47bfee30d2136a956e1ee4c7bd759734cce0e61bd43ba0e9435"} Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.557115 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8e152b96-8219-4f65-bc13-81007beb33b6","Type":"ContainerDied","Data":"6da56eeb1502e52abb3f1972c14e5f4959597fe391a67ede9d0a21f67eb8dca1"} Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.557190 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.561744 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6b2b187e-bb8e-4934-a004-532ea37d2cf2","Type":"ContainerStarted","Data":"81a17ef2ad6f84de8a296ad8aefbed4ef396d2a89f4211ba18a0d1d44d50dd1e"} Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.561798 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6b2b187e-bb8e-4934-a004-532ea37d2cf2","Type":"ContainerStarted","Data":"fbbb800dd65cd0fafc237620f96f6f7d538cbc026cf7fa8b84402dddde0a9660"} Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.561811 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6b2b187e-bb8e-4934-a004-532ea37d2cf2","Type":"ContainerStarted","Data":"6317a1c6ef777689181bec7e26a3d6319da0bdfe7fb683b9d25e29be1fd56718"} Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.584667 4755 scope.go:117] "RemoveContainer" containerID="1f1a0d750acf78dec898f1bb81cbeacf0882d419a7801ca10975d9fee6aeedfd" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.587575 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6rj6j"] Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.598002 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6rj6j"] Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.608123 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.6080908259999998 podStartE2EDuration="2.608090826s" podCreationTimestamp="2026-03-17 00:47:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:47:41.590346043 +0000 UTC m=+1536.349798316" watchObservedRunningTime="2026-03-17 
00:47:41.608090826 +0000 UTC m=+1536.367543109" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.611513 4755 scope.go:117] "RemoveContainer" containerID="c6cb5c070406023f635d2fb949a6889a2b0a77caecb0d46f91a58ab61ae6f2e8" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.633969 4755 scope.go:117] "RemoveContainer" containerID="0c43f2b1169e21e8920e9c8f3fabc942ebd399e369ddb7a1964183c60c98ee12" Mar 17 00:47:41 crc kubenswrapper[4755]: E0317 00:47:41.634363 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c43f2b1169e21e8920e9c8f3fabc942ebd399e369ddb7a1964183c60c98ee12\": container with ID starting with 0c43f2b1169e21e8920e9c8f3fabc942ebd399e369ddb7a1964183c60c98ee12 not found: ID does not exist" containerID="0c43f2b1169e21e8920e9c8f3fabc942ebd399e369ddb7a1964183c60c98ee12" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.634392 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c43f2b1169e21e8920e9c8f3fabc942ebd399e369ddb7a1964183c60c98ee12"} err="failed to get container status \"0c43f2b1169e21e8920e9c8f3fabc942ebd399e369ddb7a1964183c60c98ee12\": rpc error: code = NotFound desc = could not find container \"0c43f2b1169e21e8920e9c8f3fabc942ebd399e369ddb7a1964183c60c98ee12\": container with ID starting with 0c43f2b1169e21e8920e9c8f3fabc942ebd399e369ddb7a1964183c60c98ee12 not found: ID does not exist" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.634412 4755 scope.go:117] "RemoveContainer" containerID="1f1a0d750acf78dec898f1bb81cbeacf0882d419a7801ca10975d9fee6aeedfd" Mar 17 00:47:41 crc kubenswrapper[4755]: E0317 00:47:41.634654 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f1a0d750acf78dec898f1bb81cbeacf0882d419a7801ca10975d9fee6aeedfd\": container with ID starting with 1f1a0d750acf78dec898f1bb81cbeacf0882d419a7801ca10975d9fee6aeedfd 
not found: ID does not exist" containerID="1f1a0d750acf78dec898f1bb81cbeacf0882d419a7801ca10975d9fee6aeedfd" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.634675 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f1a0d750acf78dec898f1bb81cbeacf0882d419a7801ca10975d9fee6aeedfd"} err="failed to get container status \"1f1a0d750acf78dec898f1bb81cbeacf0882d419a7801ca10975d9fee6aeedfd\": rpc error: code = NotFound desc = could not find container \"1f1a0d750acf78dec898f1bb81cbeacf0882d419a7801ca10975d9fee6aeedfd\": container with ID starting with 1f1a0d750acf78dec898f1bb81cbeacf0882d419a7801ca10975d9fee6aeedfd not found: ID does not exist" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.634688 4755 scope.go:117] "RemoveContainer" containerID="c6cb5c070406023f635d2fb949a6889a2b0a77caecb0d46f91a58ab61ae6f2e8" Mar 17 00:47:41 crc kubenswrapper[4755]: E0317 00:47:41.635280 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6cb5c070406023f635d2fb949a6889a2b0a77caecb0d46f91a58ab61ae6f2e8\": container with ID starting with c6cb5c070406023f635d2fb949a6889a2b0a77caecb0d46f91a58ab61ae6f2e8 not found: ID does not exist" containerID="c6cb5c070406023f635d2fb949a6889a2b0a77caecb0d46f91a58ab61ae6f2e8" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.635309 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6cb5c070406023f635d2fb949a6889a2b0a77caecb0d46f91a58ab61ae6f2e8"} err="failed to get container status \"c6cb5c070406023f635d2fb949a6889a2b0a77caecb0d46f91a58ab61ae6f2e8\": rpc error: code = NotFound desc = could not find container \"c6cb5c070406023f635d2fb949a6889a2b0a77caecb0d46f91a58ab61ae6f2e8\": container with ID starting with c6cb5c070406023f635d2fb949a6889a2b0a77caecb0d46f91a58ab61ae6f2e8 not found: ID does not exist" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 
00:47:41.635345 4755 scope.go:117] "RemoveContainer" containerID="dc909fdd1f89b47bfee30d2136a956e1ee4c7bd759734cce0e61bd43ba0e9435" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.654177 4755 scope.go:117] "RemoveContainer" containerID="2899d12604d5dd6b36c02d7eabf25455060e2bde1f210e49aa1f0f2ad2d15699" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.654966 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs2sk\" (UniqueName: \"kubernetes.io/projected/8e152b96-8219-4f65-bc13-81007beb33b6-kube-api-access-vs2sk\") pod \"8e152b96-8219-4f65-bc13-81007beb33b6\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.655113 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e152b96-8219-4f65-bc13-81007beb33b6-logs\") pod \"8e152b96-8219-4f65-bc13-81007beb33b6\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.655364 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-config-data\") pod \"8e152b96-8219-4f65-bc13-81007beb33b6\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.655562 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-combined-ca-bundle\") pod \"8e152b96-8219-4f65-bc13-81007beb33b6\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.655589 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-internal-tls-certs\") pod 
\"8e152b96-8219-4f65-bc13-81007beb33b6\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.655934 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-public-tls-certs\") pod \"8e152b96-8219-4f65-bc13-81007beb33b6\" (UID: \"8e152b96-8219-4f65-bc13-81007beb33b6\") " Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.655644 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e152b96-8219-4f65-bc13-81007beb33b6-logs" (OuterVolumeSpecName: "logs") pod "8e152b96-8219-4f65-bc13-81007beb33b6" (UID: "8e152b96-8219-4f65-bc13-81007beb33b6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.656563 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e152b96-8219-4f65-bc13-81007beb33b6-logs\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.658585 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e152b96-8219-4f65-bc13-81007beb33b6-kube-api-access-vs2sk" (OuterVolumeSpecName: "kube-api-access-vs2sk") pod "8e152b96-8219-4f65-bc13-81007beb33b6" (UID: "8e152b96-8219-4f65-bc13-81007beb33b6"). InnerVolumeSpecName "kube-api-access-vs2sk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.675868 4755 scope.go:117] "RemoveContainer" containerID="dc909fdd1f89b47bfee30d2136a956e1ee4c7bd759734cce0e61bd43ba0e9435" Mar 17 00:47:41 crc kubenswrapper[4755]: E0317 00:47:41.678970 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc909fdd1f89b47bfee30d2136a956e1ee4c7bd759734cce0e61bd43ba0e9435\": container with ID starting with dc909fdd1f89b47bfee30d2136a956e1ee4c7bd759734cce0e61bd43ba0e9435 not found: ID does not exist" containerID="dc909fdd1f89b47bfee30d2136a956e1ee4c7bd759734cce0e61bd43ba0e9435" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.679009 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc909fdd1f89b47bfee30d2136a956e1ee4c7bd759734cce0e61bd43ba0e9435"} err="failed to get container status \"dc909fdd1f89b47bfee30d2136a956e1ee4c7bd759734cce0e61bd43ba0e9435\": rpc error: code = NotFound desc = could not find container \"dc909fdd1f89b47bfee30d2136a956e1ee4c7bd759734cce0e61bd43ba0e9435\": container with ID starting with dc909fdd1f89b47bfee30d2136a956e1ee4c7bd759734cce0e61bd43ba0e9435 not found: ID does not exist" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.679036 4755 scope.go:117] "RemoveContainer" containerID="2899d12604d5dd6b36c02d7eabf25455060e2bde1f210e49aa1f0f2ad2d15699" Mar 17 00:47:41 crc kubenswrapper[4755]: E0317 00:47:41.679332 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2899d12604d5dd6b36c02d7eabf25455060e2bde1f210e49aa1f0f2ad2d15699\": container with ID starting with 2899d12604d5dd6b36c02d7eabf25455060e2bde1f210e49aa1f0f2ad2d15699 not found: ID does not exist" containerID="2899d12604d5dd6b36c02d7eabf25455060e2bde1f210e49aa1f0f2ad2d15699" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.679353 
4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2899d12604d5dd6b36c02d7eabf25455060e2bde1f210e49aa1f0f2ad2d15699"} err="failed to get container status \"2899d12604d5dd6b36c02d7eabf25455060e2bde1f210e49aa1f0f2ad2d15699\": rpc error: code = NotFound desc = could not find container \"2899d12604d5dd6b36c02d7eabf25455060e2bde1f210e49aa1f0f2ad2d15699\": container with ID starting with 2899d12604d5dd6b36c02d7eabf25455060e2bde1f210e49aa1f0f2ad2d15699 not found: ID does not exist" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.682565 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-config-data" (OuterVolumeSpecName: "config-data") pod "8e152b96-8219-4f65-bc13-81007beb33b6" (UID: "8e152b96-8219-4f65-bc13-81007beb33b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.688527 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e152b96-8219-4f65-bc13-81007beb33b6" (UID: "8e152b96-8219-4f65-bc13-81007beb33b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.707743 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8e152b96-8219-4f65-bc13-81007beb33b6" (UID: "8e152b96-8219-4f65-bc13-81007beb33b6"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.708542 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8e152b96-8219-4f65-bc13-81007beb33b6" (UID: "8e152b96-8219-4f65-bc13-81007beb33b6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.758128 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.758164 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.758181 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.758193 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e152b96-8219-4f65-bc13-81007beb33b6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.758208 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs2sk\" (UniqueName: \"kubernetes.io/projected/8e152b96-8219-4f65-bc13-81007beb33b6-kube-api-access-vs2sk\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.899416 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] 
Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.916661 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.927343 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 17 00:47:41 crc kubenswrapper[4755]: E0317 00:47:41.927925 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e152b96-8219-4f65-bc13-81007beb33b6" containerName="nova-api-log" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.927947 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e152b96-8219-4f65-bc13-81007beb33b6" containerName="nova-api-log" Mar 17 00:47:41 crc kubenswrapper[4755]: E0317 00:47:41.927961 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e152b96-8219-4f65-bc13-81007beb33b6" containerName="nova-api-api" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.927968 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e152b96-8219-4f65-bc13-81007beb33b6" containerName="nova-api-api" Mar 17 00:47:41 crc kubenswrapper[4755]: E0317 00:47:41.927981 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef714ae3-d651-4583-871a-05c45a82aff7" containerName="extract-content" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.927990 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef714ae3-d651-4583-871a-05c45a82aff7" containerName="extract-content" Mar 17 00:47:41 crc kubenswrapper[4755]: E0317 00:47:41.928006 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef714ae3-d651-4583-871a-05c45a82aff7" containerName="extract-utilities" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.928013 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef714ae3-d651-4583-871a-05c45a82aff7" containerName="extract-utilities" Mar 17 00:47:41 crc kubenswrapper[4755]: E0317 00:47:41.928047 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ef714ae3-d651-4583-871a-05c45a82aff7" containerName="registry-server" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.928055 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef714ae3-d651-4583-871a-05c45a82aff7" containerName="registry-server" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.928321 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e152b96-8219-4f65-bc13-81007beb33b6" containerName="nova-api-log" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.928347 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef714ae3-d651-4583-871a-05c45a82aff7" containerName="registry-server" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.928358 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e152b96-8219-4f65-bc13-81007beb33b6" containerName="nova-api-api" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.929642 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.949647 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.950142 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.950466 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 17 00:47:41 crc kubenswrapper[4755]: I0317 00:47:41.965653 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.067648 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-logs\") pod \"nova-api-0\" (UID: 
\"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.067735 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2pdv\" (UniqueName: \"kubernetes.io/projected/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-kube-api-access-p2pdv\") pod \"nova-api-0\" (UID: \"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.067981 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-public-tls-certs\") pod \"nova-api-0\" (UID: \"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.068069 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-config-data\") pod \"nova-api-0\" (UID: \"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.068314 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.068409 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 
00:47:42.170140 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-public-tls-certs\") pod \"nova-api-0\" (UID: \"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.170228 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-config-data\") pod \"nova-api-0\" (UID: \"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.170272 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.170308 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.170504 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-logs\") pod \"nova-api-0\" (UID: \"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.170567 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2pdv\" (UniqueName: \"kubernetes.io/projected/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-kube-api-access-p2pdv\") pod \"nova-api-0\" (UID: 
\"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.173426 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-logs\") pod \"nova-api-0\" (UID: \"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.179517 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-config-data\") pod \"nova-api-0\" (UID: \"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.179537 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-public-tls-certs\") pod \"nova-api-0\" (UID: \"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.182318 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.184259 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.196695 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2pdv\" (UniqueName: 
\"kubernetes.io/projected/c316d4cb-fdc3-45e6-b679-14a04b2b32c1-kube-api-access-p2pdv\") pod \"nova-api-0\" (UID: \"c316d4cb-fdc3-45e6-b679-14a04b2b32c1\") " pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.276860 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e152b96-8219-4f65-bc13-81007beb33b6" path="/var/lib/kubelet/pods/8e152b96-8219-4f65-bc13-81007beb33b6/volumes" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.280366 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef714ae3-d651-4583-871a-05c45a82aff7" path="/var/lib/kubelet/pods/ef714ae3-d651-4583-871a-05c45a82aff7/volumes" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.298749 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.807101 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 00:47:42 crc kubenswrapper[4755]: W0317 00:47:42.808071 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc316d4cb_fdc3_45e6_b679_14a04b2b32c1.slice/crio-c7fdbe67c6289cfc85bbffd4f35d6c047b2804a9f1384933b30f0c0acca3f8f4 WatchSource:0}: Error finding container c7fdbe67c6289cfc85bbffd4f35d6c047b2804a9f1384933b30f0c0acca3f8f4: Status 404 returned error can't find the container with id c7fdbe67c6289cfc85bbffd4f35d6c047b2804a9f1384933b30f0c0acca3f8f4 Mar 17 00:47:42 crc kubenswrapper[4755]: I0317 00:47:42.853469 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 17 00:47:43 crc kubenswrapper[4755]: I0317 00:47:43.589929 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c316d4cb-fdc3-45e6-b679-14a04b2b32c1","Type":"ContainerStarted","Data":"50cfc3f00bfd402db74b232882f21a7c4c7dd15629c9a7103baa5573d54ba175"} 
Mar 17 00:47:43 crc kubenswrapper[4755]: I0317 00:47:43.590201 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c316d4cb-fdc3-45e6-b679-14a04b2b32c1","Type":"ContainerStarted","Data":"ca31b0a02a8633333ef60bfe3cb4d1ffba2fcf18a6bee37fb599889f1e8f9f93"} Mar 17 00:47:43 crc kubenswrapper[4755]: I0317 00:47:43.590219 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c316d4cb-fdc3-45e6-b679-14a04b2b32c1","Type":"ContainerStarted","Data":"c7fdbe67c6289cfc85bbffd4f35d6c047b2804a9f1384933b30f0c0acca3f8f4"} Mar 17 00:47:43 crc kubenswrapper[4755]: I0317 00:47:43.614746 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.614729962 podStartE2EDuration="2.614729962s" podCreationTimestamp="2026-03-17 00:47:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:47:43.614288031 +0000 UTC m=+1538.373740334" watchObservedRunningTime="2026-03-17 00:47:43.614729962 +0000 UTC m=+1538.374182245" Mar 17 00:47:47 crc kubenswrapper[4755]: I0317 00:47:47.853258 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 17 00:47:47 crc kubenswrapper[4755]: I0317 00:47:47.903322 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 17 00:47:48 crc kubenswrapper[4755]: I0317 00:47:48.718987 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 17 00:47:50 crc kubenswrapper[4755]: I0317 00:47:50.244285 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 17 00:47:50 crc kubenswrapper[4755]: I0317 00:47:50.245474 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" 
Mar 17 00:47:51 crc kubenswrapper[4755]: I0317 00:47:51.258687 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6b2b187e-bb8e-4934-a004-532ea37d2cf2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.251:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 00:47:51 crc kubenswrapper[4755]: I0317 00:47:51.259675 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6b2b187e-bb8e-4934-a004-532ea37d2cf2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.251:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 00:47:52 crc kubenswrapper[4755]: I0317 00:47:52.299185 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 00:47:52 crc kubenswrapper[4755]: I0317 00:47:52.299624 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 00:47:53 crc kubenswrapper[4755]: I0317 00:47:53.313656 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c316d4cb-fdc3-45e6-b679-14a04b2b32c1" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.252:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 00:47:53 crc kubenswrapper[4755]: I0317 00:47:53.313675 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c316d4cb-fdc3-45e6-b679-14a04b2b32c1" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.252:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 00:47:54 crc kubenswrapper[4755]: I0317 00:47:54.757929 4755 generic.go:334] "Generic (PLEG): container finished" podID="72a40ce1-242d-4422-936b-9b867c51ee69" 
containerID="2ced09580f45a693b419eeafb685780d8f4153cc4b84ea27eef328853d9ee1f0" exitCode=137 Mar 17 00:47:54 crc kubenswrapper[4755]: I0317 00:47:54.758002 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"72a40ce1-242d-4422-936b-9b867c51ee69","Type":"ContainerDied","Data":"2ced09580f45a693b419eeafb685780d8f4153cc4b84ea27eef328853d9ee1f0"} Mar 17 00:47:54 crc kubenswrapper[4755]: I0317 00:47:54.758685 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"72a40ce1-242d-4422-936b-9b867c51ee69","Type":"ContainerDied","Data":"56eee3693547c074ab712daeca0600e0cbd5e9387732a1f55506a59635e103b9"} Mar 17 00:47:54 crc kubenswrapper[4755]: I0317 00:47:54.758711 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56eee3693547c074ab712daeca0600e0cbd5e9387732a1f55506a59635e103b9" Mar 17 00:47:54 crc kubenswrapper[4755]: I0317 00:47:54.784741 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 17 00:47:54 crc kubenswrapper[4755]: I0317 00:47:54.915137 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-combined-ca-bundle\") pod \"72a40ce1-242d-4422-936b-9b867c51ee69\" (UID: \"72a40ce1-242d-4422-936b-9b867c51ee69\") " Mar 17 00:47:54 crc kubenswrapper[4755]: I0317 00:47:54.915211 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-config-data\") pod \"72a40ce1-242d-4422-936b-9b867c51ee69\" (UID: \"72a40ce1-242d-4422-936b-9b867c51ee69\") " Mar 17 00:47:54 crc kubenswrapper[4755]: I0317 00:47:54.915257 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-scripts\") pod \"72a40ce1-242d-4422-936b-9b867c51ee69\" (UID: \"72a40ce1-242d-4422-936b-9b867c51ee69\") " Mar 17 00:47:54 crc kubenswrapper[4755]: I0317 00:47:54.915288 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2wdv\" (UniqueName: \"kubernetes.io/projected/72a40ce1-242d-4422-936b-9b867c51ee69-kube-api-access-j2wdv\") pod \"72a40ce1-242d-4422-936b-9b867c51ee69\" (UID: \"72a40ce1-242d-4422-936b-9b867c51ee69\") " Mar 17 00:47:54 crc kubenswrapper[4755]: I0317 00:47:54.922446 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a40ce1-242d-4422-936b-9b867c51ee69-kube-api-access-j2wdv" (OuterVolumeSpecName: "kube-api-access-j2wdv") pod "72a40ce1-242d-4422-936b-9b867c51ee69" (UID: "72a40ce1-242d-4422-936b-9b867c51ee69"). InnerVolumeSpecName "kube-api-access-j2wdv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:47:54 crc kubenswrapper[4755]: I0317 00:47:54.931679 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-scripts" (OuterVolumeSpecName: "scripts") pod "72a40ce1-242d-4422-936b-9b867c51ee69" (UID: "72a40ce1-242d-4422-936b-9b867c51ee69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.019206 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.019257 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2wdv\" (UniqueName: \"kubernetes.io/projected/72a40ce1-242d-4422-936b-9b867c51ee69-kube-api-access-j2wdv\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.040715 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-config-data" (OuterVolumeSpecName: "config-data") pod "72a40ce1-242d-4422-936b-9b867c51ee69" (UID: "72a40ce1-242d-4422-936b-9b867c51ee69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.051312 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72a40ce1-242d-4422-936b-9b867c51ee69" (UID: "72a40ce1-242d-4422-936b-9b867c51ee69"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.121714 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.121941 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a40ce1-242d-4422-936b-9b867c51ee69-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.769547 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.821212 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.833464 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.847306 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 17 00:47:55 crc kubenswrapper[4755]: E0317 00:47:55.847808 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a40ce1-242d-4422-936b-9b867c51ee69" containerName="aodh-evaluator" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.847826 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a40ce1-242d-4422-936b-9b867c51ee69" containerName="aodh-evaluator" Mar 17 00:47:55 crc kubenswrapper[4755]: E0317 00:47:55.847839 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a40ce1-242d-4422-936b-9b867c51ee69" containerName="aodh-api" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.847846 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a40ce1-242d-4422-936b-9b867c51ee69" containerName="aodh-api" Mar 17 00:47:55 crc kubenswrapper[4755]: 
E0317 00:47:55.847858 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a40ce1-242d-4422-936b-9b867c51ee69" containerName="aodh-listener" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.847864 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a40ce1-242d-4422-936b-9b867c51ee69" containerName="aodh-listener" Mar 17 00:47:55 crc kubenswrapper[4755]: E0317 00:47:55.847898 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a40ce1-242d-4422-936b-9b867c51ee69" containerName="aodh-notifier" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.847904 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a40ce1-242d-4422-936b-9b867c51ee69" containerName="aodh-notifier" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.848091 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a40ce1-242d-4422-936b-9b867c51ee69" containerName="aodh-listener" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.848110 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a40ce1-242d-4422-936b-9b867c51ee69" containerName="aodh-notifier" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.848124 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a40ce1-242d-4422-936b-9b867c51ee69" containerName="aodh-api" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.848135 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a40ce1-242d-4422-936b-9b867c51ee69" containerName="aodh-evaluator" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.849944 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.858758 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.859068 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-5g9jb" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.859232 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.859267 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.859424 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.888688 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.941876 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.941955 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-public-tls-certs\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.942207 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdgrb\" (UniqueName: 
\"kubernetes.io/projected/ef911263-adb5-4a18-b726-888ec33cb66a-kube-api-access-rdgrb\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.942327 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-internal-tls-certs\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.942396 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-config-data\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:55 crc kubenswrapper[4755]: I0317 00:47:55.942492 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-scripts\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:56 crc kubenswrapper[4755]: I0317 00:47:56.044590 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-public-tls-certs\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:56 crc kubenswrapper[4755]: I0317 00:47:56.044949 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdgrb\" (UniqueName: \"kubernetes.io/projected/ef911263-adb5-4a18-b726-888ec33cb66a-kube-api-access-rdgrb\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:56 crc kubenswrapper[4755]: I0317 
00:47:56.045000 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-internal-tls-certs\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:56 crc kubenswrapper[4755]: I0317 00:47:56.045047 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-config-data\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:56 crc kubenswrapper[4755]: I0317 00:47:56.045080 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-scripts\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:56 crc kubenswrapper[4755]: I0317 00:47:56.045132 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:56 crc kubenswrapper[4755]: I0317 00:47:56.051814 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-public-tls-certs\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:56 crc kubenswrapper[4755]: I0317 00:47:56.052531 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-config-data\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:56 
crc kubenswrapper[4755]: I0317 00:47:56.053181 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:56 crc kubenswrapper[4755]: I0317 00:47:56.055264 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-internal-tls-certs\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:56 crc kubenswrapper[4755]: I0317 00:47:56.064391 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-scripts\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:56 crc kubenswrapper[4755]: I0317 00:47:56.081638 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdgrb\" (UniqueName: \"kubernetes.io/projected/ef911263-adb5-4a18-b726-888ec33cb66a-kube-api-access-rdgrb\") pod \"aodh-0\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " pod="openstack/aodh-0" Mar 17 00:47:56 crc kubenswrapper[4755]: I0317 00:47:56.180489 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 17 00:47:56 crc kubenswrapper[4755]: I0317 00:47:56.266478 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72a40ce1-242d-4422-936b-9b867c51ee69" path="/var/lib/kubelet/pods/72a40ce1-242d-4422-936b-9b867c51ee69/volumes" Mar 17 00:47:56 crc kubenswrapper[4755]: W0317 00:47:56.672073 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef911263_adb5_4a18_b726_888ec33cb66a.slice/crio-8e5a09a16798d85166c4499888e6ae5b45022062a7c8b4a0fd1cc97552c6e408 WatchSource:0}: Error finding container 8e5a09a16798d85166c4499888e6ae5b45022062a7c8b4a0fd1cc97552c6e408: Status 404 returned error can't find the container with id 8e5a09a16798d85166c4499888e6ae5b45022062a7c8b4a0fd1cc97552c6e408 Mar 17 00:47:56 crc kubenswrapper[4755]: I0317 00:47:56.680225 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 17 00:47:56 crc kubenswrapper[4755]: I0317 00:47:56.739301 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 17 00:47:56 crc kubenswrapper[4755]: I0317 00:47:56.786155 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ef911263-adb5-4a18-b726-888ec33cb66a","Type":"ContainerStarted","Data":"8e5a09a16798d85166c4499888e6ae5b45022062a7c8b4a0fd1cc97552c6e408"} Mar 17 00:47:57 crc kubenswrapper[4755]: I0317 00:47:57.797527 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ef911263-adb5-4a18-b726-888ec33cb66a","Type":"ContainerStarted","Data":"70bddc077d3511cc9064594aef0f7e8a253e1aa1e2a8106a77471972eeea8e12"} Mar 17 00:47:58 crc kubenswrapper[4755]: I0317 00:47:58.243570 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 17 00:47:58 crc kubenswrapper[4755]: I0317 00:47:58.243823 4755 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 17 00:47:58 crc kubenswrapper[4755]: I0317 00:47:58.338175 4755 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod0292185b-3c12-4b25-b900-f8c7c5d4346f"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod0292185b-3c12-4b25-b900-f8c7c5d4346f] : Timed out while waiting for systemd to remove kubepods-besteffort-pod0292185b_3c12_4b25_b900_f8c7c5d4346f.slice" Mar 17 00:47:58 crc kubenswrapper[4755]: I0317 00:47:58.817174 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ef911263-adb5-4a18-b726-888ec33cb66a","Type":"ContainerStarted","Data":"c47c6f774e1ffc33a220736fe324ab9a87ea0df635326870d923007284d26426"} Mar 17 00:47:59 crc kubenswrapper[4755]: I0317 00:47:59.828950 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ef911263-adb5-4a18-b726-888ec33cb66a","Type":"ContainerStarted","Data":"c30036fbb6c41d2c7dada4fcf02401c9a4dffe0167f192a9cce1907ad319ef2a"} Mar 17 00:47:59 crc kubenswrapper[4755]: I0317 00:47:59.828993 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ef911263-adb5-4a18-b726-888ec33cb66a","Type":"ContainerStarted","Data":"0a5c23131230a386b5676ad6a14217859b8188793c81bcd68455cc986f58ec34"} Mar 17 00:47:59 crc kubenswrapper[4755]: I0317 00:47:59.860917 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.088906442 podStartE2EDuration="4.860889283s" podCreationTimestamp="2026-03-17 00:47:55 +0000 UTC" firstStartedPulling="2026-03-17 00:47:56.67486505 +0000 UTC m=+1551.434317333" lastFinishedPulling="2026-03-17 00:47:59.446847861 +0000 UTC m=+1554.206300174" observedRunningTime="2026-03-17 00:47:59.847492994 +0000 UTC m=+1554.606945277" watchObservedRunningTime="2026-03-17 00:47:59.860889283 +0000 UTC m=+1554.620341566" Mar 17 
00:48:00 crc kubenswrapper[4755]: I0317 00:48:00.140518 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561808-s48qq"] Mar 17 00:48:00 crc kubenswrapper[4755]: I0317 00:48:00.141755 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561808-s48qq" Mar 17 00:48:00 crc kubenswrapper[4755]: I0317 00:48:00.144127 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 00:48:00 crc kubenswrapper[4755]: I0317 00:48:00.144409 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 00:48:00 crc kubenswrapper[4755]: I0317 00:48:00.145368 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 00:48:00 crc kubenswrapper[4755]: I0317 00:48:00.158975 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561808-s48qq"] Mar 17 00:48:00 crc kubenswrapper[4755]: I0317 00:48:00.259782 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 17 00:48:00 crc kubenswrapper[4755]: I0317 00:48:00.259852 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 17 00:48:00 crc kubenswrapper[4755]: I0317 00:48:00.265291 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 17 00:48:00 crc kubenswrapper[4755]: I0317 00:48:00.269856 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 17 00:48:00 crc kubenswrapper[4755]: I0317 00:48:00.299090 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 17 00:48:00 crc kubenswrapper[4755]: I0317 00:48:00.299148 4755 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 17 00:48:00 crc kubenswrapper[4755]: I0317 00:48:00.338353 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl7lg\" (UniqueName: \"kubernetes.io/projected/12958939-163c-488e-9297-548add9a591b-kube-api-access-wl7lg\") pod \"auto-csr-approver-29561808-s48qq\" (UID: \"12958939-163c-488e-9297-548add9a591b\") " pod="openshift-infra/auto-csr-approver-29561808-s48qq" Mar 17 00:48:00 crc kubenswrapper[4755]: I0317 00:48:00.440597 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl7lg\" (UniqueName: \"kubernetes.io/projected/12958939-163c-488e-9297-548add9a591b-kube-api-access-wl7lg\") pod \"auto-csr-approver-29561808-s48qq\" (UID: \"12958939-163c-488e-9297-548add9a591b\") " pod="openshift-infra/auto-csr-approver-29561808-s48qq" Mar 17 00:48:00 crc kubenswrapper[4755]: I0317 00:48:00.470981 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl7lg\" (UniqueName: \"kubernetes.io/projected/12958939-163c-488e-9297-548add9a591b-kube-api-access-wl7lg\") pod \"auto-csr-approver-29561808-s48qq\" (UID: \"12958939-163c-488e-9297-548add9a591b\") " pod="openshift-infra/auto-csr-approver-29561808-s48qq" Mar 17 00:48:00 crc kubenswrapper[4755]: I0317 00:48:00.760999 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561808-s48qq" Mar 17 00:48:01 crc kubenswrapper[4755]: I0317 00:48:01.314123 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561808-s48qq"] Mar 17 00:48:01 crc kubenswrapper[4755]: W0317 00:48:01.320769 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12958939_163c_488e_9297_548add9a591b.slice/crio-2f7bf911b77f32320aa1528877736383d5872bbb874cd435fda098b667fb9592 WatchSource:0}: Error finding container 2f7bf911b77f32320aa1528877736383d5872bbb874cd435fda098b667fb9592: Status 404 returned error can't find the container with id 2f7bf911b77f32320aa1528877736383d5872bbb874cd435fda098b667fb9592 Mar 17 00:48:01 crc kubenswrapper[4755]: I0317 00:48:01.477761 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 00:48:01 crc kubenswrapper[4755]: I0317 00:48:01.477970 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="cd3ff4b6-bb9e-446c-9221-f80d9ec5a820" containerName="kube-state-metrics" containerID="cri-o://350c95c21bdae52d5cd2e115c88bacec66c9dc68f94bad75c046d44a0d4c92bc" gracePeriod=30 Mar 17 00:48:01 crc kubenswrapper[4755]: I0317 00:48:01.557153 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 17 00:48:01 crc kubenswrapper[4755]: I0317 00:48:01.557668 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="42a2a1da-3480-4f1d-bba8-a725657e9fcd" containerName="mysqld-exporter" containerID="cri-o://e8a560b3af01b330cccfc8d71e871639271e3c56d0bd1e7ee86c8d821f29d807" gracePeriod=30 Mar 17 00:48:01 crc kubenswrapper[4755]: I0317 00:48:01.862709 4755 generic.go:334] "Generic (PLEG): container finished" podID="42a2a1da-3480-4f1d-bba8-a725657e9fcd" 
containerID="e8a560b3af01b330cccfc8d71e871639271e3c56d0bd1e7ee86c8d821f29d807" exitCode=2 Mar 17 00:48:01 crc kubenswrapper[4755]: I0317 00:48:01.862765 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"42a2a1da-3480-4f1d-bba8-a725657e9fcd","Type":"ContainerDied","Data":"e8a560b3af01b330cccfc8d71e871639271e3c56d0bd1e7ee86c8d821f29d807"} Mar 17 00:48:01 crc kubenswrapper[4755]: I0317 00:48:01.864169 4755 generic.go:334] "Generic (PLEG): container finished" podID="cd3ff4b6-bb9e-446c-9221-f80d9ec5a820" containerID="350c95c21bdae52d5cd2e115c88bacec66c9dc68f94bad75c046d44a0d4c92bc" exitCode=2 Mar 17 00:48:01 crc kubenswrapper[4755]: I0317 00:48:01.864208 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cd3ff4b6-bb9e-446c-9221-f80d9ec5a820","Type":"ContainerDied","Data":"350c95c21bdae52d5cd2e115c88bacec66c9dc68f94bad75c046d44a0d4c92bc"} Mar 17 00:48:01 crc kubenswrapper[4755]: I0317 00:48:01.866265 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561808-s48qq" event={"ID":"12958939-163c-488e-9297-548add9a591b","Type":"ContainerStarted","Data":"2f7bf911b77f32320aa1528877736383d5872bbb874cd435fda098b667fb9592"} Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.076218 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.081373 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.197212 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc249\" (UniqueName: \"kubernetes.io/projected/42a2a1da-3480-4f1d-bba8-a725657e9fcd-kube-api-access-wc249\") pod \"42a2a1da-3480-4f1d-bba8-a725657e9fcd\" (UID: \"42a2a1da-3480-4f1d-bba8-a725657e9fcd\") " Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.197320 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42a2a1da-3480-4f1d-bba8-a725657e9fcd-config-data\") pod \"42a2a1da-3480-4f1d-bba8-a725657e9fcd\" (UID: \"42a2a1da-3480-4f1d-bba8-a725657e9fcd\") " Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.197474 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42a2a1da-3480-4f1d-bba8-a725657e9fcd-combined-ca-bundle\") pod \"42a2a1da-3480-4f1d-bba8-a725657e9fcd\" (UID: \"42a2a1da-3480-4f1d-bba8-a725657e9fcd\") " Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.197860 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfb2g\" (UniqueName: \"kubernetes.io/projected/cd3ff4b6-bb9e-446c-9221-f80d9ec5a820-kube-api-access-nfb2g\") pod \"cd3ff4b6-bb9e-446c-9221-f80d9ec5a820\" (UID: \"cd3ff4b6-bb9e-446c-9221-f80d9ec5a820\") " Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.203832 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42a2a1da-3480-4f1d-bba8-a725657e9fcd-kube-api-access-wc249" (OuterVolumeSpecName: "kube-api-access-wc249") pod "42a2a1da-3480-4f1d-bba8-a725657e9fcd" (UID: "42a2a1da-3480-4f1d-bba8-a725657e9fcd"). InnerVolumeSpecName "kube-api-access-wc249". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.207704 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd3ff4b6-bb9e-446c-9221-f80d9ec5a820-kube-api-access-nfb2g" (OuterVolumeSpecName: "kube-api-access-nfb2g") pod "cd3ff4b6-bb9e-446c-9221-f80d9ec5a820" (UID: "cd3ff4b6-bb9e-446c-9221-f80d9ec5a820"). InnerVolumeSpecName "kube-api-access-nfb2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.253822 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42a2a1da-3480-4f1d-bba8-a725657e9fcd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42a2a1da-3480-4f1d-bba8-a725657e9fcd" (UID: "42a2a1da-3480-4f1d-bba8-a725657e9fcd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.275894 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42a2a1da-3480-4f1d-bba8-a725657e9fcd-config-data" (OuterVolumeSpecName: "config-data") pod "42a2a1da-3480-4f1d-bba8-a725657e9fcd" (UID: "42a2a1da-3480-4f1d-bba8-a725657e9fcd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.299349 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42a2a1da-3480-4f1d-bba8-a725657e9fcd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.299376 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfb2g\" (UniqueName: \"kubernetes.io/projected/cd3ff4b6-bb9e-446c-9221-f80d9ec5a820-kube-api-access-nfb2g\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.299429 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc249\" (UniqueName: \"kubernetes.io/projected/42a2a1da-3480-4f1d-bba8-a725657e9fcd-kube-api-access-wc249\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.299667 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42a2a1da-3480-4f1d-bba8-a725657e9fcd-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.306583 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.307728 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.312467 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.881466 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cd3ff4b6-bb9e-446c-9221-f80d9ec5a820","Type":"ContainerDied","Data":"3afa8b0e1e0de7f9825062f6929f3d4ddccef531465d134f26ca1e57a7a80516"} Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 
00:48:02.881901 4755 scope.go:117] "RemoveContainer" containerID="350c95c21bdae52d5cd2e115c88bacec66c9dc68f94bad75c046d44a0d4c92bc" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.882910 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.898280 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.898696 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"42a2a1da-3480-4f1d-bba8-a725657e9fcd","Type":"ContainerDied","Data":"23e8a1f114e5f2a203cff854ba2b4957c4175c20683c8e37e7d4569a461d1c9a"} Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.926906 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.927189 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.937652 4755 scope.go:117] "RemoveContainer" containerID="e8a560b3af01b330cccfc8d71e871639271e3c56d0bd1e7ee86c8d821f29d807" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.949466 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.972469 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 00:48:02 crc kubenswrapper[4755]: E0317 00:48:02.972869 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd3ff4b6-bb9e-446c-9221-f80d9ec5a820" containerName="kube-state-metrics" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.972881 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd3ff4b6-bb9e-446c-9221-f80d9ec5a820" 
containerName="kube-state-metrics" Mar 17 00:48:02 crc kubenswrapper[4755]: E0317 00:48:02.972911 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42a2a1da-3480-4f1d-bba8-a725657e9fcd" containerName="mysqld-exporter" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.972918 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="42a2a1da-3480-4f1d-bba8-a725657e9fcd" containerName="mysqld-exporter" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.973104 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="42a2a1da-3480-4f1d-bba8-a725657e9fcd" containerName="mysqld-exporter" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.973122 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd3ff4b6-bb9e-446c-9221-f80d9ec5a820" containerName="kube-state-metrics" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.973842 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.982127 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.982209 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 17 00:48:02 crc kubenswrapper[4755]: I0317 00:48:02.986789 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.012669 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.015674 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jkvc\" (UniqueName: \"kubernetes.io/projected/107b2153-2013-45e0-ad48-0f16e97d6d7e-kube-api-access-7jkvc\") pod \"kube-state-metrics-0\" (UID: 
\"107b2153-2013-45e0-ad48-0f16e97d6d7e\") " pod="openstack/kube-state-metrics-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.015768 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/107b2153-2013-45e0-ad48-0f16e97d6d7e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"107b2153-2013-45e0-ad48-0f16e97d6d7e\") " pod="openstack/kube-state-metrics-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.015790 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/107b2153-2013-45e0-ad48-0f16e97d6d7e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"107b2153-2013-45e0-ad48-0f16e97d6d7e\") " pod="openstack/kube-state-metrics-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.015810 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107b2153-2013-45e0-ad48-0f16e97d6d7e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"107b2153-2013-45e0-ad48-0f16e97d6d7e\") " pod="openstack/kube-state-metrics-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.040384 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.068230 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.069855 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.071922 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.072023 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.085499 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.130416 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/107b2153-2013-45e0-ad48-0f16e97d6d7e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"107b2153-2013-45e0-ad48-0f16e97d6d7e\") " pod="openstack/kube-state-metrics-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.130466 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107b2153-2013-45e0-ad48-0f16e97d6d7e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"107b2153-2013-45e0-ad48-0f16e97d6d7e\") " pod="openstack/kube-state-metrics-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.130495 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fb69f1a-4500-4441-a103-843887d04772-config-data\") pod \"mysqld-exporter-0\" (UID: \"6fb69f1a-4500-4441-a103-843887d04772\") " pod="openstack/mysqld-exporter-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.130525 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkdpm\" (UniqueName: \"kubernetes.io/projected/6fb69f1a-4500-4441-a103-843887d04772-kube-api-access-vkdpm\") 
pod \"mysqld-exporter-0\" (UID: \"6fb69f1a-4500-4441-a103-843887d04772\") " pod="openstack/mysqld-exporter-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.130791 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jkvc\" (UniqueName: \"kubernetes.io/projected/107b2153-2013-45e0-ad48-0f16e97d6d7e-kube-api-access-7jkvc\") pod \"kube-state-metrics-0\" (UID: \"107b2153-2013-45e0-ad48-0f16e97d6d7e\") " pod="openstack/kube-state-metrics-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.130841 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb69f1a-4500-4441-a103-843887d04772-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"6fb69f1a-4500-4441-a103-843887d04772\") " pod="openstack/mysqld-exporter-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.131239 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fb69f1a-4500-4441-a103-843887d04772-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"6fb69f1a-4500-4441-a103-843887d04772\") " pod="openstack/mysqld-exporter-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.131513 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/107b2153-2013-45e0-ad48-0f16e97d6d7e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"107b2153-2013-45e0-ad48-0f16e97d6d7e\") " pod="openstack/kube-state-metrics-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.135301 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107b2153-2013-45e0-ad48-0f16e97d6d7e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: 
\"107b2153-2013-45e0-ad48-0f16e97d6d7e\") " pod="openstack/kube-state-metrics-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.136353 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/107b2153-2013-45e0-ad48-0f16e97d6d7e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"107b2153-2013-45e0-ad48-0f16e97d6d7e\") " pod="openstack/kube-state-metrics-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.146235 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jkvc\" (UniqueName: \"kubernetes.io/projected/107b2153-2013-45e0-ad48-0f16e97d6d7e-kube-api-access-7jkvc\") pod \"kube-state-metrics-0\" (UID: \"107b2153-2013-45e0-ad48-0f16e97d6d7e\") " pod="openstack/kube-state-metrics-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.149300 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/107b2153-2013-45e0-ad48-0f16e97d6d7e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"107b2153-2013-45e0-ad48-0f16e97d6d7e\") " pod="openstack/kube-state-metrics-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.233279 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb69f1a-4500-4441-a103-843887d04772-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"6fb69f1a-4500-4441-a103-843887d04772\") " pod="openstack/mysqld-exporter-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.233338 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fb69f1a-4500-4441-a103-843887d04772-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"6fb69f1a-4500-4441-a103-843887d04772\") " pod="openstack/mysqld-exporter-0" 
Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.233394 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fb69f1a-4500-4441-a103-843887d04772-config-data\") pod \"mysqld-exporter-0\" (UID: \"6fb69f1a-4500-4441-a103-843887d04772\") " pod="openstack/mysqld-exporter-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.233416 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkdpm\" (UniqueName: \"kubernetes.io/projected/6fb69f1a-4500-4441-a103-843887d04772-kube-api-access-vkdpm\") pod \"mysqld-exporter-0\" (UID: \"6fb69f1a-4500-4441-a103-843887d04772\") " pod="openstack/mysqld-exporter-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.236975 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fb69f1a-4500-4441-a103-843887d04772-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"6fb69f1a-4500-4441-a103-843887d04772\") " pod="openstack/mysqld-exporter-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.237184 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fb69f1a-4500-4441-a103-843887d04772-config-data\") pod \"mysqld-exporter-0\" (UID: \"6fb69f1a-4500-4441-a103-843887d04772\") " pod="openstack/mysqld-exporter-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.243546 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb69f1a-4500-4441-a103-843887d04772-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"6fb69f1a-4500-4441-a103-843887d04772\") " pod="openstack/mysqld-exporter-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.257475 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkdpm\" 
(UniqueName: \"kubernetes.io/projected/6fb69f1a-4500-4441-a103-843887d04772-kube-api-access-vkdpm\") pod \"mysqld-exporter-0\" (UID: \"6fb69f1a-4500-4441-a103-843887d04772\") " pod="openstack/mysqld-exporter-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.303999 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.394875 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 17 00:48:03 crc kubenswrapper[4755]: W0317 00:48:03.769042 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod107b2153_2013_45e0_ad48_0f16e97d6d7e.slice/crio-9d69f8b83bb3673c7f89bdc7c76c8cd3809494da6aaf971124f9267a24d1e61c WatchSource:0}: Error finding container 9d69f8b83bb3673c7f89bdc7c76c8cd3809494da6aaf971124f9267a24d1e61c: Status 404 returned error can't find the container with id 9d69f8b83bb3673c7f89bdc7c76c8cd3809494da6aaf971124f9267a24d1e61c Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.771022 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.821130 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.821511 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerName="proxy-httpd" containerID="cri-o://9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c" gracePeriod=30 Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.821528 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerName="sg-core" 
containerID="cri-o://5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b" gracePeriod=30 Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.821662 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerName="ceilometer-notification-agent" containerID="cri-o://35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5" gracePeriod=30 Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.821704 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerName="ceilometer-central-agent" containerID="cri-o://674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216" gracePeriod=30 Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.910846 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"107b2153-2013-45e0-ad48-0f16e97d6d7e","Type":"ContainerStarted","Data":"9d69f8b83bb3673c7f89bdc7c76c8cd3809494da6aaf971124f9267a24d1e61c"} Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.912204 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.934045 4755 generic.go:334] "Generic (PLEG): container finished" podID="12958939-163c-488e-9297-548add9a591b" containerID="8597b5a97df6787d62e4199a6a1b84371d448cef522ecec88bb44f1883acd6cd" exitCode=0 Mar 17 00:48:03 crc kubenswrapper[4755]: I0317 00:48:03.934156 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561808-s48qq" event={"ID":"12958939-163c-488e-9297-548add9a591b","Type":"ContainerDied","Data":"8597b5a97df6787d62e4199a6a1b84371d448cef522ecec88bb44f1883acd6cd"} Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.271299 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="42a2a1da-3480-4f1d-bba8-a725657e9fcd" path="/var/lib/kubelet/pods/42a2a1da-3480-4f1d-bba8-a725657e9fcd/volumes" Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.271831 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd3ff4b6-bb9e-446c-9221-f80d9ec5a820" path="/var/lib/kubelet/pods/cd3ff4b6-bb9e-446c-9221-f80d9ec5a820/volumes" Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.856989 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.949255 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"107b2153-2013-45e0-ad48-0f16e97d6d7e","Type":"ContainerStarted","Data":"66a2f5cb868fb7167f7194a687c9d0d4ca2a9c1829af51d09b263cbe68508499"} Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.949375 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.951371 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"6fb69f1a-4500-4441-a103-843887d04772","Type":"ContainerStarted","Data":"c26d793a456dda67931fbcba99e11587ce7f0c128b54847120f4fb347f56c730"} Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.951414 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"6fb69f1a-4500-4441-a103-843887d04772","Type":"ContainerStarted","Data":"8a57853787205e24cdb2af8fbe1ca6012320dcd4aea2c9b8520957491e37bd01"} Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.955832 4755 generic.go:334] "Generic (PLEG): container finished" podID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerID="9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c" exitCode=0 Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.955855 4755 generic.go:334] "Generic (PLEG): container 
finished" podID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerID="5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b" exitCode=2 Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.955861 4755 generic.go:334] "Generic (PLEG): container finished" podID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerID="35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5" exitCode=0 Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.955868 4755 generic.go:334] "Generic (PLEG): container finished" podID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerID="674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216" exitCode=0 Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.955883 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5db525fb-ff43-48c4-8f43-1d9eb37c440b","Type":"ContainerDied","Data":"9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c"} Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.955936 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5db525fb-ff43-48c4-8f43-1d9eb37c440b","Type":"ContainerDied","Data":"5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b"} Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.955949 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5db525fb-ff43-48c4-8f43-1d9eb37c440b","Type":"ContainerDied","Data":"35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5"} Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.955958 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5db525fb-ff43-48c4-8f43-1d9eb37c440b","Type":"ContainerDied","Data":"674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216"} Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.955968 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5db525fb-ff43-48c4-8f43-1d9eb37c440b","Type":"ContainerDied","Data":"ffb1f9e4e405a8a8bf18925810e40d9f03d20f437cd1e2e5f2000c2a92fc116b"} Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.955943 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.955994 4755 scope.go:117] "RemoveContainer" containerID="9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c" Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.970770 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.549931064 podStartE2EDuration="2.970748097s" podCreationTimestamp="2026-03-17 00:48:02 +0000 UTC" firstStartedPulling="2026-03-17 00:48:03.771950802 +0000 UTC m=+1558.531403085" lastFinishedPulling="2026-03-17 00:48:04.192767825 +0000 UTC m=+1558.952220118" observedRunningTime="2026-03-17 00:48:04.961721176 +0000 UTC m=+1559.721173459" watchObservedRunningTime="2026-03-17 00:48:04.970748097 +0000 UTC m=+1559.730200380" Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.986007 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-sg-core-conf-yaml\") pod \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.986166 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5db525fb-ff43-48c4-8f43-1d9eb37c440b-log-httpd\") pod \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.986233 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5db525fb-ff43-48c4-8f43-1d9eb37c440b-run-httpd\") pod \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.986333 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-config-data\") pod \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.986392 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phqmr\" (UniqueName: \"kubernetes.io/projected/5db525fb-ff43-48c4-8f43-1d9eb37c440b-kube-api-access-phqmr\") pod \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.986430 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-scripts\") pod \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.986467 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-combined-ca-bundle\") pod \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\" (UID: \"5db525fb-ff43-48c4-8f43-1d9eb37c440b\") " Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.987561 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5db525fb-ff43-48c4-8f43-1d9eb37c440b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5db525fb-ff43-48c4-8f43-1d9eb37c440b" (UID: "5db525fb-ff43-48c4-8f43-1d9eb37c440b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.987574 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5db525fb-ff43-48c4-8f43-1d9eb37c440b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5db525fb-ff43-48c4-8f43-1d9eb37c440b" (UID: "5db525fb-ff43-48c4-8f43-1d9eb37c440b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.994349 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.427223894 podStartE2EDuration="2.994331707s" podCreationTimestamp="2026-03-17 00:48:02 +0000 UTC" firstStartedPulling="2026-03-17 00:48:03.93155222 +0000 UTC m=+1558.691004503" lastFinishedPulling="2026-03-17 00:48:04.498660043 +0000 UTC m=+1559.258112316" observedRunningTime="2026-03-17 00:48:04.982214613 +0000 UTC m=+1559.741666886" watchObservedRunningTime="2026-03-17 00:48:04.994331707 +0000 UTC m=+1559.753783990" Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.996973 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-scripts" (OuterVolumeSpecName: "scripts") pod "5db525fb-ff43-48c4-8f43-1d9eb37c440b" (UID: "5db525fb-ff43-48c4-8f43-1d9eb37c440b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:48:04 crc kubenswrapper[4755]: I0317 00:48:04.996986 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5db525fb-ff43-48c4-8f43-1d9eb37c440b-kube-api-access-phqmr" (OuterVolumeSpecName: "kube-api-access-phqmr") pod "5db525fb-ff43-48c4-8f43-1d9eb37c440b" (UID: "5db525fb-ff43-48c4-8f43-1d9eb37c440b"). InnerVolumeSpecName "kube-api-access-phqmr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.008163 4755 scope.go:117] "RemoveContainer" containerID="5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.037697 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5db525fb-ff43-48c4-8f43-1d9eb37c440b" (UID: "5db525fb-ff43-48c4-8f43-1d9eb37c440b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.090785 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5db525fb-ff43-48c4-8f43-1d9eb37c440b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.090807 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5db525fb-ff43-48c4-8f43-1d9eb37c440b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.090816 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phqmr\" (UniqueName: \"kubernetes.io/projected/5db525fb-ff43-48c4-8f43-1d9eb37c440b-kube-api-access-phqmr\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.090827 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.090835 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" 
Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.105303 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5db525fb-ff43-48c4-8f43-1d9eb37c440b" (UID: "5db525fb-ff43-48c4-8f43-1d9eb37c440b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.112639 4755 scope.go:117] "RemoveContainer" containerID="35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.138123 4755 scope.go:117] "RemoveContainer" containerID="674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.142593 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-config-data" (OuterVolumeSpecName: "config-data") pod "5db525fb-ff43-48c4-8f43-1d9eb37c440b" (UID: "5db525fb-ff43-48c4-8f43-1d9eb37c440b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.166684 4755 scope.go:117] "RemoveContainer" containerID="9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c" Mar 17 00:48:05 crc kubenswrapper[4755]: E0317 00:48:05.167178 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c\": container with ID starting with 9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c not found: ID does not exist" containerID="9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.167218 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c"} err="failed to get container status \"9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c\": rpc error: code = NotFound desc = could not find container \"9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c\": container with ID starting with 9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c not found: ID does not exist" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.167243 4755 scope.go:117] "RemoveContainer" containerID="5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b" Mar 17 00:48:05 crc kubenswrapper[4755]: E0317 00:48:05.167737 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b\": container with ID starting with 5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b not found: ID does not exist" containerID="5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.167776 
4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b"} err="failed to get container status \"5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b\": rpc error: code = NotFound desc = could not find container \"5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b\": container with ID starting with 5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b not found: ID does not exist" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.167807 4755 scope.go:117] "RemoveContainer" containerID="35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5" Mar 17 00:48:05 crc kubenswrapper[4755]: E0317 00:48:05.168060 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5\": container with ID starting with 35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5 not found: ID does not exist" containerID="35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.168086 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5"} err="failed to get container status \"35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5\": rpc error: code = NotFound desc = could not find container \"35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5\": container with ID starting with 35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5 not found: ID does not exist" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.168101 4755 scope.go:117] "RemoveContainer" containerID="674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216" Mar 17 00:48:05 crc kubenswrapper[4755]: E0317 
00:48:05.168343 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216\": container with ID starting with 674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216 not found: ID does not exist" containerID="674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.168387 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216"} err="failed to get container status \"674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216\": rpc error: code = NotFound desc = could not find container \"674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216\": container with ID starting with 674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216 not found: ID does not exist" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.168409 4755 scope.go:117] "RemoveContainer" containerID="9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.168652 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c"} err="failed to get container status \"9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c\": rpc error: code = NotFound desc = could not find container \"9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c\": container with ID starting with 9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c not found: ID does not exist" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.168671 4755 scope.go:117] "RemoveContainer" containerID="5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b" Mar 17 00:48:05 crc 
kubenswrapper[4755]: I0317 00:48:05.168877 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b"} err="failed to get container status \"5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b\": rpc error: code = NotFound desc = could not find container \"5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b\": container with ID starting with 5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b not found: ID does not exist" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.168897 4755 scope.go:117] "RemoveContainer" containerID="35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.178729 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5"} err="failed to get container status \"35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5\": rpc error: code = NotFound desc = could not find container \"35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5\": container with ID starting with 35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5 not found: ID does not exist" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.178762 4755 scope.go:117] "RemoveContainer" containerID="674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.179393 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216"} err="failed to get container status \"674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216\": rpc error: code = NotFound desc = could not find container \"674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216\": container 
with ID starting with 674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216 not found: ID does not exist" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.179466 4755 scope.go:117] "RemoveContainer" containerID="9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.179716 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c"} err="failed to get container status \"9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c\": rpc error: code = NotFound desc = could not find container \"9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c\": container with ID starting with 9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c not found: ID does not exist" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.179746 4755 scope.go:117] "RemoveContainer" containerID="5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.179982 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b"} err="failed to get container status \"5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b\": rpc error: code = NotFound desc = could not find container \"5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b\": container with ID starting with 5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b not found: ID does not exist" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.180005 4755 scope.go:117] "RemoveContainer" containerID="35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.180236 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5"} err="failed to get container status \"35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5\": rpc error: code = NotFound desc = could not find container \"35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5\": container with ID starting with 35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5 not found: ID does not exist" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.180255 4755 scope.go:117] "RemoveContainer" containerID="674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.180460 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216"} err="failed to get container status \"674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216\": rpc error: code = NotFound desc = could not find container \"674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216\": container with ID starting with 674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216 not found: ID does not exist" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.180476 4755 scope.go:117] "RemoveContainer" containerID="9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.180661 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c"} err="failed to get container status \"9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c\": rpc error: code = NotFound desc = could not find container \"9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c\": container with ID starting with 9ca78ee249decc3c28e637f63c66f8cf666787d6218448e94119b9d0d30f642c not found: ID does not 
exist" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.180679 4755 scope.go:117] "RemoveContainer" containerID="5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.185023 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b"} err="failed to get container status \"5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b\": rpc error: code = NotFound desc = could not find container \"5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b\": container with ID starting with 5857b684689231d31c6ba9e37823aafb2e54e39d40641d6ca9b29e5af069782b not found: ID does not exist" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.185043 4755 scope.go:117] "RemoveContainer" containerID="35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.185869 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5"} err="failed to get container status \"35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5\": rpc error: code = NotFound desc = could not find container \"35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5\": container with ID starting with 35246f4ac7e07a2ec804e2e7e0a8e0b58520a21322483246310e716220cf11d5 not found: ID does not exist" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.185904 4755 scope.go:117] "RemoveContainer" containerID="674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.186691 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216"} err="failed to get container status 
\"674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216\": rpc error: code = NotFound desc = could not find container \"674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216\": container with ID starting with 674e3a10f5c852303d9c5b88f6a0074c9603711de9b5208dd325f64469eb3216 not found: ID does not exist" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.192346 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.192383 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db525fb-ff43-48c4-8f43-1d9eb37c440b-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.286143 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561808-s48qq" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.316554 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.331537 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.342016 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:48:05 crc kubenswrapper[4755]: E0317 00:48:05.342414 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerName="proxy-httpd" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.342446 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerName="proxy-httpd" Mar 17 00:48:05 crc kubenswrapper[4755]: E0317 00:48:05.342482 4755 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="12958939-163c-488e-9297-548add9a591b" containerName="oc" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.342489 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="12958939-163c-488e-9297-548add9a591b" containerName="oc" Mar 17 00:48:05 crc kubenswrapper[4755]: E0317 00:48:05.342499 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerName="ceilometer-central-agent" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.342505 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerName="ceilometer-central-agent" Mar 17 00:48:05 crc kubenswrapper[4755]: E0317 00:48:05.342515 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerName="ceilometer-notification-agent" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.342522 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerName="ceilometer-notification-agent" Mar 17 00:48:05 crc kubenswrapper[4755]: E0317 00:48:05.342538 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerName="sg-core" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.342545 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerName="sg-core" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.342707 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerName="ceilometer-central-agent" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.342732 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="12958939-163c-488e-9297-548add9a591b" containerName="oc" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.342743 4755 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerName="sg-core" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.342755 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerName="proxy-httpd" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.342772 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" containerName="ceilometer-notification-agent" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.344452 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.346869 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.347241 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.347675 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.364548 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.398321 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl7lg\" (UniqueName: \"kubernetes.io/projected/12958939-163c-488e-9297-548add9a591b-kube-api-access-wl7lg\") pod \"12958939-163c-488e-9297-548add9a591b\" (UID: \"12958939-163c-488e-9297-548add9a591b\") " Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.398960 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45e4fad4-5277-41a9-a93d-63f22e1c2a24-log-httpd\") pod \"ceilometer-0\" (UID: 
\"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.399000 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.399069 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45e4fad4-5277-41a9-a93d-63f22e1c2a24-run-httpd\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.399112 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-scripts\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.399180 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-config-data\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.399228 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.399259 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.399282 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqdql\" (UniqueName: \"kubernetes.io/projected/45e4fad4-5277-41a9-a93d-63f22e1c2a24-kube-api-access-pqdql\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.404284 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12958939-163c-488e-9297-548add9a591b-kube-api-access-wl7lg" (OuterVolumeSpecName: "kube-api-access-wl7lg") pod "12958939-163c-488e-9297-548add9a591b" (UID: "12958939-163c-488e-9297-548add9a591b"). InnerVolumeSpecName "kube-api-access-wl7lg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.500973 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45e4fad4-5277-41a9-a93d-63f22e1c2a24-run-httpd\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.501407 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45e4fad4-5277-41a9-a93d-63f22e1c2a24-run-httpd\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.501668 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-scripts\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.502170 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-config-data\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.502233 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.502555 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.502595 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqdql\" (UniqueName: \"kubernetes.io/projected/45e4fad4-5277-41a9-a93d-63f22e1c2a24-kube-api-access-pqdql\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.502633 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45e4fad4-5277-41a9-a93d-63f22e1c2a24-log-httpd\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.502673 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.502796 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl7lg\" (UniqueName: \"kubernetes.io/projected/12958939-163c-488e-9297-548add9a591b-kube-api-access-wl7lg\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.503091 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45e4fad4-5277-41a9-a93d-63f22e1c2a24-log-httpd\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.505747 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.506376 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.506948 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-scripts\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.509598 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.518637 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-config-data\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.521923 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqdql\" (UniqueName: \"kubernetes.io/projected/45e4fad4-5277-41a9-a93d-63f22e1c2a24-kube-api-access-pqdql\") pod \"ceilometer-0\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " 
pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.668403 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.970416 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561808-s48qq" event={"ID":"12958939-163c-488e-9297-548add9a591b","Type":"ContainerDied","Data":"2f7bf911b77f32320aa1528877736383d5872bbb874cd435fda098b667fb9592"} Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.970875 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f7bf911b77f32320aa1528877736383d5872bbb874cd435fda098b667fb9592" Mar 17 00:48:05 crc kubenswrapper[4755]: I0317 00:48:05.970647 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561808-s48qq" Mar 17 00:48:06 crc kubenswrapper[4755]: W0317 00:48:06.170804 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45e4fad4_5277_41a9_a93d_63f22e1c2a24.slice/crio-541549918db1132ff08ee56b3e4ec9faf79e6d7d6deb76096b0dacf47a789d8f WatchSource:0}: Error finding container 541549918db1132ff08ee56b3e4ec9faf79e6d7d6deb76096b0dacf47a789d8f: Status 404 returned error can't find the container with id 541549918db1132ff08ee56b3e4ec9faf79e6d7d6deb76096b0dacf47a789d8f Mar 17 00:48:06 crc kubenswrapper[4755]: I0317 00:48:06.182029 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:48:06 crc kubenswrapper[4755]: I0317 00:48:06.266839 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5db525fb-ff43-48c4-8f43-1d9eb37c440b" path="/var/lib/kubelet/pods/5db525fb-ff43-48c4-8f43-1d9eb37c440b/volumes" Mar 17 00:48:06 crc kubenswrapper[4755]: I0317 00:48:06.381504 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-infra/auto-csr-approver-29561802-thhsv"] Mar 17 00:48:06 crc kubenswrapper[4755]: I0317 00:48:06.393043 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561802-thhsv"] Mar 17 00:48:06 crc kubenswrapper[4755]: I0317 00:48:06.983203 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45e4fad4-5277-41a9-a93d-63f22e1c2a24","Type":"ContainerStarted","Data":"28661ddf61d440162d2f2f2ae23747bcd42a631b38a45b6f76abdd1401619b09"} Mar 17 00:48:06 crc kubenswrapper[4755]: I0317 00:48:06.983524 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45e4fad4-5277-41a9-a93d-63f22e1c2a24","Type":"ContainerStarted","Data":"541549918db1132ff08ee56b3e4ec9faf79e6d7d6deb76096b0dacf47a789d8f"} Mar 17 00:48:07 crc kubenswrapper[4755]: I0317 00:48:07.996719 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45e4fad4-5277-41a9-a93d-63f22e1c2a24","Type":"ContainerStarted","Data":"904638d6542cec9f48e476bd1a78979315e04950796b0f6b5a0422c7992695e6"} Mar 17 00:48:08 crc kubenswrapper[4755]: I0317 00:48:08.271244 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de73bb77-da64-4a58-bf81-d617192672f2" path="/var/lib/kubelet/pods/de73bb77-da64-4a58-bf81-d617192672f2/volumes" Mar 17 00:48:09 crc kubenswrapper[4755]: I0317 00:48:09.008225 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45e4fad4-5277-41a9-a93d-63f22e1c2a24","Type":"ContainerStarted","Data":"6466bb22812da1cddd6481a2ac1a6e75770aff4b8ccf86a209615d03d95bb492"} Mar 17 00:48:11 crc kubenswrapper[4755]: I0317 00:48:11.037368 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45e4fad4-5277-41a9-a93d-63f22e1c2a24","Type":"ContainerStarted","Data":"88d1a86a8784e93979f5d24fd9481f9231dd5948bb6bacc52b087f29d63fb36f"} Mar 17 00:48:11 crc 
kubenswrapper[4755]: I0317 00:48:11.038175 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 00:48:11 crc kubenswrapper[4755]: I0317 00:48:11.073865 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.452642182 podStartE2EDuration="6.073842009s" podCreationTimestamp="2026-03-17 00:48:05 +0000 UTC" firstStartedPulling="2026-03-17 00:48:06.172938183 +0000 UTC m=+1560.932390486" lastFinishedPulling="2026-03-17 00:48:09.79413803 +0000 UTC m=+1564.553590313" observedRunningTime="2026-03-17 00:48:11.067382925 +0000 UTC m=+1565.826835248" watchObservedRunningTime="2026-03-17 00:48:11.073842009 +0000 UTC m=+1565.833294332" Mar 17 00:48:13 crc kubenswrapper[4755]: I0317 00:48:13.318430 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 17 00:48:17 crc kubenswrapper[4755]: I0317 00:48:17.614323 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pv6h6"] Mar 17 00:48:17 crc kubenswrapper[4755]: I0317 00:48:17.619129 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pv6h6" Mar 17 00:48:17 crc kubenswrapper[4755]: I0317 00:48:17.652060 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pv6h6"] Mar 17 00:48:17 crc kubenswrapper[4755]: I0317 00:48:17.720728 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858527df-ec56-4a19-b64f-735380803797-utilities\") pod \"community-operators-pv6h6\" (UID: \"858527df-ec56-4a19-b64f-735380803797\") " pod="openshift-marketplace/community-operators-pv6h6" Mar 17 00:48:17 crc kubenswrapper[4755]: I0317 00:48:17.720847 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29jt2\" (UniqueName: \"kubernetes.io/projected/858527df-ec56-4a19-b64f-735380803797-kube-api-access-29jt2\") pod \"community-operators-pv6h6\" (UID: \"858527df-ec56-4a19-b64f-735380803797\") " pod="openshift-marketplace/community-operators-pv6h6" Mar 17 00:48:17 crc kubenswrapper[4755]: I0317 00:48:17.721229 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858527df-ec56-4a19-b64f-735380803797-catalog-content\") pod \"community-operators-pv6h6\" (UID: \"858527df-ec56-4a19-b64f-735380803797\") " pod="openshift-marketplace/community-operators-pv6h6" Mar 17 00:48:17 crc kubenswrapper[4755]: I0317 00:48:17.823589 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29jt2\" (UniqueName: \"kubernetes.io/projected/858527df-ec56-4a19-b64f-735380803797-kube-api-access-29jt2\") pod \"community-operators-pv6h6\" (UID: \"858527df-ec56-4a19-b64f-735380803797\") " pod="openshift-marketplace/community-operators-pv6h6" Mar 17 00:48:17 crc kubenswrapper[4755]: I0317 00:48:17.823849 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858527df-ec56-4a19-b64f-735380803797-catalog-content\") pod \"community-operators-pv6h6\" (UID: \"858527df-ec56-4a19-b64f-735380803797\") " pod="openshift-marketplace/community-operators-pv6h6" Mar 17 00:48:17 crc kubenswrapper[4755]: I0317 00:48:17.823945 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858527df-ec56-4a19-b64f-735380803797-utilities\") pod \"community-operators-pv6h6\" (UID: \"858527df-ec56-4a19-b64f-735380803797\") " pod="openshift-marketplace/community-operators-pv6h6" Mar 17 00:48:17 crc kubenswrapper[4755]: I0317 00:48:17.824559 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858527df-ec56-4a19-b64f-735380803797-catalog-content\") pod \"community-operators-pv6h6\" (UID: \"858527df-ec56-4a19-b64f-735380803797\") " pod="openshift-marketplace/community-operators-pv6h6" Mar 17 00:48:17 crc kubenswrapper[4755]: I0317 00:48:17.824727 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858527df-ec56-4a19-b64f-735380803797-utilities\") pod \"community-operators-pv6h6\" (UID: \"858527df-ec56-4a19-b64f-735380803797\") " pod="openshift-marketplace/community-operators-pv6h6" Mar 17 00:48:17 crc kubenswrapper[4755]: I0317 00:48:17.848206 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29jt2\" (UniqueName: \"kubernetes.io/projected/858527df-ec56-4a19-b64f-735380803797-kube-api-access-29jt2\") pod \"community-operators-pv6h6\" (UID: \"858527df-ec56-4a19-b64f-735380803797\") " pod="openshift-marketplace/community-operators-pv6h6" Mar 17 00:48:17 crc kubenswrapper[4755]: I0317 00:48:17.956432 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pv6h6" Mar 17 00:48:18 crc kubenswrapper[4755]: I0317 00:48:18.595468 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pv6h6"] Mar 17 00:48:19 crc kubenswrapper[4755]: I0317 00:48:19.159342 4755 generic.go:334] "Generic (PLEG): container finished" podID="858527df-ec56-4a19-b64f-735380803797" containerID="dc20771934207323bf31c040fdbae8287ebeab0bb54f0a383c658d1101722142" exitCode=0 Mar 17 00:48:19 crc kubenswrapper[4755]: I0317 00:48:19.160426 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pv6h6" event={"ID":"858527df-ec56-4a19-b64f-735380803797","Type":"ContainerDied","Data":"dc20771934207323bf31c040fdbae8287ebeab0bb54f0a383c658d1101722142"} Mar 17 00:48:19 crc kubenswrapper[4755]: I0317 00:48:19.161455 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pv6h6" event={"ID":"858527df-ec56-4a19-b64f-735380803797","Type":"ContainerStarted","Data":"825472bcbf41b959569efe7f0fddcb91f50606fc529bf2c3ab808eb12dc2578c"} Mar 17 00:48:24 crc kubenswrapper[4755]: I0317 00:48:24.235796 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pv6h6" event={"ID":"858527df-ec56-4a19-b64f-735380803797","Type":"ContainerStarted","Data":"15df6ca71a2f72ab6fa9564ccae9569d55ae0c8b322d1076c492b29da09f3f7b"} Mar 17 00:48:25 crc kubenswrapper[4755]: I0317 00:48:25.251757 4755 generic.go:334] "Generic (PLEG): container finished" podID="858527df-ec56-4a19-b64f-735380803797" containerID="15df6ca71a2f72ab6fa9564ccae9569d55ae0c8b322d1076c492b29da09f3f7b" exitCode=0 Mar 17 00:48:25 crc kubenswrapper[4755]: I0317 00:48:25.251957 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pv6h6" 
event={"ID":"858527df-ec56-4a19-b64f-735380803797","Type":"ContainerDied","Data":"15df6ca71a2f72ab6fa9564ccae9569d55ae0c8b322d1076c492b29da09f3f7b"} Mar 17 00:48:26 crc kubenswrapper[4755]: I0317 00:48:26.286859 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pv6h6" event={"ID":"858527df-ec56-4a19-b64f-735380803797","Type":"ContainerStarted","Data":"a0fd6a1b016965c1e76c9020787ef2cc07c6585fca45be99329b9be2a6e199e2"} Mar 17 00:48:26 crc kubenswrapper[4755]: I0317 00:48:26.309347 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pv6h6" podStartSLOduration=2.772645218 podStartE2EDuration="9.309329314s" podCreationTimestamp="2026-03-17 00:48:17 +0000 UTC" firstStartedPulling="2026-03-17 00:48:19.162120903 +0000 UTC m=+1573.921573186" lastFinishedPulling="2026-03-17 00:48:25.698804989 +0000 UTC m=+1580.458257282" observedRunningTime="2026-03-17 00:48:26.305998775 +0000 UTC m=+1581.065451058" watchObservedRunningTime="2026-03-17 00:48:26.309329314 +0000 UTC m=+1581.068781597" Mar 17 00:48:27 crc kubenswrapper[4755]: I0317 00:48:27.956628 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pv6h6" Mar 17 00:48:27 crc kubenswrapper[4755]: I0317 00:48:27.956963 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pv6h6" Mar 17 00:48:29 crc kubenswrapper[4755]: I0317 00:48:29.013727 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-pv6h6" podUID="858527df-ec56-4a19-b64f-735380803797" containerName="registry-server" probeResult="failure" output=< Mar 17 00:48:29 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 00:48:29 crc kubenswrapper[4755]: > Mar 17 00:48:35 crc kubenswrapper[4755]: I0317 00:48:35.684041 4755 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 17 00:48:38 crc kubenswrapper[4755]: I0317 00:48:38.058337 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pv6h6" Mar 17 00:48:38 crc kubenswrapper[4755]: I0317 00:48:38.136359 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pv6h6" Mar 17 00:48:38 crc kubenswrapper[4755]: I0317 00:48:38.288482 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pv6h6"] Mar 17 00:48:38 crc kubenswrapper[4755]: I0317 00:48:38.318533 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-95x6m"] Mar 17 00:48:38 crc kubenswrapper[4755]: I0317 00:48:38.318829 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-95x6m" podUID="e6d94368-96d9-44da-ac5a-af29d6b0d3df" containerName="registry-server" containerID="cri-o://4165c4b2fafd06a8413712cd22b9fcaa07b4d3120681eb084fdbb8257f4a13c7" gracePeriod=2 Mar 17 00:48:38 crc kubenswrapper[4755]: I0317 00:48:38.480822 4755 generic.go:334] "Generic (PLEG): container finished" podID="e6d94368-96d9-44da-ac5a-af29d6b0d3df" containerID="4165c4b2fafd06a8413712cd22b9fcaa07b4d3120681eb084fdbb8257f4a13c7" exitCode=0 Mar 17 00:48:38 crc kubenswrapper[4755]: I0317 00:48:38.481332 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95x6m" event={"ID":"e6d94368-96d9-44da-ac5a-af29d6b0d3df","Type":"ContainerDied","Data":"4165c4b2fafd06a8413712cd22b9fcaa07b4d3120681eb084fdbb8257f4a13c7"} Mar 17 00:48:38 crc kubenswrapper[4755]: I0317 00:48:38.871966 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-95x6m" Mar 17 00:48:39 crc kubenswrapper[4755]: I0317 00:48:39.036499 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d94368-96d9-44da-ac5a-af29d6b0d3df-catalog-content\") pod \"e6d94368-96d9-44da-ac5a-af29d6b0d3df\" (UID: \"e6d94368-96d9-44da-ac5a-af29d6b0d3df\") " Mar 17 00:48:39 crc kubenswrapper[4755]: I0317 00:48:39.036575 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km7bk\" (UniqueName: \"kubernetes.io/projected/e6d94368-96d9-44da-ac5a-af29d6b0d3df-kube-api-access-km7bk\") pod \"e6d94368-96d9-44da-ac5a-af29d6b0d3df\" (UID: \"e6d94368-96d9-44da-ac5a-af29d6b0d3df\") " Mar 17 00:48:39 crc kubenswrapper[4755]: I0317 00:48:39.036737 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d94368-96d9-44da-ac5a-af29d6b0d3df-utilities\") pod \"e6d94368-96d9-44da-ac5a-af29d6b0d3df\" (UID: \"e6d94368-96d9-44da-ac5a-af29d6b0d3df\") " Mar 17 00:48:39 crc kubenswrapper[4755]: I0317 00:48:39.038144 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6d94368-96d9-44da-ac5a-af29d6b0d3df-utilities" (OuterVolumeSpecName: "utilities") pod "e6d94368-96d9-44da-ac5a-af29d6b0d3df" (UID: "e6d94368-96d9-44da-ac5a-af29d6b0d3df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:48:39 crc kubenswrapper[4755]: I0317 00:48:39.043845 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d94368-96d9-44da-ac5a-af29d6b0d3df-kube-api-access-km7bk" (OuterVolumeSpecName: "kube-api-access-km7bk") pod "e6d94368-96d9-44da-ac5a-af29d6b0d3df" (UID: "e6d94368-96d9-44da-ac5a-af29d6b0d3df"). InnerVolumeSpecName "kube-api-access-km7bk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:48:39 crc kubenswrapper[4755]: I0317 00:48:39.128642 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6d94368-96d9-44da-ac5a-af29d6b0d3df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6d94368-96d9-44da-ac5a-af29d6b0d3df" (UID: "e6d94368-96d9-44da-ac5a-af29d6b0d3df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:48:39 crc kubenswrapper[4755]: I0317 00:48:39.138954 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d94368-96d9-44da-ac5a-af29d6b0d3df-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:39 crc kubenswrapper[4755]: I0317 00:48:39.138993 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d94368-96d9-44da-ac5a-af29d6b0d3df-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:39 crc kubenswrapper[4755]: I0317 00:48:39.139004 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km7bk\" (UniqueName: \"kubernetes.io/projected/e6d94368-96d9-44da-ac5a-af29d6b0d3df-kube-api-access-km7bk\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:39 crc kubenswrapper[4755]: I0317 00:48:39.492247 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-95x6m" Mar 17 00:48:39 crc kubenswrapper[4755]: I0317 00:48:39.493592 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95x6m" event={"ID":"e6d94368-96d9-44da-ac5a-af29d6b0d3df","Type":"ContainerDied","Data":"3b72e398562c5d08f6b94cdde19fca2a8ded851782286e2245ac04c4bdfd488a"} Mar 17 00:48:39 crc kubenswrapper[4755]: I0317 00:48:39.493651 4755 scope.go:117] "RemoveContainer" containerID="4165c4b2fafd06a8413712cd22b9fcaa07b4d3120681eb084fdbb8257f4a13c7" Mar 17 00:48:39 crc kubenswrapper[4755]: I0317 00:48:39.533874 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-95x6m"] Mar 17 00:48:39 crc kubenswrapper[4755]: I0317 00:48:39.548954 4755 scope.go:117] "RemoveContainer" containerID="6e39b2a677bfcb03f7705830696cefb2bdb41b0dda8197067d34f9ad976fc623" Mar 17 00:48:39 crc kubenswrapper[4755]: I0317 00:48:39.549725 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-95x6m"] Mar 17 00:48:39 crc kubenswrapper[4755]: I0317 00:48:39.577163 4755 scope.go:117] "RemoveContainer" containerID="fe3349d5e4e7a6d140f96ac22d7c4cae9134844b89cd34d59fbadc280b59c5d3" Mar 17 00:48:40 crc kubenswrapper[4755]: I0317 00:48:40.263773 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d94368-96d9-44da-ac5a-af29d6b0d3df" path="/var/lib/kubelet/pods/e6d94368-96d9-44da-ac5a-af29d6b0d3df/volumes" Mar 17 00:48:47 crc kubenswrapper[4755]: I0317 00:48:47.898392 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-gdjsh"] Mar 17 00:48:47 crc kubenswrapper[4755]: I0317 00:48:47.916151 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-gdjsh"] Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.001334 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-c6hnq"] Mar 17 
00:48:48 crc kubenswrapper[4755]: E0317 00:48:48.002073 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d94368-96d9-44da-ac5a-af29d6b0d3df" containerName="registry-server" Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.002099 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d94368-96d9-44da-ac5a-af29d6b0d3df" containerName="registry-server" Mar 17 00:48:48 crc kubenswrapper[4755]: E0317 00:48:48.002148 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d94368-96d9-44da-ac5a-af29d6b0d3df" containerName="extract-content" Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.002163 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d94368-96d9-44da-ac5a-af29d6b0d3df" containerName="extract-content" Mar 17 00:48:48 crc kubenswrapper[4755]: E0317 00:48:48.002199 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d94368-96d9-44da-ac5a-af29d6b0d3df" containerName="extract-utilities" Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.002213 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d94368-96d9-44da-ac5a-af29d6b0d3df" containerName="extract-utilities" Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.002649 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d94368-96d9-44da-ac5a-af29d6b0d3df" containerName="registry-server" Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.003945 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-c6hnq" Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.017085 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-c6hnq"] Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.201514 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6021f21e-d701-4372-aac5-e70591f71906-combined-ca-bundle\") pod \"heat-db-sync-c6hnq\" (UID: \"6021f21e-d701-4372-aac5-e70591f71906\") " pod="openstack/heat-db-sync-c6hnq" Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.202418 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wzpk\" (UniqueName: \"kubernetes.io/projected/6021f21e-d701-4372-aac5-e70591f71906-kube-api-access-6wzpk\") pod \"heat-db-sync-c6hnq\" (UID: \"6021f21e-d701-4372-aac5-e70591f71906\") " pod="openstack/heat-db-sync-c6hnq" Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.202794 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6021f21e-d701-4372-aac5-e70591f71906-config-data\") pod \"heat-db-sync-c6hnq\" (UID: \"6021f21e-d701-4372-aac5-e70591f71906\") " pod="openstack/heat-db-sync-c6hnq" Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.265252 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2315f493-9035-4185-b615-e7eed6a246ea" path="/var/lib/kubelet/pods/2315f493-9035-4185-b615-e7eed6a246ea/volumes" Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.305482 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6021f21e-d701-4372-aac5-e70591f71906-combined-ca-bundle\") pod \"heat-db-sync-c6hnq\" (UID: \"6021f21e-d701-4372-aac5-e70591f71906\") " 
pod="openstack/heat-db-sync-c6hnq" Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.305591 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wzpk\" (UniqueName: \"kubernetes.io/projected/6021f21e-d701-4372-aac5-e70591f71906-kube-api-access-6wzpk\") pod \"heat-db-sync-c6hnq\" (UID: \"6021f21e-d701-4372-aac5-e70591f71906\") " pod="openstack/heat-db-sync-c6hnq" Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.305664 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6021f21e-d701-4372-aac5-e70591f71906-config-data\") pod \"heat-db-sync-c6hnq\" (UID: \"6021f21e-d701-4372-aac5-e70591f71906\") " pod="openstack/heat-db-sync-c6hnq" Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.314811 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6021f21e-d701-4372-aac5-e70591f71906-config-data\") pod \"heat-db-sync-c6hnq\" (UID: \"6021f21e-d701-4372-aac5-e70591f71906\") " pod="openstack/heat-db-sync-c6hnq" Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.314830 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6021f21e-d701-4372-aac5-e70591f71906-combined-ca-bundle\") pod \"heat-db-sync-c6hnq\" (UID: \"6021f21e-d701-4372-aac5-e70591f71906\") " pod="openstack/heat-db-sync-c6hnq" Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.334039 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wzpk\" (UniqueName: \"kubernetes.io/projected/6021f21e-d701-4372-aac5-e70591f71906-kube-api-access-6wzpk\") pod \"heat-db-sync-c6hnq\" (UID: \"6021f21e-d701-4372-aac5-e70591f71906\") " pod="openstack/heat-db-sync-c6hnq" Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.358754 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-c6hnq" Mar 17 00:48:48 crc kubenswrapper[4755]: I0317 00:48:48.926405 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-c6hnq"] Mar 17 00:48:49 crc kubenswrapper[4755]: I0317 00:48:49.652357 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-c6hnq" event={"ID":"6021f21e-d701-4372-aac5-e70591f71906","Type":"ContainerStarted","Data":"5404e794f8b289eb77b82d9f368d56f0993d1de3841e7a790eec5d569010a775"} Mar 17 00:48:50 crc kubenswrapper[4755]: I0317 00:48:50.009111 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:48:50 crc kubenswrapper[4755]: I0317 00:48:50.009763 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerName="ceilometer-central-agent" containerID="cri-o://28661ddf61d440162d2f2f2ae23747bcd42a631b38a45b6f76abdd1401619b09" gracePeriod=30 Mar 17 00:48:50 crc kubenswrapper[4755]: I0317 00:48:50.010094 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerName="proxy-httpd" containerID="cri-o://88d1a86a8784e93979f5d24fd9481f9231dd5948bb6bacc52b087f29d63fb36f" gracePeriod=30 Mar 17 00:48:50 crc kubenswrapper[4755]: I0317 00:48:50.010145 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerName="sg-core" containerID="cri-o://6466bb22812da1cddd6481a2ac1a6e75770aff4b8ccf86a209615d03d95bb492" gracePeriod=30 Mar 17 00:48:50 crc kubenswrapper[4755]: I0317 00:48:50.010166 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerName="ceilometer-notification-agent" 
containerID="cri-o://904638d6542cec9f48e476bd1a78979315e04950796b0f6b5a0422c7992695e6" gracePeriod=30 Mar 17 00:48:50 crc kubenswrapper[4755]: I0317 00:48:50.488682 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 00:48:50 crc kubenswrapper[4755]: I0317 00:48:50.608759 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 00:48:50 crc kubenswrapper[4755]: I0317 00:48:50.675601 4755 generic.go:334] "Generic (PLEG): container finished" podID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerID="88d1a86a8784e93979f5d24fd9481f9231dd5948bb6bacc52b087f29d63fb36f" exitCode=0 Mar 17 00:48:50 crc kubenswrapper[4755]: I0317 00:48:50.675649 4755 generic.go:334] "Generic (PLEG): container finished" podID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerID="6466bb22812da1cddd6481a2ac1a6e75770aff4b8ccf86a209615d03d95bb492" exitCode=2 Mar 17 00:48:50 crc kubenswrapper[4755]: I0317 00:48:50.675658 4755 generic.go:334] "Generic (PLEG): container finished" podID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerID="28661ddf61d440162d2f2f2ae23747bcd42a631b38a45b6f76abdd1401619b09" exitCode=0 Mar 17 00:48:50 crc kubenswrapper[4755]: I0317 00:48:50.675676 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45e4fad4-5277-41a9-a93d-63f22e1c2a24","Type":"ContainerDied","Data":"88d1a86a8784e93979f5d24fd9481f9231dd5948bb6bacc52b087f29d63fb36f"} Mar 17 00:48:50 crc kubenswrapper[4755]: I0317 00:48:50.675700 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45e4fad4-5277-41a9-a93d-63f22e1c2a24","Type":"ContainerDied","Data":"6466bb22812da1cddd6481a2ac1a6e75770aff4b8ccf86a209615d03d95bb492"} Mar 17 00:48:50 crc kubenswrapper[4755]: I0317 00:48:50.675730 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"45e4fad4-5277-41a9-a93d-63f22e1c2a24","Type":"ContainerDied","Data":"28661ddf61d440162d2f2f2ae23747bcd42a631b38a45b6f76abdd1401619b09"} Mar 17 00:48:54 crc kubenswrapper[4755]: I0317 00:48:54.386999 4755 scope.go:117] "RemoveContainer" containerID="e9fea430d485f9b2eaa3aee2eb7b8e070f9bea45ff99708e98a7401ea0e25d3a" Mar 17 00:48:54 crc kubenswrapper[4755]: I0317 00:48:54.767181 4755 generic.go:334] "Generic (PLEG): container finished" podID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerID="904638d6542cec9f48e476bd1a78979315e04950796b0f6b5a0422c7992695e6" exitCode=0 Mar 17 00:48:54 crc kubenswrapper[4755]: I0317 00:48:54.767229 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45e4fad4-5277-41a9-a93d-63f22e1c2a24","Type":"ContainerDied","Data":"904638d6542cec9f48e476bd1a78979315e04950796b0f6b5a0422c7992695e6"} Mar 17 00:48:54 crc kubenswrapper[4755]: I0317 00:48:54.833051 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="890b1d99-1a82-424e-981b-5c8ea1ae26ee" containerName="rabbitmq" containerID="cri-o://eee5d0940b0de59b5ac4cea46c48aeb58e2fa5b0d22149beec7b06d3f7042f53" gracePeriod=604796 Mar 17 00:48:54 crc kubenswrapper[4755]: I0317 00:48:54.877655 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" containerName="rabbitmq" containerID="cri-o://5815c11e0dc959d875dfceae2da8d15dffa64d7f7fc6ddae7f9664c770154efb" gracePeriod=604796 Mar 17 00:48:54 crc kubenswrapper[4755]: I0317 00:48:54.908826 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.036489 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqdql\" (UniqueName: \"kubernetes.io/projected/45e4fad4-5277-41a9-a93d-63f22e1c2a24-kube-api-access-pqdql\") pod \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.036601 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45e4fad4-5277-41a9-a93d-63f22e1c2a24-run-httpd\") pod \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.036694 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45e4fad4-5277-41a9-a93d-63f22e1c2a24-log-httpd\") pod \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.036720 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-scripts\") pod \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.036771 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-combined-ca-bundle\") pod \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.036800 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-config-data\") pod \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.036891 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-sg-core-conf-yaml\") pod \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.036929 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-ceilometer-tls-certs\") pod \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\" (UID: \"45e4fad4-5277-41a9-a93d-63f22e1c2a24\") " Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.038757 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e4fad4-5277-41a9-a93d-63f22e1c2a24-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "45e4fad4-5277-41a9-a93d-63f22e1c2a24" (UID: "45e4fad4-5277-41a9-a93d-63f22e1c2a24"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.039079 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e4fad4-5277-41a9-a93d-63f22e1c2a24-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "45e4fad4-5277-41a9-a93d-63f22e1c2a24" (UID: "45e4fad4-5277-41a9-a93d-63f22e1c2a24"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.042593 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-scripts" (OuterVolumeSpecName: "scripts") pod "45e4fad4-5277-41a9-a93d-63f22e1c2a24" (UID: "45e4fad4-5277-41a9-a93d-63f22e1c2a24"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.044548 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e4fad4-5277-41a9-a93d-63f22e1c2a24-kube-api-access-pqdql" (OuterVolumeSpecName: "kube-api-access-pqdql") pod "45e4fad4-5277-41a9-a93d-63f22e1c2a24" (UID: "45e4fad4-5277-41a9-a93d-63f22e1c2a24"). InnerVolumeSpecName "kube-api-access-pqdql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.064982 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "45e4fad4-5277-41a9-a93d-63f22e1c2a24" (UID: "45e4fad4-5277-41a9-a93d-63f22e1c2a24"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.126084 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "45e4fad4-5277-41a9-a93d-63f22e1c2a24" (UID: "45e4fad4-5277-41a9-a93d-63f22e1c2a24"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.146596 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45e4fad4-5277-41a9-a93d-63f22e1c2a24-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.146627 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.146635 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.146647 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.146677 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqdql\" (UniqueName: \"kubernetes.io/projected/45e4fad4-5277-41a9-a93d-63f22e1c2a24-kube-api-access-pqdql\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.146707 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45e4fad4-5277-41a9-a93d-63f22e1c2a24-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.159484 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45e4fad4-5277-41a9-a93d-63f22e1c2a24" (UID: 
"45e4fad4-5277-41a9-a93d-63f22e1c2a24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.194653 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-config-data" (OuterVolumeSpecName: "config-data") pod "45e4fad4-5277-41a9-a93d-63f22e1c2a24" (UID: "45e4fad4-5277-41a9-a93d-63f22e1c2a24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.249338 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.249384 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e4fad4-5277-41a9-a93d-63f22e1c2a24-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.792342 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45e4fad4-5277-41a9-a93d-63f22e1c2a24","Type":"ContainerDied","Data":"541549918db1132ff08ee56b3e4ec9faf79e6d7d6deb76096b0dacf47a789d8f"} Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.792729 4755 scope.go:117] "RemoveContainer" containerID="88d1a86a8784e93979f5d24fd9481f9231dd5948bb6bacc52b087f29d63fb36f" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.792915 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.850220 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.874643 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.877415 4755 scope.go:117] "RemoveContainer" containerID="6466bb22812da1cddd6481a2ac1a6e75770aff4b8ccf86a209615d03d95bb492" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.932630 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:48:55 crc kubenswrapper[4755]: E0317 00:48:55.933101 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerName="ceilometer-central-agent" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.933120 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerName="ceilometer-central-agent" Mar 17 00:48:55 crc kubenswrapper[4755]: E0317 00:48:55.933138 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerName="proxy-httpd" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.933145 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerName="proxy-httpd" Mar 17 00:48:55 crc kubenswrapper[4755]: E0317 00:48:55.933172 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerName="sg-core" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.933178 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerName="sg-core" Mar 17 00:48:55 crc kubenswrapper[4755]: E0317 00:48:55.933191 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerName="ceilometer-notification-agent" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.933197 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerName="ceilometer-notification-agent" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.933419 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerName="proxy-httpd" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.933458 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerName="ceilometer-notification-agent" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.933473 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerName="sg-core" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.933486 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" containerName="ceilometer-central-agent" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.935791 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.938125 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.938410 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.938702 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.946719 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.950906 4755 scope.go:117] "RemoveContainer" containerID="904638d6542cec9f48e476bd1a78979315e04950796b0f6b5a0422c7992695e6" Mar 17 00:48:55 crc kubenswrapper[4755]: I0317 00:48:55.970204 4755 scope.go:117] "RemoveContainer" containerID="28661ddf61d440162d2f2f2ae23747bcd42a631b38a45b6f76abdd1401619b09" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.076331 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-run-httpd\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.076401 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-config-data\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.076431 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.076464 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-log-httpd\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.076500 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.076578 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67hwj\" (UniqueName: \"kubernetes.io/projected/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-kube-api-access-67hwj\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.076611 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.076638 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-scripts\") pod \"ceilometer-0\" (UID: 
\"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.180307 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67hwj\" (UniqueName: \"kubernetes.io/projected/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-kube-api-access-67hwj\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.180622 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.180740 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-scripts\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.180885 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-run-httpd\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.180991 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-config-data\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.181047 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.181141 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-log-httpd\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.181205 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.182296 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-run-httpd\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.183013 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-log-httpd\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.190891 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc 
kubenswrapper[4755]: I0317 00:48:56.195275 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.198858 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.203534 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-config-data\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.211728 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67hwj\" (UniqueName: \"kubernetes.io/projected/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-kube-api-access-67hwj\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.226285 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-scripts\") pod \"ceilometer-0\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") " pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.266213 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.272780 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e4fad4-5277-41a9-a93d-63f22e1c2a24" path="/var/lib/kubelet/pods/45e4fad4-5277-41a9-a93d-63f22e1c2a24/volumes" Mar 17 00:48:56 crc kubenswrapper[4755]: I0317 00:48:56.808750 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 00:48:57 crc kubenswrapper[4755]: I0317 00:48:57.842785 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e","Type":"ContainerStarted","Data":"aa492a567e9c8341e9bf5c53c076ca5aa8d27a85fd0e1f73fc1f187b6c92400f"} Mar 17 00:48:58 crc kubenswrapper[4755]: I0317 00:48:58.665753 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:48:58 crc kubenswrapper[4755]: I0317 00:48:58.665811 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:49:00 crc kubenswrapper[4755]: I0317 00:49:00.509757 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.121:5671: connect: connection refused" Mar 17 00:49:00 crc kubenswrapper[4755]: I0317 00:49:00.808737 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" 
podUID="890b1d99-1a82-424e-981b-5c8ea1ae26ee" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.122:5671: connect: connection refused" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.525003 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.566614 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-plugins\") pod \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.566686 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-confd\") pod \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.566726 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-config-data\") pod \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.566765 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-tls\") pod \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.566839 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-erlang-cookie\") pod \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.566883 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/890b1d99-1a82-424e-981b-5c8ea1ae26ee-pod-info\") pod \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.566924 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-plugins-conf\") pod \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.566951 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/890b1d99-1a82-424e-981b-5c8ea1ae26ee-erlang-cookie-secret\") pod \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.568009 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbnb7\" (UniqueName: \"kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-kube-api-access-lbnb7\") pod \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.568179 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.568272 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-server-conf\") pod \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\" (UID: \"890b1d99-1a82-424e-981b-5c8ea1ae26ee\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.570002 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "890b1d99-1a82-424e-981b-5c8ea1ae26ee" (UID: "890b1d99-1a82-424e-981b-5c8ea1ae26ee"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.570787 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "890b1d99-1a82-424e-981b-5c8ea1ae26ee" (UID: "890b1d99-1a82-424e-981b-5c8ea1ae26ee"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.571606 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "890b1d99-1a82-424e-981b-5c8ea1ae26ee" (UID: "890b1d99-1a82-424e-981b-5c8ea1ae26ee"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.584093 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "890b1d99-1a82-424e-981b-5c8ea1ae26ee" (UID: "890b1d99-1a82-424e-981b-5c8ea1ae26ee"). 
InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.584094 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/890b1d99-1a82-424e-981b-5c8ea1ae26ee-pod-info" (OuterVolumeSpecName: "pod-info") pod "890b1d99-1a82-424e-981b-5c8ea1ae26ee" (UID: "890b1d99-1a82-424e-981b-5c8ea1ae26ee"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.606636 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "890b1d99-1a82-424e-981b-5c8ea1ae26ee" (UID: "890b1d99-1a82-424e-981b-5c8ea1ae26ee"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.606762 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-kube-api-access-lbnb7" (OuterVolumeSpecName: "kube-api-access-lbnb7") pod "890b1d99-1a82-424e-981b-5c8ea1ae26ee" (UID: "890b1d99-1a82-424e-981b-5c8ea1ae26ee"). InnerVolumeSpecName "kube-api-access-lbnb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.607859 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/890b1d99-1a82-424e-981b-5c8ea1ae26ee-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "890b1d99-1a82-424e-981b-5c8ea1ae26ee" (UID: "890b1d99-1a82-424e-981b-5c8ea1ae26ee"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.640532 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-config-data" (OuterVolumeSpecName: "config-data") pod "890b1d99-1a82-424e-981b-5c8ea1ae26ee" (UID: "890b1d99-1a82-424e-981b-5c8ea1ae26ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.672779 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbnb7\" (UniqueName: \"kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-kube-api-access-lbnb7\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.672817 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.672831 4755 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.672840 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.672848 4755 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.672856 4755 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.672865 4755 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/890b1d99-1a82-424e-981b-5c8ea1ae26ee-pod-info\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.672874 4755 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.672882 4755 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/890b1d99-1a82-424e-981b-5c8ea1ae26ee-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.674838 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-server-conf" (OuterVolumeSpecName: "server-conf") pod "890b1d99-1a82-424e-981b-5c8ea1ae26ee" (UID: "890b1d99-1a82-424e-981b-5c8ea1ae26ee"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.678310 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.707311 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.773920 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-erlang-cookie\") pod \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.773983 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-pod-info\") pod \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.774092 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwmvb\" (UniqueName: \"kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-kube-api-access-pwmvb\") pod \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.774135 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-confd\") pod \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.774156 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.774199 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-plugins-conf\") pod \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.774271 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-erlang-cookie-secret\") pod \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.774292 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-tls\") pod \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.774382 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-server-conf\") pod \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.774398 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-config-data\") pod \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.774447 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-plugins\") pod \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\" (UID: \"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a\") " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.774864 4755 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/890b1d99-1a82-424e-981b-5c8ea1ae26ee-server-conf\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.774881 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.775539 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" (UID: "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.776886 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" (UID: "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.776927 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" (UID: "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a"). 
InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.780090 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-pod-info" (OuterVolumeSpecName: "pod-info") pod "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" (UID: "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.781748 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" (UID: "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.785394 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-kube-api-access-pwmvb" (OuterVolumeSpecName: "kube-api-access-pwmvb") pod "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" (UID: "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a"). InnerVolumeSpecName "kube-api-access-pwmvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.788652 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" (UID: "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.795575 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" (UID: "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.825252 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-config-data" (OuterVolumeSpecName: "config-data") pod "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" (UID: "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.836070 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "890b1d99-1a82-424e-981b-5c8ea1ae26ee" (UID: "890b1d99-1a82-424e-981b-5c8ea1ae26ee"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.876793 4755 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.876836 4755 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-pod-info\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.876847 4755 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/890b1d99-1a82-424e-981b-5c8ea1ae26ee-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.876856 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwmvb\" (UniqueName: \"kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-kube-api-access-pwmvb\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.876885 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.876894 4755 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.876904 4755 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 
00:49:01.876913 4755 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.876921 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.876929 4755 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.881959 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-server-conf" (OuterVolumeSpecName: "server-conf") pod "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" (UID: "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.892901 4755 generic.go:334] "Generic (PLEG): container finished" podID="890b1d99-1a82-424e-981b-5c8ea1ae26ee" containerID="eee5d0940b0de59b5ac4cea46c48aeb58e2fa5b0d22149beec7b06d3f7042f53" exitCode=0 Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.892957 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"890b1d99-1a82-424e-981b-5c8ea1ae26ee","Type":"ContainerDied","Data":"eee5d0940b0de59b5ac4cea46c48aeb58e2fa5b0d22149beec7b06d3f7042f53"} Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.892984 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"890b1d99-1a82-424e-981b-5c8ea1ae26ee","Type":"ContainerDied","Data":"7efff3d296e109a619ecefc647a2b5bc977f4f5dba6b780965f75378e5de7816"} Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.893000 4755 scope.go:117] "RemoveContainer" containerID="eee5d0940b0de59b5ac4cea46c48aeb58e2fa5b0d22149beec7b06d3f7042f53" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.893408 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.904465 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.911387 4755 generic.go:334] "Generic (PLEG): container finished" podID="1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" containerID="5815c11e0dc959d875dfceae2da8d15dffa64d7f7fc6ddae7f9664c770154efb" exitCode=0 Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.911428 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a","Type":"ContainerDied","Data":"5815c11e0dc959d875dfceae2da8d15dffa64d7f7fc6ddae7f9664c770154efb"} Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.911504 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1b3bb7d6-8094-4cef-a05b-6bad26c2d14a","Type":"ContainerDied","Data":"0e0a99034e2a6c444ebe3cb56083f1b11c8af84a6f3f10be481b79351ce70dad"} Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.911556 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.916904 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" (UID: "1b3bb7d6-8094-4cef-a05b-6bad26c2d14a"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.947837 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.969169 4755 scope.go:117] "RemoveContainer" containerID="fb6d6e35774e45425e28ec26640d32c6a009930eb0767171cf6f323aec4e4750" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.978575 4755 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-server-conf\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.978607 4755 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.978617 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.984222 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.997542 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 00:49:01 crc kubenswrapper[4755]: E0317 00:49:01.998027 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890b1d99-1a82-424e-981b-5c8ea1ae26ee" containerName="setup-container" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.998045 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="890b1d99-1a82-424e-981b-5c8ea1ae26ee" containerName="setup-container" Mar 17 00:49:01 crc kubenswrapper[4755]: E0317 00:49:01.998069 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" containerName="rabbitmq" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.998075 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" containerName="rabbitmq" Mar 17 00:49:01 crc kubenswrapper[4755]: E0317 00:49:01.998090 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" containerName="setup-container" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.998096 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" containerName="setup-container" Mar 17 00:49:01 crc kubenswrapper[4755]: E0317 00:49:01.998132 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890b1d99-1a82-424e-981b-5c8ea1ae26ee" containerName="rabbitmq" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.998145 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="890b1d99-1a82-424e-981b-5c8ea1ae26ee" containerName="rabbitmq" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.998352 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="890b1d99-1a82-424e-981b-5c8ea1ae26ee" containerName="rabbitmq" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.998379 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" containerName="rabbitmq" Mar 17 00:49:01 crc kubenswrapper[4755]: I0317 00:49:01.999700 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.002862 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.003138 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.003986 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7hx8r" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.005890 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.005981 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.005980 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.006028 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.017571 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.038527 4755 scope.go:117] "RemoveContainer" containerID="eee5d0940b0de59b5ac4cea46c48aeb58e2fa5b0d22149beec7b06d3f7042f53" Mar 17 00:49:02 crc kubenswrapper[4755]: E0317 00:49:02.039505 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eee5d0940b0de59b5ac4cea46c48aeb58e2fa5b0d22149beec7b06d3f7042f53\": container with ID starting with eee5d0940b0de59b5ac4cea46c48aeb58e2fa5b0d22149beec7b06d3f7042f53 not found: ID does not exist" 
containerID="eee5d0940b0de59b5ac4cea46c48aeb58e2fa5b0d22149beec7b06d3f7042f53" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.039571 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eee5d0940b0de59b5ac4cea46c48aeb58e2fa5b0d22149beec7b06d3f7042f53"} err="failed to get container status \"eee5d0940b0de59b5ac4cea46c48aeb58e2fa5b0d22149beec7b06d3f7042f53\": rpc error: code = NotFound desc = could not find container \"eee5d0940b0de59b5ac4cea46c48aeb58e2fa5b0d22149beec7b06d3f7042f53\": container with ID starting with eee5d0940b0de59b5ac4cea46c48aeb58e2fa5b0d22149beec7b06d3f7042f53 not found: ID does not exist" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.039618 4755 scope.go:117] "RemoveContainer" containerID="fb6d6e35774e45425e28ec26640d32c6a009930eb0767171cf6f323aec4e4750" Mar 17 00:49:02 crc kubenswrapper[4755]: E0317 00:49:02.040091 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb6d6e35774e45425e28ec26640d32c6a009930eb0767171cf6f323aec4e4750\": container with ID starting with fb6d6e35774e45425e28ec26640d32c6a009930eb0767171cf6f323aec4e4750 not found: ID does not exist" containerID="fb6d6e35774e45425e28ec26640d32c6a009930eb0767171cf6f323aec4e4750" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.040124 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb6d6e35774e45425e28ec26640d32c6a009930eb0767171cf6f323aec4e4750"} err="failed to get container status \"fb6d6e35774e45425e28ec26640d32c6a009930eb0767171cf6f323aec4e4750\": rpc error: code = NotFound desc = could not find container \"fb6d6e35774e45425e28ec26640d32c6a009930eb0767171cf6f323aec4e4750\": container with ID starting with fb6d6e35774e45425e28ec26640d32c6a009930eb0767171cf6f323aec4e4750 not found: ID does not exist" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.040143 4755 scope.go:117] 
"RemoveContainer" containerID="5815c11e0dc959d875dfceae2da8d15dffa64d7f7fc6ddae7f9664c770154efb" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.070532 4755 scope.go:117] "RemoveContainer" containerID="e8f25e9b1decd539f90990899b33568e0a1f74100a38ffc2207ef6a48f57530f" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.181656 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/232eeb12-0802-4513-83e2-66cc0b1b398b-config-data\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.181946 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/232eeb12-0802-4513-83e2-66cc0b1b398b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.181999 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/232eeb12-0802-4513-83e2-66cc0b1b398b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.182045 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp2nf\" (UniqueName: \"kubernetes.io/projected/232eeb12-0802-4513-83e2-66cc0b1b398b-kube-api-access-vp2nf\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.182073 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/232eeb12-0802-4513-83e2-66cc0b1b398b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.183108 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.183195 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/232eeb12-0802-4513-83e2-66cc0b1b398b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.183229 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/232eeb12-0802-4513-83e2-66cc0b1b398b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.183301 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/232eeb12-0802-4513-83e2-66cc0b1b398b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.183365 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/232eeb12-0802-4513-83e2-66cc0b1b398b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.183402 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/232eeb12-0802-4513-83e2-66cc0b1b398b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.189234 4755 scope.go:117] "RemoveContainer" containerID="5815c11e0dc959d875dfceae2da8d15dffa64d7f7fc6ddae7f9664c770154efb" Mar 17 00:49:02 crc kubenswrapper[4755]: E0317 00:49:02.189848 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5815c11e0dc959d875dfceae2da8d15dffa64d7f7fc6ddae7f9664c770154efb\": container with ID starting with 5815c11e0dc959d875dfceae2da8d15dffa64d7f7fc6ddae7f9664c770154efb not found: ID does not exist" containerID="5815c11e0dc959d875dfceae2da8d15dffa64d7f7fc6ddae7f9664c770154efb" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.189894 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5815c11e0dc959d875dfceae2da8d15dffa64d7f7fc6ddae7f9664c770154efb"} err="failed to get container status \"5815c11e0dc959d875dfceae2da8d15dffa64d7f7fc6ddae7f9664c770154efb\": rpc error: code = NotFound desc = could not find container \"5815c11e0dc959d875dfceae2da8d15dffa64d7f7fc6ddae7f9664c770154efb\": container with ID starting with 5815c11e0dc959d875dfceae2da8d15dffa64d7f7fc6ddae7f9664c770154efb not found: ID does not exist" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.189935 4755 scope.go:117] "RemoveContainer" containerID="e8f25e9b1decd539f90990899b33568e0a1f74100a38ffc2207ef6a48f57530f" 
Mar 17 00:49:02 crc kubenswrapper[4755]: E0317 00:49:02.190596 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8f25e9b1decd539f90990899b33568e0a1f74100a38ffc2207ef6a48f57530f\": container with ID starting with e8f25e9b1decd539f90990899b33568e0a1f74100a38ffc2207ef6a48f57530f not found: ID does not exist" containerID="e8f25e9b1decd539f90990899b33568e0a1f74100a38ffc2207ef6a48f57530f" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.190625 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8f25e9b1decd539f90990899b33568e0a1f74100a38ffc2207ef6a48f57530f"} err="failed to get container status \"e8f25e9b1decd539f90990899b33568e0a1f74100a38ffc2207ef6a48f57530f\": rpc error: code = NotFound desc = could not find container \"e8f25e9b1decd539f90990899b33568e0a1f74100a38ffc2207ef6a48f57530f\": container with ID starting with e8f25e9b1decd539f90990899b33568e0a1f74100a38ffc2207ef6a48f57530f not found: ID does not exist" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.278978 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="890b1d99-1a82-424e-981b-5c8ea1ae26ee" path="/var/lib/kubelet/pods/890b1d99-1a82-424e-981b-5c8ea1ae26ee/volumes" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.280031 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.280064 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.287279 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/232eeb12-0802-4513-83e2-66cc0b1b398b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 
00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.287316 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/232eeb12-0802-4513-83e2-66cc0b1b398b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.287352 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp2nf\" (UniqueName: \"kubernetes.io/projected/232eeb12-0802-4513-83e2-66cc0b1b398b-kube-api-access-vp2nf\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.287374 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/232eeb12-0802-4513-83e2-66cc0b1b398b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.287418 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.287480 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/232eeb12-0802-4513-83e2-66cc0b1b398b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.287509 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/232eeb12-0802-4513-83e2-66cc0b1b398b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.287565 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/232eeb12-0802-4513-83e2-66cc0b1b398b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.287625 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/232eeb12-0802-4513-83e2-66cc0b1b398b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.287661 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/232eeb12-0802-4513-83e2-66cc0b1b398b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.287685 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/232eeb12-0802-4513-83e2-66cc0b1b398b-config-data\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.288760 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/232eeb12-0802-4513-83e2-66cc0b1b398b-config-data\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " 
pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.289280 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/232eeb12-0802-4513-83e2-66cc0b1b398b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.288766 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.290551 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/232eeb12-0802-4513-83e2-66cc0b1b398b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.290701 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/232eeb12-0802-4513-83e2-66cc0b1b398b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.294546 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/232eeb12-0802-4513-83e2-66cc0b1b398b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.296789 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/232eeb12-0802-4513-83e2-66cc0b1b398b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.296922 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/232eeb12-0802-4513-83e2-66cc0b1b398b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.300168 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/232eeb12-0802-4513-83e2-66cc0b1b398b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.300814 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/232eeb12-0802-4513-83e2-66cc0b1b398b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.315473 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp2nf\" (UniqueName: \"kubernetes.io/projected/232eeb12-0802-4513-83e2-66cc0b1b398b-kube-api-access-vp2nf\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.326284 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.329293 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.331807 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.331985 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.332042 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.333475 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xxj7x" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.333603 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.333891 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.334090 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.337505 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"232eeb12-0802-4513-83e2-66cc0b1b398b\") " pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.339990 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.490577 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/c98112b6-4653-4c2e-a16e-6ddbd29fe526-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.490687 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c98112b6-4653-4c2e-a16e-6ddbd29fe526-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.490709 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c98112b6-4653-4c2e-a16e-6ddbd29fe526-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.490753 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c98112b6-4653-4c2e-a16e-6ddbd29fe526-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.490795 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c98112b6-4653-4c2e-a16e-6ddbd29fe526-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.490836 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c98112b6-4653-4c2e-a16e-6ddbd29fe526-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.490901 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.490924 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c98112b6-4653-4c2e-a16e-6ddbd29fe526-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.490952 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnrh2\" (UniqueName: \"kubernetes.io/projected/c98112b6-4653-4c2e-a16e-6ddbd29fe526-kube-api-access-vnrh2\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.490982 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c98112b6-4653-4c2e-a16e-6ddbd29fe526-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.491019 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/c98112b6-4653-4c2e-a16e-6ddbd29fe526-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.592913 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c98112b6-4653-4c2e-a16e-6ddbd29fe526-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.592995 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c98112b6-4653-4c2e-a16e-6ddbd29fe526-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.593011 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c98112b6-4653-4c2e-a16e-6ddbd29fe526-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.593043 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c98112b6-4653-4c2e-a16e-6ddbd29fe526-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.593075 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c98112b6-4653-4c2e-a16e-6ddbd29fe526-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.593103 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c98112b6-4653-4c2e-a16e-6ddbd29fe526-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.593154 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.593184 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c98112b6-4653-4c2e-a16e-6ddbd29fe526-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.593204 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnrh2\" (UniqueName: \"kubernetes.io/projected/c98112b6-4653-4c2e-a16e-6ddbd29fe526-kube-api-access-vnrh2\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.593227 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c98112b6-4653-4c2e-a16e-6ddbd29fe526-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 
crc kubenswrapper[4755]: I0317 00:49:02.593249 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c98112b6-4653-4c2e-a16e-6ddbd29fe526-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.593325 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.594949 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c98112b6-4653-4c2e-a16e-6ddbd29fe526-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.595915 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c98112b6-4653-4c2e-a16e-6ddbd29fe526-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.596282 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c98112b6-4653-4c2e-a16e-6ddbd29fe526-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.596399 4755 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c98112b6-4653-4c2e-a16e-6ddbd29fe526-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.597817 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c98112b6-4653-4c2e-a16e-6ddbd29fe526-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.599824 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c98112b6-4653-4c2e-a16e-6ddbd29fe526-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.600112 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c98112b6-4653-4c2e-a16e-6ddbd29fe526-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.601164 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c98112b6-4653-4c2e-a16e-6ddbd29fe526-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.609231 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c98112b6-4653-4c2e-a16e-6ddbd29fe526-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.614947 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnrh2\" (UniqueName: \"kubernetes.io/projected/c98112b6-4653-4c2e-a16e-6ddbd29fe526-kube-api-access-vnrh2\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.631186 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.636030 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c98112b6-4653-4c2e-a16e-6ddbd29fe526\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:02 crc kubenswrapper[4755]: I0317 00:49:02.714974 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:03 crc kubenswrapper[4755]: I0317 00:49:03.159070 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 00:49:03 crc kubenswrapper[4755]: I0317 00:49:03.368212 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 00:49:03 crc kubenswrapper[4755]: I0317 00:49:03.992319 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"232eeb12-0802-4513-83e2-66cc0b1b398b","Type":"ContainerStarted","Data":"4165496bf91891a8fd60d62ee72ef05a877d4159a1eb1d2d9ba6f82346fed430"} Mar 17 00:49:03 crc kubenswrapper[4755]: I0317 00:49:03.993820 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c98112b6-4653-4c2e-a16e-6ddbd29fe526","Type":"ContainerStarted","Data":"cfb9a1fd1a4d2fb9477a54930c3fa889b0d181372c5474cbe4e98b7be7d9b700"} Mar 17 00:49:04 crc kubenswrapper[4755]: I0317 00:49:04.270563 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b3bb7d6-8094-4cef-a05b-6bad26c2d14a" path="/var/lib/kubelet/pods/1b3bb7d6-8094-4cef-a05b-6bad26c2d14a/volumes" Mar 17 00:49:06 crc kubenswrapper[4755]: I0317 00:49:06.022317 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"232eeb12-0802-4513-83e2-66cc0b1b398b","Type":"ContainerStarted","Data":"8b4e37bc03fe1bcbaf0acc9da57c7616fed1afb063a923bd454f1cd61a5df43b"} Mar 17 00:49:06 crc kubenswrapper[4755]: I0317 00:49:06.026117 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c98112b6-4653-4c2e-a16e-6ddbd29fe526","Type":"ContainerStarted","Data":"b970ad4d16b58d0a44723d2612ef1311dcbfebffa299a9db06134c54640a005b"} Mar 17 00:49:08 crc kubenswrapper[4755]: I0317 00:49:08.837152 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-68df85789f-mrx65"] Mar 17 00:49:08 crc kubenswrapper[4755]: I0317 00:49:08.847241 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:08 crc kubenswrapper[4755]: I0317 00:49:08.849260 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 17 00:49:08 crc kubenswrapper[4755]: I0317 00:49:08.858412 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-mrx65"] Mar 17 00:49:08 crc kubenswrapper[4755]: I0317 00:49:08.949952 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-config\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:08 crc kubenswrapper[4755]: I0317 00:49:08.950010 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-dns-swift-storage-0\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:08 crc kubenswrapper[4755]: I0317 00:49:08.950043 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-ovsdbserver-sb\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:08 crc kubenswrapper[4755]: I0317 00:49:08.950081 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64rx8\" (UniqueName: 
\"kubernetes.io/projected/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-kube-api-access-64rx8\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:08 crc kubenswrapper[4755]: I0317 00:49:08.950112 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-openstack-edpm-ipam\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:08 crc kubenswrapper[4755]: I0317 00:49:08.950136 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-dns-svc\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:08 crc kubenswrapper[4755]: I0317 00:49:08.950214 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-ovsdbserver-nb\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:09 crc kubenswrapper[4755]: I0317 00:49:09.051747 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-ovsdbserver-nb\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:09 crc kubenswrapper[4755]: I0317 00:49:09.051834 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-config\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:09 crc kubenswrapper[4755]: I0317 00:49:09.051891 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-dns-swift-storage-0\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:09 crc kubenswrapper[4755]: I0317 00:49:09.051923 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-ovsdbserver-sb\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:09 crc kubenswrapper[4755]: I0317 00:49:09.052734 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-ovsdbserver-nb\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:09 crc kubenswrapper[4755]: I0317 00:49:09.052826 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-config\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:09 crc kubenswrapper[4755]: I0317 00:49:09.053034 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-ovsdbserver-sb\") pod 
\"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:09 crc kubenswrapper[4755]: I0317 00:49:09.053247 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-dns-swift-storage-0\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:09 crc kubenswrapper[4755]: I0317 00:49:09.053303 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64rx8\" (UniqueName: \"kubernetes.io/projected/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-kube-api-access-64rx8\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:09 crc kubenswrapper[4755]: I0317 00:49:09.053658 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-openstack-edpm-ipam\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:09 crc kubenswrapper[4755]: I0317 00:49:09.053691 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-dns-svc\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:09 crc kubenswrapper[4755]: I0317 00:49:09.054248 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-openstack-edpm-ipam\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: 
\"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:09 crc kubenswrapper[4755]: I0317 00:49:09.054492 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-dns-svc\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:09 crc kubenswrapper[4755]: I0317 00:49:09.069887 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64rx8\" (UniqueName: \"kubernetes.io/projected/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-kube-api-access-64rx8\") pod \"dnsmasq-dns-68df85789f-mrx65\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:09 crc kubenswrapper[4755]: I0317 00:49:09.165885 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:18 crc kubenswrapper[4755]: E0317 00:49:18.655887 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 17 00:49:18 crc kubenswrapper[4755]: E0317 00:49:18.656601 4755 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 17 00:49:18 crc kubenswrapper[4755]: E0317 00:49:18.656791 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7bh64fh5d8h595h5bhfh65fh577h64dh67fh65fh74h67h5bdhf7h64dh5dfh94h585h585h65ch57ch674h646h64ch5dh568h67hbdhc8h7dh567q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-67hwj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 00:49:19 crc kubenswrapper[4755]: E0317 00:49:19.255863 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 17 00:49:19 crc kubenswrapper[4755]: E0317 00:49:19.256118 4755 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 17 00:49:19 crc kubenswrapper[4755]: E0317 00:49:19.256248 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6wzpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-c6hnq_openstack(6021f21e-d701-4372-aac5-e70591f71906): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 
17 00:49:19 crc kubenswrapper[4755]: E0317 00:49:19.258523 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-c6hnq" podUID="6021f21e-d701-4372-aac5-e70591f71906" Mar 17 00:49:19 crc kubenswrapper[4755]: I0317 00:49:19.721928 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-mrx65"] Mar 17 00:49:20 crc kubenswrapper[4755]: I0317 00:49:20.202001 4755 generic.go:334] "Generic (PLEG): container finished" podID="1fb7bb3e-0c4a-4a6d-9557-e11043b3646c" containerID="c39899e7e03a40f3505ca863c72441e8b9663a7cecc4c0e64e5986de0ef183d3" exitCode=0 Mar 17 00:49:20 crc kubenswrapper[4755]: I0317 00:49:20.202080 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-mrx65" event={"ID":"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c","Type":"ContainerDied","Data":"c39899e7e03a40f3505ca863c72441e8b9663a7cecc4c0e64e5986de0ef183d3"} Mar 17 00:49:20 crc kubenswrapper[4755]: I0317 00:49:20.202292 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-mrx65" event={"ID":"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c","Type":"ContainerStarted","Data":"664b507c62b87df0f994497c1541d4f00c77af587da0f4ddd628d1965612ef44"} Mar 17 00:49:20 crc kubenswrapper[4755]: I0317 00:49:20.204618 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e","Type":"ContainerStarted","Data":"a0609fe4dc099aa0969597a5f395df67ab85042e2479a4b5f9cb65e614573d13"} Mar 17 00:49:20 crc kubenswrapper[4755]: E0317 00:49:20.206870 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-c6hnq" podUID="6021f21e-d701-4372-aac5-e70591f71906" Mar 17 00:49:21 crc kubenswrapper[4755]: I0317 00:49:21.237696 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-mrx65" event={"ID":"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c","Type":"ContainerStarted","Data":"e1367e756c049194b002765c4a2f6edd7377d854d527318fc7aff057422f6207"} Mar 17 00:49:21 crc kubenswrapper[4755]: I0317 00:49:21.238045 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:21 crc kubenswrapper[4755]: I0317 00:49:21.240862 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e","Type":"ContainerStarted","Data":"70b650ccdb95eef8fd44e4bfec6fbf5cfa3b969796c45e8ded6e9ee8fa0309c3"} Mar 17 00:49:21 crc kubenswrapper[4755]: I0317 00:49:21.272179 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68df85789f-mrx65" podStartSLOduration=13.272157391 podStartE2EDuration="13.272157391s" podCreationTimestamp="2026-03-17 00:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:49:21.255468535 +0000 UTC m=+1636.014920868" watchObservedRunningTime="2026-03-17 00:49:21.272157391 +0000 UTC m=+1636.031609674" Mar 17 00:49:23 crc kubenswrapper[4755]: E0317 00:49:23.770365 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" Mar 17 00:49:24 crc kubenswrapper[4755]: I0317 00:49:24.281737 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e","Type":"ContainerStarted","Data":"224e14dbc7eec156efe79008bd2fa069313ff6fe8b20055b99bf0fdaad290fea"} Mar 17 00:49:24 crc kubenswrapper[4755]: I0317 00:49:24.283073 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 00:49:24 crc kubenswrapper[4755]: E0317 00:49:24.284318 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" Mar 17 00:49:25 crc kubenswrapper[4755]: E0317 00:49:25.296588 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" Mar 17 00:49:28 crc kubenswrapper[4755]: I0317 00:49:28.664947 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:49:28 crc kubenswrapper[4755]: I0317 00:49:28.665629 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.167701 4755 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.241248 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-2vqzm"] Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.241556 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" podUID="66de11c6-3dcd-45ac-adee-be59ac746a73" containerName="dnsmasq-dns" containerID="cri-o://da849ac3d31a5d1f5811336fcc0095cad0747145a02d41554c60191ad5b972ad" gracePeriod=10 Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.466103 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-768b698657-5qzjn"] Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.468028 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.494340 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-768b698657-5qzjn"] Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.569734 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl7xb\" (UniqueName: \"kubernetes.io/projected/7cda8da0-db77-49cd-b85f-06335137c116-kube-api-access-pl7xb\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.569874 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-openstack-edpm-ipam\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 
00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.569896 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-dns-svc\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.569928 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-dns-swift-storage-0\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.570193 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-ovsdbserver-sb\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.570260 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-config\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.570338 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-ovsdbserver-nb\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " 
pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.672211 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl7xb\" (UniqueName: \"kubernetes.io/projected/7cda8da0-db77-49cd-b85f-06335137c116-kube-api-access-pl7xb\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.672322 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-openstack-edpm-ipam\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.672362 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-dns-svc\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.672430 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-dns-swift-storage-0\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.672808 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-ovsdbserver-sb\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " 
pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.673469 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-dns-swift-storage-0\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.674267 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-ovsdbserver-sb\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.674842 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-config\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.674937 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-ovsdbserver-nb\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.675164 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-openstack-edpm-ipam\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc 
kubenswrapper[4755]: I0317 00:49:29.676032 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-config\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.676557 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-ovsdbserver-nb\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.677608 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-dns-svc\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.699405 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl7xb\" (UniqueName: \"kubernetes.io/projected/7cda8da0-db77-49cd-b85f-06335137c116-kube-api-access-pl7xb\") pod \"dnsmasq-dns-768b698657-5qzjn\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") " pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.785256 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.909465 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.981027 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-ovsdbserver-nb\") pod \"66de11c6-3dcd-45ac-adee-be59ac746a73\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.981105 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-ovsdbserver-sb\") pod \"66de11c6-3dcd-45ac-adee-be59ac746a73\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.981680 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-dns-svc\") pod \"66de11c6-3dcd-45ac-adee-be59ac746a73\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.981718 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-config\") pod \"66de11c6-3dcd-45ac-adee-be59ac746a73\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.981899 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwkdq\" (UniqueName: \"kubernetes.io/projected/66de11c6-3dcd-45ac-adee-be59ac746a73-kube-api-access-rwkdq\") pod \"66de11c6-3dcd-45ac-adee-be59ac746a73\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.981992 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-dns-swift-storage-0\") pod \"66de11c6-3dcd-45ac-adee-be59ac746a73\" (UID: \"66de11c6-3dcd-45ac-adee-be59ac746a73\") " Mar 17 00:49:29 crc kubenswrapper[4755]: I0317 00:49:29.992646 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66de11c6-3dcd-45ac-adee-be59ac746a73-kube-api-access-rwkdq" (OuterVolumeSpecName: "kube-api-access-rwkdq") pod "66de11c6-3dcd-45ac-adee-be59ac746a73" (UID: "66de11c6-3dcd-45ac-adee-be59ac746a73"). InnerVolumeSpecName "kube-api-access-rwkdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.044928 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "66de11c6-3dcd-45ac-adee-be59ac746a73" (UID: "66de11c6-3dcd-45ac-adee-be59ac746a73"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.047351 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "66de11c6-3dcd-45ac-adee-be59ac746a73" (UID: "66de11c6-3dcd-45ac-adee-be59ac746a73"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.062012 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-config" (OuterVolumeSpecName: "config") pod "66de11c6-3dcd-45ac-adee-be59ac746a73" (UID: "66de11c6-3dcd-45ac-adee-be59ac746a73"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.074768 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "66de11c6-3dcd-45ac-adee-be59ac746a73" (UID: "66de11c6-3dcd-45ac-adee-be59ac746a73"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.081011 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "66de11c6-3dcd-45ac-adee-be59ac746a73" (UID: "66de11c6-3dcd-45ac-adee-be59ac746a73"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.085583 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.085617 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.085630 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.085641 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:30 crc 
kubenswrapper[4755]: I0317 00:49:30.085655 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwkdq\" (UniqueName: \"kubernetes.io/projected/66de11c6-3dcd-45ac-adee-be59ac746a73-kube-api-access-rwkdq\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.085667 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66de11c6-3dcd-45ac-adee-be59ac746a73-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:30 crc kubenswrapper[4755]: W0317 00:49:30.353853 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cda8da0_db77_49cd_b85f_06335137c116.slice/crio-360ed184e931f4d6d67464c2d9b551e2f1fd1630946317d4406ed75f5df77b74 WatchSource:0}: Error finding container 360ed184e931f4d6d67464c2d9b551e2f1fd1630946317d4406ed75f5df77b74: Status 404 returned error can't find the container with id 360ed184e931f4d6d67464c2d9b551e2f1fd1630946317d4406ed75f5df77b74 Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.353846 4755 generic.go:334] "Generic (PLEG): container finished" podID="66de11c6-3dcd-45ac-adee-be59ac746a73" containerID="da849ac3d31a5d1f5811336fcc0095cad0747145a02d41554c60191ad5b972ad" exitCode=0 Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.353877 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" event={"ID":"66de11c6-3dcd-45ac-adee-be59ac746a73","Type":"ContainerDied","Data":"da849ac3d31a5d1f5811336fcc0095cad0747145a02d41554c60191ad5b972ad"} Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.354123 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" event={"ID":"66de11c6-3dcd-45ac-adee-be59ac746a73","Type":"ContainerDied","Data":"76d6b6ee8e802786561c000bc57323ae7ee1fd952b9055fac0a70c928e0f92d2"} Mar 17 00:49:30 crc 
kubenswrapper[4755]: I0317 00:49:30.354150 4755 scope.go:117] "RemoveContainer" containerID="da849ac3d31a5d1f5811336fcc0095cad0747145a02d41554c60191ad5b972ad" Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.353888 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-2vqzm" Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.362424 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-768b698657-5qzjn"] Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.541831 4755 scope.go:117] "RemoveContainer" containerID="36d9237d5d2dbc4490b926a9f71030e61d6fe0fcc197084ad1dfb4952763de35" Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.588737 4755 scope.go:117] "RemoveContainer" containerID="da849ac3d31a5d1f5811336fcc0095cad0747145a02d41554c60191ad5b972ad" Mar 17 00:49:30 crc kubenswrapper[4755]: E0317 00:49:30.589603 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da849ac3d31a5d1f5811336fcc0095cad0747145a02d41554c60191ad5b972ad\": container with ID starting with da849ac3d31a5d1f5811336fcc0095cad0747145a02d41554c60191ad5b972ad not found: ID does not exist" containerID="da849ac3d31a5d1f5811336fcc0095cad0747145a02d41554c60191ad5b972ad" Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.589669 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da849ac3d31a5d1f5811336fcc0095cad0747145a02d41554c60191ad5b972ad"} err="failed to get container status \"da849ac3d31a5d1f5811336fcc0095cad0747145a02d41554c60191ad5b972ad\": rpc error: code = NotFound desc = could not find container \"da849ac3d31a5d1f5811336fcc0095cad0747145a02d41554c60191ad5b972ad\": container with ID starting with da849ac3d31a5d1f5811336fcc0095cad0747145a02d41554c60191ad5b972ad not found: ID does not exist" Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.589698 4755 
scope.go:117] "RemoveContainer" containerID="36d9237d5d2dbc4490b926a9f71030e61d6fe0fcc197084ad1dfb4952763de35" Mar 17 00:49:30 crc kubenswrapper[4755]: E0317 00:49:30.589937 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36d9237d5d2dbc4490b926a9f71030e61d6fe0fcc197084ad1dfb4952763de35\": container with ID starting with 36d9237d5d2dbc4490b926a9f71030e61d6fe0fcc197084ad1dfb4952763de35 not found: ID does not exist" containerID="36d9237d5d2dbc4490b926a9f71030e61d6fe0fcc197084ad1dfb4952763de35" Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.589959 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d9237d5d2dbc4490b926a9f71030e61d6fe0fcc197084ad1dfb4952763de35"} err="failed to get container status \"36d9237d5d2dbc4490b926a9f71030e61d6fe0fcc197084ad1dfb4952763de35\": rpc error: code = NotFound desc = could not find container \"36d9237d5d2dbc4490b926a9f71030e61d6fe0fcc197084ad1dfb4952763de35\": container with ID starting with 36d9237d5d2dbc4490b926a9f71030e61d6fe0fcc197084ad1dfb4952763de35 not found: ID does not exist" Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.657168 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-2vqzm"] Mar 17 00:49:30 crc kubenswrapper[4755]: I0317 00:49:30.680972 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-2vqzm"] Mar 17 00:49:31 crc kubenswrapper[4755]: I0317 00:49:31.370032 4755 generic.go:334] "Generic (PLEG): container finished" podID="7cda8da0-db77-49cd-b85f-06335137c116" containerID="409f2265631c6dcf4bc9166a1cbe58e6f5c0ea6017166c23defbb991feecdd6f" exitCode=0 Mar 17 00:49:31 crc kubenswrapper[4755]: I0317 00:49:31.370431 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768b698657-5qzjn" 
event={"ID":"7cda8da0-db77-49cd-b85f-06335137c116","Type":"ContainerDied","Data":"409f2265631c6dcf4bc9166a1cbe58e6f5c0ea6017166c23defbb991feecdd6f"} Mar 17 00:49:31 crc kubenswrapper[4755]: I0317 00:49:31.370483 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768b698657-5qzjn" event={"ID":"7cda8da0-db77-49cd-b85f-06335137c116","Type":"ContainerStarted","Data":"360ed184e931f4d6d67464c2d9b551e2f1fd1630946317d4406ed75f5df77b74"} Mar 17 00:49:32 crc kubenswrapper[4755]: I0317 00:49:32.261668 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66de11c6-3dcd-45ac-adee-be59ac746a73" path="/var/lib/kubelet/pods/66de11c6-3dcd-45ac-adee-be59ac746a73/volumes" Mar 17 00:49:32 crc kubenswrapper[4755]: I0317 00:49:32.388158 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768b698657-5qzjn" event={"ID":"7cda8da0-db77-49cd-b85f-06335137c116","Type":"ContainerStarted","Data":"b7e8f96ed0a426482364cb2ffb55613f5d9662c6ba6e9a1b512dbf7566ce288c"} Mar 17 00:49:32 crc kubenswrapper[4755]: I0317 00:49:32.388323 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:32 crc kubenswrapper[4755]: I0317 00:49:32.438566 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-768b698657-5qzjn" podStartSLOduration=3.438540901 podStartE2EDuration="3.438540901s" podCreationTimestamp="2026-03-17 00:49:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:49:32.420453576 +0000 UTC m=+1647.179905879" watchObservedRunningTime="2026-03-17 00:49:32.438540901 +0000 UTC m=+1647.197993214" Mar 17 00:49:34 crc kubenswrapper[4755]: I0317 00:49:34.419337 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-c6hnq" 
event={"ID":"6021f21e-d701-4372-aac5-e70591f71906","Type":"ContainerStarted","Data":"63cff2652495ebfc7a80c379d9d370b7cfc8d9468d114d18fe4c328348131366"} Mar 17 00:49:34 crc kubenswrapper[4755]: I0317 00:49:34.448273 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-c6hnq" podStartSLOduration=2.910185901 podStartE2EDuration="47.448255822s" podCreationTimestamp="2026-03-17 00:48:47 +0000 UTC" firstStartedPulling="2026-03-17 00:48:48.938529633 +0000 UTC m=+1603.697981916" lastFinishedPulling="2026-03-17 00:49:33.476599524 +0000 UTC m=+1648.236051837" observedRunningTime="2026-03-17 00:49:34.444785518 +0000 UTC m=+1649.204237831" watchObservedRunningTime="2026-03-17 00:49:34.448255822 +0000 UTC m=+1649.207708105" Mar 17 00:49:36 crc kubenswrapper[4755]: I0317 00:49:36.478795 4755 generic.go:334] "Generic (PLEG): container finished" podID="6021f21e-d701-4372-aac5-e70591f71906" containerID="63cff2652495ebfc7a80c379d9d370b7cfc8d9468d114d18fe4c328348131366" exitCode=0 Mar 17 00:49:36 crc kubenswrapper[4755]: I0317 00:49:36.478901 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-c6hnq" event={"ID":"6021f21e-d701-4372-aac5-e70591f71906","Type":"ContainerDied","Data":"63cff2652495ebfc7a80c379d9d370b7cfc8d9468d114d18fe4c328348131366"} Mar 17 00:49:37 crc kubenswrapper[4755]: I0317 00:49:37.498421 4755 generic.go:334] "Generic (PLEG): container finished" podID="232eeb12-0802-4513-83e2-66cc0b1b398b" containerID="8b4e37bc03fe1bcbaf0acc9da57c7616fed1afb063a923bd454f1cd61a5df43b" exitCode=0 Mar 17 00:49:37 crc kubenswrapper[4755]: I0317 00:49:37.498521 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"232eeb12-0802-4513-83e2-66cc0b1b398b","Type":"ContainerDied","Data":"8b4e37bc03fe1bcbaf0acc9da57c7616fed1afb063a923bd454f1cd61a5df43b"} Mar 17 00:49:37 crc kubenswrapper[4755]: I0317 00:49:37.504820 4755 generic.go:334] "Generic (PLEG): container 
finished" podID="c98112b6-4653-4c2e-a16e-6ddbd29fe526" containerID="b970ad4d16b58d0a44723d2612ef1311dcbfebffa299a9db06134c54640a005b" exitCode=0 Mar 17 00:49:37 crc kubenswrapper[4755]: I0317 00:49:37.504936 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c98112b6-4653-4c2e-a16e-6ddbd29fe526","Type":"ContainerDied","Data":"b970ad4d16b58d0a44723d2612ef1311dcbfebffa299a9db06134c54640a005b"} Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.009302 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-c6hnq" Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.166215 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6021f21e-d701-4372-aac5-e70591f71906-config-data\") pod \"6021f21e-d701-4372-aac5-e70591f71906\" (UID: \"6021f21e-d701-4372-aac5-e70591f71906\") " Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.166758 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wzpk\" (UniqueName: \"kubernetes.io/projected/6021f21e-d701-4372-aac5-e70591f71906-kube-api-access-6wzpk\") pod \"6021f21e-d701-4372-aac5-e70591f71906\" (UID: \"6021f21e-d701-4372-aac5-e70591f71906\") " Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.166835 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6021f21e-d701-4372-aac5-e70591f71906-combined-ca-bundle\") pod \"6021f21e-d701-4372-aac5-e70591f71906\" (UID: \"6021f21e-d701-4372-aac5-e70591f71906\") " Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.172935 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6021f21e-d701-4372-aac5-e70591f71906-kube-api-access-6wzpk" (OuterVolumeSpecName: "kube-api-access-6wzpk") pod 
"6021f21e-d701-4372-aac5-e70591f71906" (UID: "6021f21e-d701-4372-aac5-e70591f71906"). InnerVolumeSpecName "kube-api-access-6wzpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.223893 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6021f21e-d701-4372-aac5-e70591f71906-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6021f21e-d701-4372-aac5-e70591f71906" (UID: "6021f21e-d701-4372-aac5-e70591f71906"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.269673 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wzpk\" (UniqueName: \"kubernetes.io/projected/6021f21e-d701-4372-aac5-e70591f71906-kube-api-access-6wzpk\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.269699 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6021f21e-d701-4372-aac5-e70591f71906-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.276377 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6021f21e-d701-4372-aac5-e70591f71906-config-data" (OuterVolumeSpecName: "config-data") pod "6021f21e-d701-4372-aac5-e70591f71906" (UID: "6021f21e-d701-4372-aac5-e70591f71906"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.371264 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6021f21e-d701-4372-aac5-e70591f71906-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.542113 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-c6hnq" Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.542109 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-c6hnq" event={"ID":"6021f21e-d701-4372-aac5-e70591f71906","Type":"ContainerDied","Data":"5404e794f8b289eb77b82d9f368d56f0993d1de3841e7a790eec5d569010a775"} Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.542762 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5404e794f8b289eb77b82d9f368d56f0993d1de3841e7a790eec5d569010a775" Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.546086 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"232eeb12-0802-4513-83e2-66cc0b1b398b","Type":"ContainerStarted","Data":"12d13c50cc47d4682a74fa2dba0aa6981deeacfb74a35ddd2d7ecab853025e63"} Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.546273 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.552174 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c98112b6-4653-4c2e-a16e-6ddbd29fe526","Type":"ContainerStarted","Data":"112571cbfa753cd8f93f0c17c5525dd6ee741022f5106b1942b0982d351fc3a4"} Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.552451 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:38 crc kubenswrapper[4755]: I0317 00:49:38.579085 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.579065169 podStartE2EDuration="37.579065169s" podCreationTimestamp="2026-03-17 00:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:49:38.574007811 +0000 UTC 
m=+1653.333460094" watchObservedRunningTime="2026-03-17 00:49:38.579065169 +0000 UTC m=+1653.338517462" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.256453 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.282545 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.282526967 podStartE2EDuration="37.282526967s" podCreationTimestamp="2026-03-17 00:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:49:38.612939895 +0000 UTC m=+1653.372392188" watchObservedRunningTime="2026-03-17 00:49:39.282526967 +0000 UTC m=+1654.041979250" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.546170 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6dccf8ffb7-fvtwz"] Mar 17 00:49:39 crc kubenswrapper[4755]: E0317 00:49:39.546584 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66de11c6-3dcd-45ac-adee-be59ac746a73" containerName="dnsmasq-dns" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.546596 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="66de11c6-3dcd-45ac-adee-be59ac746a73" containerName="dnsmasq-dns" Mar 17 00:49:39 crc kubenswrapper[4755]: E0317 00:49:39.546610 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6021f21e-d701-4372-aac5-e70591f71906" containerName="heat-db-sync" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.546616 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6021f21e-d701-4372-aac5-e70591f71906" containerName="heat-db-sync" Mar 17 00:49:39 crc kubenswrapper[4755]: E0317 00:49:39.546629 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66de11c6-3dcd-45ac-adee-be59ac746a73" containerName="init" Mar 17 00:49:39 crc 
kubenswrapper[4755]: I0317 00:49:39.546635 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="66de11c6-3dcd-45ac-adee-be59ac746a73" containerName="init" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.546829 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="66de11c6-3dcd-45ac-adee-be59ac746a73" containerName="dnsmasq-dns" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.546850 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6021f21e-d701-4372-aac5-e70591f71906" containerName="heat-db-sync" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.547536 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6dccf8ffb7-fvtwz" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.570109 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6dccf8ffb7-fvtwz"] Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.592995 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/70e53650-f3d6-4ec4-9b49-bf34ec724c01-config-data-custom\") pod \"heat-engine-6dccf8ffb7-fvtwz\" (UID: \"70e53650-f3d6-4ec4-9b49-bf34ec724c01\") " pod="openstack/heat-engine-6dccf8ffb7-fvtwz" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.593075 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e53650-f3d6-4ec4-9b49-bf34ec724c01-combined-ca-bundle\") pod \"heat-engine-6dccf8ffb7-fvtwz\" (UID: \"70e53650-f3d6-4ec4-9b49-bf34ec724c01\") " pod="openstack/heat-engine-6dccf8ffb7-fvtwz" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.593105 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqfsd\" (UniqueName: 
\"kubernetes.io/projected/70e53650-f3d6-4ec4-9b49-bf34ec724c01-kube-api-access-dqfsd\") pod \"heat-engine-6dccf8ffb7-fvtwz\" (UID: \"70e53650-f3d6-4ec4-9b49-bf34ec724c01\") " pod="openstack/heat-engine-6dccf8ffb7-fvtwz" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.593124 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e53650-f3d6-4ec4-9b49-bf34ec724c01-config-data\") pod \"heat-engine-6dccf8ffb7-fvtwz\" (UID: \"70e53650-f3d6-4ec4-9b49-bf34ec724c01\") " pod="openstack/heat-engine-6dccf8ffb7-fvtwz" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.619063 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5b57d6bfb7-dfq4n"] Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.620984 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.645674 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5b57d6bfb7-dfq4n"] Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.694673 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-config-data-custom\") pod \"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.694728 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-combined-ca-bundle\") pod \"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.694774 
4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/70e53650-f3d6-4ec4-9b49-bf34ec724c01-config-data-custom\") pod \"heat-engine-6dccf8ffb7-fvtwz\" (UID: \"70e53650-f3d6-4ec4-9b49-bf34ec724c01\") " pod="openstack/heat-engine-6dccf8ffb7-fvtwz" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.694841 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-public-tls-certs\") pod \"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.694880 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e53650-f3d6-4ec4-9b49-bf34ec724c01-combined-ca-bundle\") pod \"heat-engine-6dccf8ffb7-fvtwz\" (UID: \"70e53650-f3d6-4ec4-9b49-bf34ec724c01\") " pod="openstack/heat-engine-6dccf8ffb7-fvtwz" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.694898 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-internal-tls-certs\") pod \"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.694920 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-config-data\") pod \"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.694947 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqfsd\" (UniqueName: \"kubernetes.io/projected/70e53650-f3d6-4ec4-9b49-bf34ec724c01-kube-api-access-dqfsd\") pod \"heat-engine-6dccf8ffb7-fvtwz\" (UID: \"70e53650-f3d6-4ec4-9b49-bf34ec724c01\") " pod="openstack/heat-engine-6dccf8ffb7-fvtwz" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.694975 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e53650-f3d6-4ec4-9b49-bf34ec724c01-config-data\") pod \"heat-engine-6dccf8ffb7-fvtwz\" (UID: \"70e53650-f3d6-4ec4-9b49-bf34ec724c01\") " pod="openstack/heat-engine-6dccf8ffb7-fvtwz" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.695028 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwjvr\" (UniqueName: \"kubernetes.io/projected/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-kube-api-access-gwjvr\") pod \"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.702986 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e53650-f3d6-4ec4-9b49-bf34ec724c01-config-data\") pod \"heat-engine-6dccf8ffb7-fvtwz\" (UID: \"70e53650-f3d6-4ec4-9b49-bf34ec724c01\") " pod="openstack/heat-engine-6dccf8ffb7-fvtwz" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.708246 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e53650-f3d6-4ec4-9b49-bf34ec724c01-combined-ca-bundle\") pod \"heat-engine-6dccf8ffb7-fvtwz\" (UID: \"70e53650-f3d6-4ec4-9b49-bf34ec724c01\") " pod="openstack/heat-engine-6dccf8ffb7-fvtwz" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.715131 4755 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/heat-cfnapi-74f557fb5-t8sp4"] Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.717928 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.722280 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/70e53650-f3d6-4ec4-9b49-bf34ec724c01-config-data-custom\") pod \"heat-engine-6dccf8ffb7-fvtwz\" (UID: \"70e53650-f3d6-4ec4-9b49-bf34ec724c01\") " pod="openstack/heat-engine-6dccf8ffb7-fvtwz" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.727990 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-74f557fb5-t8sp4"] Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.731500 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqfsd\" (UniqueName: \"kubernetes.io/projected/70e53650-f3d6-4ec4-9b49-bf34ec724c01-kube-api-access-dqfsd\") pod \"heat-engine-6dccf8ffb7-fvtwz\" (UID: \"70e53650-f3d6-4ec4-9b49-bf34ec724c01\") " pod="openstack/heat-engine-6dccf8ffb7-fvtwz" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.792551 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.798660 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59c5d35c-0a70-4965-b0b7-704028793d5e-config-data\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.798718 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-config-data-custom\") pod \"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.798748 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-combined-ca-bundle\") pod \"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.798767 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c5d35c-0a70-4965-b0b7-704028793d5e-combined-ca-bundle\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.798800 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-229xm\" (UniqueName: \"kubernetes.io/projected/59c5d35c-0a70-4965-b0b7-704028793d5e-kube-api-access-229xm\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.798831 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-public-tls-certs\") pod \"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.798856 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/59c5d35c-0a70-4965-b0b7-704028793d5e-internal-tls-certs\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.798888 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-internal-tls-certs\") pod \"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.798909 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-config-data\") pod \"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.798958 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwjvr\" (UniqueName: \"kubernetes.io/projected/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-kube-api-access-gwjvr\") pod \"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.798997 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59c5d35c-0a70-4965-b0b7-704028793d5e-public-tls-certs\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.799023 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/59c5d35c-0a70-4965-b0b7-704028793d5e-config-data-custom\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.809272 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-combined-ca-bundle\") pod \"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.815326 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-internal-tls-certs\") pod \"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.821392 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-config-data\") pod \"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.830770 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-config-data-custom\") pod \"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.849068 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-public-tls-certs\") pod 
\"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.861149 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwjvr\" (UniqueName: \"kubernetes.io/projected/a03991a5-be95-4757-a3d0-4ce2fff4fdf5-kube-api-access-gwjvr\") pod \"heat-api-5b57d6bfb7-dfq4n\" (UID: \"a03991a5-be95-4757-a3d0-4ce2fff4fdf5\") " pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.894347 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-mrx65"] Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.894594 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68df85789f-mrx65" podUID="1fb7bb3e-0c4a-4a6d-9557-e11043b3646c" containerName="dnsmasq-dns" containerID="cri-o://e1367e756c049194b002765c4a2f6edd7377d854d527318fc7aff057422f6207" gracePeriod=10 Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.901291 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c5d35c-0a70-4965-b0b7-704028793d5e-combined-ca-bundle\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.901363 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-229xm\" (UniqueName: \"kubernetes.io/projected/59c5d35c-0a70-4965-b0b7-704028793d5e-kube-api-access-229xm\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.901465 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/59c5d35c-0a70-4965-b0b7-704028793d5e-internal-tls-certs\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.901623 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59c5d35c-0a70-4965-b0b7-704028793d5e-public-tls-certs\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.901662 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59c5d35c-0a70-4965-b0b7-704028793d5e-config-data-custom\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.901720 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59c5d35c-0a70-4965-b0b7-704028793d5e-config-data\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.910455 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59c5d35c-0a70-4965-b0b7-704028793d5e-public-tls-certs\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.917127 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/59c5d35c-0a70-4965-b0b7-704028793d5e-config-data-custom\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.919130 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59c5d35c-0a70-4965-b0b7-704028793d5e-internal-tls-certs\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.941022 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6dccf8ffb7-fvtwz" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.944273 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c5d35c-0a70-4965-b0b7-704028793d5e-combined-ca-bundle\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.961178 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-229xm\" (UniqueName: \"kubernetes.io/projected/59c5d35c-0a70-4965-b0b7-704028793d5e-kube-api-access-229xm\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.972482 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:39 crc kubenswrapper[4755]: I0317 00:49:39.978817 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59c5d35c-0a70-4965-b0b7-704028793d5e-config-data\") pod \"heat-cfnapi-74f557fb5-t8sp4\" (UID: \"59c5d35c-0a70-4965-b0b7-704028793d5e\") " pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.240514 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.461641 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.520966 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-ovsdbserver-nb\") pod \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.521019 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-config\") pod \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.521047 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-openstack-edpm-ipam\") pod \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.521121 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-dns-swift-storage-0\") pod \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.521323 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-dns-svc\") pod \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.521387 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64rx8\" (UniqueName: \"kubernetes.io/projected/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-kube-api-access-64rx8\") pod \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.521424 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-ovsdbserver-sb\") pod \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\" (UID: \"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c\") " Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.528337 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-kube-api-access-64rx8" (OuterVolumeSpecName: "kube-api-access-64rx8") pod "1fb7bb3e-0c4a-4a6d-9557-e11043b3646c" (UID: "1fb7bb3e-0c4a-4a6d-9557-e11043b3646c"). InnerVolumeSpecName "kube-api-access-64rx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.610391 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1fb7bb3e-0c4a-4a6d-9557-e11043b3646c" (UID: "1fb7bb3e-0c4a-4a6d-9557-e11043b3646c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.619854 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-config" (OuterVolumeSpecName: "config") pod "1fb7bb3e-0c4a-4a6d-9557-e11043b3646c" (UID: "1fb7bb3e-0c4a-4a6d-9557-e11043b3646c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.628424 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64rx8\" (UniqueName: \"kubernetes.io/projected/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-kube-api-access-64rx8\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.628509 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-config\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.628520 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.643522 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e","Type":"ContainerStarted","Data":"d3b98116772836c60e1e08a9360ec8f9b82ffe2202f53e426b293edf4ae83dc9"} Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.692175 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1fb7bb3e-0c4a-4a6d-9557-e11043b3646c" (UID: "1fb7bb3e-0c4a-4a6d-9557-e11043b3646c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.699076 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1fb7bb3e-0c4a-4a6d-9557-e11043b3646c" (UID: "1fb7bb3e-0c4a-4a6d-9557-e11043b3646c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.702564 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "1fb7bb3e-0c4a-4a6d-9557-e11043b3646c" (UID: "1fb7bb3e-0c4a-4a6d-9557-e11043b3646c"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.705295 4755 generic.go:334] "Generic (PLEG): container finished" podID="1fb7bb3e-0c4a-4a6d-9557-e11043b3646c" containerID="e1367e756c049194b002765c4a2f6edd7377d854d527318fc7aff057422f6207" exitCode=0 Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.705343 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-mrx65" event={"ID":"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c","Type":"ContainerDied","Data":"e1367e756c049194b002765c4a2f6edd7377d854d527318fc7aff057422f6207"} Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.705370 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-mrx65" event={"ID":"1fb7bb3e-0c4a-4a6d-9557-e11043b3646c","Type":"ContainerDied","Data":"664b507c62b87df0f994497c1541d4f00c77af587da0f4ddd628d1965612ef44"} Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.705386 4755 scope.go:117] "RemoveContainer" containerID="e1367e756c049194b002765c4a2f6edd7377d854d527318fc7aff057422f6207" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.705603 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-mrx65" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.716802 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1fb7bb3e-0c4a-4a6d-9557-e11043b3646c" (UID: "1fb7bb3e-0c4a-4a6d-9557-e11043b3646c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.736609 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.736643 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.736655 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.736667 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:40 crc kubenswrapper[4755]: W0317 00:49:40.752471 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59c5d35c_0a70_4965_b0b7_704028793d5e.slice/crio-baeaf8fc9f0a29e4c51752ebc56d6eb604a531b146dd61a37b03f0d14d86f161 WatchSource:0}: Error finding container baeaf8fc9f0a29e4c51752ebc56d6eb604a531b146dd61a37b03f0d14d86f161: Status 404 returned error can't find the container with id baeaf8fc9f0a29e4c51752ebc56d6eb604a531b146dd61a37b03f0d14d86f161 Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.762149 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6dccf8ffb7-fvtwz"] Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.781151 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-74f557fb5-t8sp4"] Mar 17 
00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.781344 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.148723592 podStartE2EDuration="45.781326723s" podCreationTimestamp="2026-03-17 00:48:55 +0000 UTC" firstStartedPulling="2026-03-17 00:48:56.810231667 +0000 UTC m=+1611.569683950" lastFinishedPulling="2026-03-17 00:49:39.442834798 +0000 UTC m=+1654.202287081" observedRunningTime="2026-03-17 00:49:40.733134766 +0000 UTC m=+1655.492587049" watchObservedRunningTime="2026-03-17 00:49:40.781326723 +0000 UTC m=+1655.540778996" Mar 17 00:49:40 crc kubenswrapper[4755]: W0317 00:49:40.795356 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda03991a5_be95_4757_a3d0_4ce2fff4fdf5.slice/crio-6570cdfb03e65d217dce305341d29cc53495ff7fc71d82614aaae2aee1a3f220 WatchSource:0}: Error finding container 6570cdfb03e65d217dce305341d29cc53495ff7fc71d82614aaae2aee1a3f220: Status 404 returned error can't find the container with id 6570cdfb03e65d217dce305341d29cc53495ff7fc71d82614aaae2aee1a3f220 Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.807092 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5b57d6bfb7-dfq4n"] Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.883451 4755 scope.go:117] "RemoveContainer" containerID="c39899e7e03a40f3505ca863c72441e8b9663a7cecc4c0e64e5986de0ef183d3" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.918955 4755 scope.go:117] "RemoveContainer" containerID="e1367e756c049194b002765c4a2f6edd7377d854d527318fc7aff057422f6207" Mar 17 00:49:40 crc kubenswrapper[4755]: E0317 00:49:40.919800 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1367e756c049194b002765c4a2f6edd7377d854d527318fc7aff057422f6207\": container with ID starting with 
e1367e756c049194b002765c4a2f6edd7377d854d527318fc7aff057422f6207 not found: ID does not exist" containerID="e1367e756c049194b002765c4a2f6edd7377d854d527318fc7aff057422f6207" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.919944 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1367e756c049194b002765c4a2f6edd7377d854d527318fc7aff057422f6207"} err="failed to get container status \"e1367e756c049194b002765c4a2f6edd7377d854d527318fc7aff057422f6207\": rpc error: code = NotFound desc = could not find container \"e1367e756c049194b002765c4a2f6edd7377d854d527318fc7aff057422f6207\": container with ID starting with e1367e756c049194b002765c4a2f6edd7377d854d527318fc7aff057422f6207 not found: ID does not exist" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.920017 4755 scope.go:117] "RemoveContainer" containerID="c39899e7e03a40f3505ca863c72441e8b9663a7cecc4c0e64e5986de0ef183d3" Mar 17 00:49:40 crc kubenswrapper[4755]: E0317 00:49:40.920571 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c39899e7e03a40f3505ca863c72441e8b9663a7cecc4c0e64e5986de0ef183d3\": container with ID starting with c39899e7e03a40f3505ca863c72441e8b9663a7cecc4c0e64e5986de0ef183d3 not found: ID does not exist" containerID="c39899e7e03a40f3505ca863c72441e8b9663a7cecc4c0e64e5986de0ef183d3" Mar 17 00:49:40 crc kubenswrapper[4755]: I0317 00:49:40.920628 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c39899e7e03a40f3505ca863c72441e8b9663a7cecc4c0e64e5986de0ef183d3"} err="failed to get container status \"c39899e7e03a40f3505ca863c72441e8b9663a7cecc4c0e64e5986de0ef183d3\": rpc error: code = NotFound desc = could not find container \"c39899e7e03a40f3505ca863c72441e8b9663a7cecc4c0e64e5986de0ef183d3\": container with ID starting with c39899e7e03a40f3505ca863c72441e8b9663a7cecc4c0e64e5986de0ef183d3 not found: ID does not 
exist" Mar 17 00:49:41 crc kubenswrapper[4755]: I0317 00:49:41.050126 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-mrx65"] Mar 17 00:49:41 crc kubenswrapper[4755]: I0317 00:49:41.070515 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-mrx65"] Mar 17 00:49:41 crc kubenswrapper[4755]: I0317 00:49:41.737881 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6dccf8ffb7-fvtwz" event={"ID":"70e53650-f3d6-4ec4-9b49-bf34ec724c01","Type":"ContainerStarted","Data":"93864865480b1267834826fcf3df0b051e37dc12ce68ce56e2a2fe0346f34675"} Mar 17 00:49:41 crc kubenswrapper[4755]: I0317 00:49:41.738224 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6dccf8ffb7-fvtwz" event={"ID":"70e53650-f3d6-4ec4-9b49-bf34ec724c01","Type":"ContainerStarted","Data":"25b45ed57a7282fbbe1201bb9964b9a02d3ccc808858a491b8eecd5ff799500d"} Mar 17 00:49:41 crc kubenswrapper[4755]: I0317 00:49:41.738471 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6dccf8ffb7-fvtwz" Mar 17 00:49:41 crc kubenswrapper[4755]: I0317 00:49:41.743764 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b57d6bfb7-dfq4n" event={"ID":"a03991a5-be95-4757-a3d0-4ce2fff4fdf5","Type":"ContainerStarted","Data":"6570cdfb03e65d217dce305341d29cc53495ff7fc71d82614aaae2aee1a3f220"} Mar 17 00:49:41 crc kubenswrapper[4755]: I0317 00:49:41.748052 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-74f557fb5-t8sp4" event={"ID":"59c5d35c-0a70-4965-b0b7-704028793d5e","Type":"ContainerStarted","Data":"baeaf8fc9f0a29e4c51752ebc56d6eb604a531b146dd61a37b03f0d14d86f161"} Mar 17 00:49:41 crc kubenswrapper[4755]: I0317 00:49:41.774702 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6dccf8ffb7-fvtwz" podStartSLOduration=2.774685535 
podStartE2EDuration="2.774685535s" podCreationTimestamp="2026-03-17 00:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 00:49:41.757575897 +0000 UTC m=+1656.517028200" watchObservedRunningTime="2026-03-17 00:49:41.774685535 +0000 UTC m=+1656.534137818" Mar 17 00:49:42 crc kubenswrapper[4755]: I0317 00:49:42.261366 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fb7bb3e-0c4a-4a6d-9557-e11043b3646c" path="/var/lib/kubelet/pods/1fb7bb3e-0c4a-4a6d-9557-e11043b3646c/volumes" Mar 17 00:49:43 crc kubenswrapper[4755]: I0317 00:49:43.766160 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b57d6bfb7-dfq4n" event={"ID":"a03991a5-be95-4757-a3d0-4ce2fff4fdf5","Type":"ContainerStarted","Data":"f404621edc5af9b30a056efe494bed8ef074d0016559e9708b48aba52dfd61a3"} Mar 17 00:49:43 crc kubenswrapper[4755]: I0317 00:49:43.766843 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:43 crc kubenswrapper[4755]: I0317 00:49:43.768880 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-74f557fb5-t8sp4" event={"ID":"59c5d35c-0a70-4965-b0b7-704028793d5e","Type":"ContainerStarted","Data":"86632de0bfcb49aea45f13ca7aa3d8d62faec3806fb471d3e954aa701f6b5710"} Mar 17 00:49:43 crc kubenswrapper[4755]: I0317 00:49:43.769012 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:43 crc kubenswrapper[4755]: I0317 00:49:43.817192 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5b57d6bfb7-dfq4n" podStartSLOduration=2.754373329 podStartE2EDuration="4.817174902s" podCreationTimestamp="2026-03-17 00:49:39 +0000 UTC" firstStartedPulling="2026-03-17 00:49:40.821973594 +0000 UTC m=+1655.581425877" lastFinishedPulling="2026-03-17 
00:49:42.884775167 +0000 UTC m=+1657.644227450" observedRunningTime="2026-03-17 00:49:43.792800666 +0000 UTC m=+1658.552252949" watchObservedRunningTime="2026-03-17 00:49:43.817174902 +0000 UTC m=+1658.576627185" Mar 17 00:49:43 crc kubenswrapper[4755]: I0317 00:49:43.822736 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-74f557fb5-t8sp4" podStartSLOduration=2.702192562 podStartE2EDuration="4.822718623s" podCreationTimestamp="2026-03-17 00:49:39 +0000 UTC" firstStartedPulling="2026-03-17 00:49:40.763066114 +0000 UTC m=+1655.522518397" lastFinishedPulling="2026-03-17 00:49:42.883592175 +0000 UTC m=+1657.643044458" observedRunningTime="2026-03-17 00:49:43.809940884 +0000 UTC m=+1658.569393167" watchObservedRunningTime="2026-03-17 00:49:43.822718623 +0000 UTC m=+1658.582170906" Mar 17 00:49:51 crc kubenswrapper[4755]: I0317 00:49:51.296685 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5b57d6bfb7-dfq4n" Mar 17 00:49:51 crc kubenswrapper[4755]: I0317 00:49:51.347426 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7f4c59689b-88k7q"] Mar 17 00:49:51 crc kubenswrapper[4755]: I0317 00:49:51.347650 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-7f4c59689b-88k7q" podUID="3b12002d-6940-4bb5-83d0-86bc6add52f8" containerName="heat-api" containerID="cri-o://4878c6fedfd583b26c183a96cf890bb54665c443e20ef74c5c09fb930f2981b0" gracePeriod=60 Mar 17 00:49:51 crc kubenswrapper[4755]: I0317 00:49:51.922067 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-74f557fb5-t8sp4" Mar 17 00:49:52 crc kubenswrapper[4755]: I0317 00:49:52.015282 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-958555d6d-xlr8p"] Mar 17 00:49:52 crc kubenswrapper[4755]: I0317 00:49:52.015543 4755 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/heat-cfnapi-958555d6d-xlr8p" podUID="cbba9375-8a42-4c43-9b62-0b2df2e89af1" containerName="heat-cfnapi" containerID="cri-o://e90ed2fc0f167a7ddfec4ecb03def223498e6aa9a1c3f7aa6dba3f6c4979e063" gracePeriod=60 Mar 17 00:49:52 crc kubenswrapper[4755]: I0317 00:49:52.634578 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 17 00:49:52 crc kubenswrapper[4755]: I0317 00:49:52.719613 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 17 00:49:54 crc kubenswrapper[4755]: I0317 00:49:54.549630 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-7f4c59689b-88k7q" podUID="3b12002d-6940-4bb5-83d0-86bc6add52f8" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.216:8004/healthcheck\": read tcp 10.217.0.2:38530->10.217.0.216:8004: read: connection reset by peer" Mar 17 00:49:54 crc kubenswrapper[4755]: I0317 00:49:54.856305 4755 scope.go:117] "RemoveContainer" containerID="19a612e42a3a1178e6eb712baff292bc8da4dbe9a60703eefa0cc84cc618cd8e" Mar 17 00:49:54 crc kubenswrapper[4755]: I0317 00:49:54.897670 4755 generic.go:334] "Generic (PLEG): container finished" podID="3b12002d-6940-4bb5-83d0-86bc6add52f8" containerID="4878c6fedfd583b26c183a96cf890bb54665c443e20ef74c5c09fb930f2981b0" exitCode=0 Mar 17 00:49:54 crc kubenswrapper[4755]: I0317 00:49:54.897717 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7f4c59689b-88k7q" event={"ID":"3b12002d-6940-4bb5-83d0-86bc6add52f8","Type":"ContainerDied","Data":"4878c6fedfd583b26c183a96cf890bb54665c443e20ef74c5c09fb930f2981b0"} Mar 17 00:49:54 crc kubenswrapper[4755]: I0317 00:49:54.899370 4755 scope.go:117] "RemoveContainer" containerID="45a52e23186cf42097c72b076428529bf89edc62cc39e66e3c5e059d86402650" Mar 17 00:49:54 crc kubenswrapper[4755]: I0317 00:49:54.974064 4755 scope.go:117] "RemoveContainer" 
containerID="c2ddae9046f6a980f7216acbb3c9c0eaac0b1d3202ca84e39004761fce6b9145" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.007053 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.017474 4755 scope.go:117] "RemoveContainer" containerID="7227e684059e39a3e3bcdaa7122d87098759c557a3e1d664ec9569d05ed2eb3a" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.066711 4755 scope.go:117] "RemoveContainer" containerID="bba9337a0a86997f5b8df36258f89b93c8084d68107a2458fb0b8a3c78015f50" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.144094 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-config-data-custom\") pod \"3b12002d-6940-4bb5-83d0-86bc6add52f8\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.144206 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-internal-tls-certs\") pod \"3b12002d-6940-4bb5-83d0-86bc6add52f8\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.144385 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-combined-ca-bundle\") pod \"3b12002d-6940-4bb5-83d0-86bc6add52f8\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.144425 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-config-data\") pod \"3b12002d-6940-4bb5-83d0-86bc6add52f8\" 
(UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.144455 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-public-tls-certs\") pod \"3b12002d-6940-4bb5-83d0-86bc6add52f8\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.144533 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdzxt\" (UniqueName: \"kubernetes.io/projected/3b12002d-6940-4bb5-83d0-86bc6add52f8-kube-api-access-tdzxt\") pod \"3b12002d-6940-4bb5-83d0-86bc6add52f8\" (UID: \"3b12002d-6940-4bb5-83d0-86bc6add52f8\") " Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.153647 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b12002d-6940-4bb5-83d0-86bc6add52f8-kube-api-access-tdzxt" (OuterVolumeSpecName: "kube-api-access-tdzxt") pod "3b12002d-6940-4bb5-83d0-86bc6add52f8" (UID: "3b12002d-6940-4bb5-83d0-86bc6add52f8"). InnerVolumeSpecName "kube-api-access-tdzxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.155789 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3b12002d-6940-4bb5-83d0-86bc6add52f8" (UID: "3b12002d-6940-4bb5-83d0-86bc6add52f8"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.187472 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b12002d-6940-4bb5-83d0-86bc6add52f8" (UID: "3b12002d-6940-4bb5-83d0-86bc6add52f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.202633 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-958555d6d-xlr8p" podUID="cbba9375-8a42-4c43-9b62-0b2df2e89af1" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.217:8000/healthcheck\": read tcp 10.217.0.2:54868->10.217.0.217:8000: read: connection reset by peer" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.235815 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3b12002d-6940-4bb5-83d0-86bc6add52f8" (UID: "3b12002d-6940-4bb5-83d0-86bc6add52f8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.245579 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-config-data" (OuterVolumeSpecName: "config-data") pod "3b12002d-6940-4bb5-83d0-86bc6add52f8" (UID: "3b12002d-6940-4bb5-83d0-86bc6add52f8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.246931 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.246959 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.246970 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.246980 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdzxt\" (UniqueName: \"kubernetes.io/projected/3b12002d-6940-4bb5-83d0-86bc6add52f8-kube-api-access-tdzxt\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.246991 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.266772 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3b12002d-6940-4bb5-83d0-86bc6add52f8" (UID: "3b12002d-6940-4bb5-83d0-86bc6add52f8"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.349737 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b12002d-6940-4bb5-83d0-86bc6add52f8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.745299 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.858110 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-internal-tls-certs\") pod \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.858177 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-combined-ca-bundle\") pod \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.858300 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-public-tls-certs\") pod \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.858368 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-config-data-custom\") pod \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 
00:49:55.858470 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-config-data\") pod \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.858554 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrxpk\" (UniqueName: \"kubernetes.io/projected/cbba9375-8a42-4c43-9b62-0b2df2e89af1-kube-api-access-zrxpk\") pod \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\" (UID: \"cbba9375-8a42-4c43-9b62-0b2df2e89af1\") " Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.864667 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cbba9375-8a42-4c43-9b62-0b2df2e89af1" (UID: "cbba9375-8a42-4c43-9b62-0b2df2e89af1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.869646 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbba9375-8a42-4c43-9b62-0b2df2e89af1-kube-api-access-zrxpk" (OuterVolumeSpecName: "kube-api-access-zrxpk") pod "cbba9375-8a42-4c43-9b62-0b2df2e89af1" (UID: "cbba9375-8a42-4c43-9b62-0b2df2e89af1"). InnerVolumeSpecName "kube-api-access-zrxpk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.909951 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7f4c59689b-88k7q" event={"ID":"3b12002d-6940-4bb5-83d0-86bc6add52f8","Type":"ContainerDied","Data":"c061768949fcb4cec5afa25f4e0ede8d4863d2172fe14fa531b4def0b82e30e2"} Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.910021 4755 scope.go:117] "RemoveContainer" containerID="4878c6fedfd583b26c183a96cf890bb54665c443e20ef74c5c09fb930f2981b0" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.910134 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7f4c59689b-88k7q" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.910448 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbba9375-8a42-4c43-9b62-0b2df2e89af1" (UID: "cbba9375-8a42-4c43-9b62-0b2df2e89af1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.969901 4755 generic.go:334] "Generic (PLEG): container finished" podID="cbba9375-8a42-4c43-9b62-0b2df2e89af1" containerID="e90ed2fc0f167a7ddfec4ecb03def223498e6aa9a1c3f7aa6dba3f6c4979e063" exitCode=0 Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.969947 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-958555d6d-xlr8p" event={"ID":"cbba9375-8a42-4c43-9b62-0b2df2e89af1","Type":"ContainerDied","Data":"e90ed2fc0f167a7ddfec4ecb03def223498e6aa9a1c3f7aa6dba3f6c4979e063"} Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.969974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-958555d6d-xlr8p" event={"ID":"cbba9375-8a42-4c43-9b62-0b2df2e89af1","Type":"ContainerDied","Data":"e15bd65c7cafc484012e9ad7b02e586a9f88eeb59d7bbca449ba914f9909cdcd"} Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.970053 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-958555d6d-xlr8p" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.981045 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.981075 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrxpk\" (UniqueName: \"kubernetes.io/projected/cbba9375-8a42-4c43-9b62-0b2df2e89af1-kube-api-access-zrxpk\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:55 crc kubenswrapper[4755]: I0317 00:49:55.981085 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:56 crc kubenswrapper[4755]: I0317 00:49:56.094164 4755 scope.go:117] "RemoveContainer" containerID="e90ed2fc0f167a7ddfec4ecb03def223498e6aa9a1c3f7aa6dba3f6c4979e063" Mar 17 00:49:56 crc kubenswrapper[4755]: I0317 00:49:56.094333 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7f4c59689b-88k7q"] Mar 17 00:49:56 crc kubenswrapper[4755]: I0317 00:49:56.103915 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cbba9375-8a42-4c43-9b62-0b2df2e89af1" (UID: "cbba9375-8a42-4c43-9b62-0b2df2e89af1"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:49:56 crc kubenswrapper[4755]: I0317 00:49:56.104332 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cbba9375-8a42-4c43-9b62-0b2df2e89af1" (UID: "cbba9375-8a42-4c43-9b62-0b2df2e89af1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:49:56 crc kubenswrapper[4755]: I0317 00:49:56.105670 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-config-data" (OuterVolumeSpecName: "config-data") pod "cbba9375-8a42-4c43-9b62-0b2df2e89af1" (UID: "cbba9375-8a42-4c43-9b62-0b2df2e89af1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:49:56 crc kubenswrapper[4755]: I0317 00:49:56.114653 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7f4c59689b-88k7q"] Mar 17 00:49:56 crc kubenswrapper[4755]: I0317 00:49:56.165921 4755 scope.go:117] "RemoveContainer" containerID="e90ed2fc0f167a7ddfec4ecb03def223498e6aa9a1c3f7aa6dba3f6c4979e063" Mar 17 00:49:56 crc kubenswrapper[4755]: E0317 00:49:56.183704 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e90ed2fc0f167a7ddfec4ecb03def223498e6aa9a1c3f7aa6dba3f6c4979e063\": container with ID starting with e90ed2fc0f167a7ddfec4ecb03def223498e6aa9a1c3f7aa6dba3f6c4979e063 not found: ID does not exist" containerID="e90ed2fc0f167a7ddfec4ecb03def223498e6aa9a1c3f7aa6dba3f6c4979e063" Mar 17 00:49:56 crc kubenswrapper[4755]: I0317 00:49:56.183760 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e90ed2fc0f167a7ddfec4ecb03def223498e6aa9a1c3f7aa6dba3f6c4979e063"} err="failed to get container status 
\"e90ed2fc0f167a7ddfec4ecb03def223498e6aa9a1c3f7aa6dba3f6c4979e063\": rpc error: code = NotFound desc = could not find container \"e90ed2fc0f167a7ddfec4ecb03def223498e6aa9a1c3f7aa6dba3f6c4979e063\": container with ID starting with e90ed2fc0f167a7ddfec4ecb03def223498e6aa9a1c3f7aa6dba3f6c4979e063 not found: ID does not exist" Mar 17 00:49:56 crc kubenswrapper[4755]: I0317 00:49:56.192731 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:56 crc kubenswrapper[4755]: I0317 00:49:56.192768 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:56 crc kubenswrapper[4755]: I0317 00:49:56.192779 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbba9375-8a42-4c43-9b62-0b2df2e89af1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:49:56 crc kubenswrapper[4755]: I0317 00:49:56.319234 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b12002d-6940-4bb5-83d0-86bc6add52f8" path="/var/lib/kubelet/pods/3b12002d-6940-4bb5-83d0-86bc6add52f8/volumes" Mar 17 00:49:56 crc kubenswrapper[4755]: I0317 00:49:56.334506 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-958555d6d-xlr8p"] Mar 17 00:49:56 crc kubenswrapper[4755]: I0317 00:49:56.352938 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-958555d6d-xlr8p"] Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.923698 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn"] Mar 17 00:49:57 crc kubenswrapper[4755]: E0317 00:49:57.925046 4755 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3b12002d-6940-4bb5-83d0-86bc6add52f8" containerName="heat-api" Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.925072 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b12002d-6940-4bb5-83d0-86bc6add52f8" containerName="heat-api" Mar 17 00:49:57 crc kubenswrapper[4755]: E0317 00:49:57.925095 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb7bb3e-0c4a-4a6d-9557-e11043b3646c" containerName="dnsmasq-dns" Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.925108 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb7bb3e-0c4a-4a6d-9557-e11043b3646c" containerName="dnsmasq-dns" Mar 17 00:49:57 crc kubenswrapper[4755]: E0317 00:49:57.925129 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb7bb3e-0c4a-4a6d-9557-e11043b3646c" containerName="init" Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.925141 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb7bb3e-0c4a-4a6d-9557-e11043b3646c" containerName="init" Mar 17 00:49:57 crc kubenswrapper[4755]: E0317 00:49:57.925194 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbba9375-8a42-4c43-9b62-0b2df2e89af1" containerName="heat-cfnapi" Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.925206 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbba9375-8a42-4c43-9b62-0b2df2e89af1" containerName="heat-cfnapi" Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.925605 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbba9375-8a42-4c43-9b62-0b2df2e89af1" containerName="heat-cfnapi" Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.925633 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fb7bb3e-0c4a-4a6d-9557-e11043b3646c" containerName="dnsmasq-dns" Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.925655 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b12002d-6940-4bb5-83d0-86bc6add52f8" 
containerName="heat-api" Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.926908 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.934611 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.934859 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.934991 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.936300 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.937302 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn\" (UID: \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.937421 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn\" (UID: \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.937787 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn\" (UID: \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.937881 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpknn\" (UniqueName: \"kubernetes.io/projected/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-kube-api-access-xpknn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn\" (UID: \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" Mar 17 00:49:57 crc kubenswrapper[4755]: I0317 00:49:57.938730 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn"] Mar 17 00:49:58 crc kubenswrapper[4755]: I0317 00:49:58.044373 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn\" (UID: \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" Mar 17 00:49:58 crc kubenswrapper[4755]: I0317 00:49:58.044518 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpknn\" (UniqueName: \"kubernetes.io/projected/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-kube-api-access-xpknn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn\" (UID: \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" Mar 17 00:49:58 crc kubenswrapper[4755]: I0317 00:49:58.044588 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn\" (UID: \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" Mar 17 00:49:58 crc kubenswrapper[4755]: I0317 00:49:58.044642 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn\" (UID: \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" Mar 17 00:49:58 crc kubenswrapper[4755]: I0317 00:49:58.050358 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn\" (UID: \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" Mar 17 00:49:58 crc kubenswrapper[4755]: I0317 00:49:58.050481 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn\" (UID: \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" Mar 17 00:49:58 crc kubenswrapper[4755]: I0317 00:49:58.050833 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-repo-setup-combined-ca-bundle\") 
pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn\" (UID: \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" Mar 17 00:49:58 crc kubenswrapper[4755]: I0317 00:49:58.067840 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpknn\" (UniqueName: \"kubernetes.io/projected/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-kube-api-access-xpknn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn\" (UID: \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" Mar 17 00:49:58 crc kubenswrapper[4755]: I0317 00:49:58.264588 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbba9375-8a42-4c43-9b62-0b2df2e89af1" path="/var/lib/kubelet/pods/cbba9375-8a42-4c43-9b62-0b2df2e89af1/volumes" Mar 17 00:49:58 crc kubenswrapper[4755]: I0317 00:49:58.271422 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" Mar 17 00:49:58 crc kubenswrapper[4755]: I0317 00:49:58.665559 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:49:58 crc kubenswrapper[4755]: I0317 00:49:58.665959 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:49:58 crc kubenswrapper[4755]: I0317 00:49:58.666018 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:49:58 crc kubenswrapper[4755]: I0317 00:49:58.667054 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 00:49:58 crc kubenswrapper[4755]: I0317 00:49:58.667126 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" gracePeriod=600 Mar 17 00:49:58 crc kubenswrapper[4755]: E0317 00:49:58.789511 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:49:58 crc kubenswrapper[4755]: I0317 00:49:58.942131 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn"] Mar 17 00:49:58 crc kubenswrapper[4755]: W0317 00:49:58.942888 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode18a331f_c8bb_46a9_ae90_38ffc6104a4d.slice/crio-c2599c3839f92e72461f621809073773197256462d739e4239d8deed442a3a85 WatchSource:0}: Error finding container c2599c3839f92e72461f621809073773197256462d739e4239d8deed442a3a85: Status 
404 returned error can't find the container with id c2599c3839f92e72461f621809073773197256462d739e4239d8deed442a3a85 Mar 17 00:49:59 crc kubenswrapper[4755]: I0317 00:49:59.023960 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" exitCode=0 Mar 17 00:49:59 crc kubenswrapper[4755]: I0317 00:49:59.024046 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834"} Mar 17 00:49:59 crc kubenswrapper[4755]: I0317 00:49:59.024098 4755 scope.go:117] "RemoveContainer" containerID="38fb594cd84460a45d3465d21f6d2658b58fe7d697877c14788a3d78ce3aa72f" Mar 17 00:49:59 crc kubenswrapper[4755]: I0317 00:49:59.024827 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:49:59 crc kubenswrapper[4755]: E0317 00:49:59.025144 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:49:59 crc kubenswrapper[4755]: I0317 00:49:59.026112 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" event={"ID":"e18a331f-c8bb-46a9-ae90-38ffc6104a4d","Type":"ContainerStarted","Data":"c2599c3839f92e72461f621809073773197256462d739e4239d8deed442a3a85"} Mar 17 00:49:59 crc kubenswrapper[4755]: I0317 00:49:59.970255 4755 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6dccf8ffb7-fvtwz" Mar 17 00:50:00 crc kubenswrapper[4755]: I0317 00:50:00.037413 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-644dcb55b6-q7jd4"] Mar 17 00:50:00 crc kubenswrapper[4755]: I0317 00:50:00.037607 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-644dcb55b6-q7jd4" podUID="180ff5f7-b121-458f-b938-d06977e1f610" containerName="heat-engine" containerID="cri-o://9d500d820d304ce3f1d720ab213a4d6a390561511c5b7cb07c191c0da5b4fae5" gracePeriod=60 Mar 17 00:50:00 crc kubenswrapper[4755]: I0317 00:50:00.153191 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561810-tgzzp"] Mar 17 00:50:00 crc kubenswrapper[4755]: I0317 00:50:00.154607 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561810-tgzzp" Mar 17 00:50:00 crc kubenswrapper[4755]: I0317 00:50:00.157721 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 00:50:00 crc kubenswrapper[4755]: I0317 00:50:00.157928 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 00:50:00 crc kubenswrapper[4755]: I0317 00:50:00.158067 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 00:50:00 crc kubenswrapper[4755]: I0317 00:50:00.163097 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561810-tgzzp"] Mar 17 00:50:00 crc kubenswrapper[4755]: I0317 00:50:00.291559 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl45d\" (UniqueName: \"kubernetes.io/projected/1c560869-7a4d-41f5-a9aa-3d57d3f0be2e-kube-api-access-gl45d\") pod 
\"auto-csr-approver-29561810-tgzzp\" (UID: \"1c560869-7a4d-41f5-a9aa-3d57d3f0be2e\") " pod="openshift-infra/auto-csr-approver-29561810-tgzzp" Mar 17 00:50:00 crc kubenswrapper[4755]: I0317 00:50:00.393914 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl45d\" (UniqueName: \"kubernetes.io/projected/1c560869-7a4d-41f5-a9aa-3d57d3f0be2e-kube-api-access-gl45d\") pod \"auto-csr-approver-29561810-tgzzp\" (UID: \"1c560869-7a4d-41f5-a9aa-3d57d3f0be2e\") " pod="openshift-infra/auto-csr-approver-29561810-tgzzp" Mar 17 00:50:00 crc kubenswrapper[4755]: I0317 00:50:00.413508 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl45d\" (UniqueName: \"kubernetes.io/projected/1c560869-7a4d-41f5-a9aa-3d57d3f0be2e-kube-api-access-gl45d\") pod \"auto-csr-approver-29561810-tgzzp\" (UID: \"1c560869-7a4d-41f5-a9aa-3d57d3f0be2e\") " pod="openshift-infra/auto-csr-approver-29561810-tgzzp" Mar 17 00:50:00 crc kubenswrapper[4755]: I0317 00:50:00.501729 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561810-tgzzp" Mar 17 00:50:01 crc kubenswrapper[4755]: I0317 00:50:01.075010 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561810-tgzzp"] Mar 17 00:50:02 crc kubenswrapper[4755]: I0317 00:50:02.077524 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561810-tgzzp" event={"ID":"1c560869-7a4d-41f5-a9aa-3d57d3f0be2e","Type":"ContainerStarted","Data":"7de848cfaff32647bf72fccf3cec6346301fb1968ae158e9ce420ab2658dd5f5"} Mar 17 00:50:03 crc kubenswrapper[4755]: I0317 00:50:03.111662 4755 generic.go:334] "Generic (PLEG): container finished" podID="1c560869-7a4d-41f5-a9aa-3d57d3f0be2e" containerID="17408614c84f9a3809d69c3d47eba304724c756bf6d51f7772b56e073b2636bd" exitCode=0 Mar 17 00:50:03 crc kubenswrapper[4755]: I0317 00:50:03.112214 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561810-tgzzp" event={"ID":"1c560869-7a4d-41f5-a9aa-3d57d3f0be2e","Type":"ContainerDied","Data":"17408614c84f9a3809d69c3d47eba304724c756bf6d51f7772b56e073b2636bd"} Mar 17 00:50:05 crc kubenswrapper[4755]: E0317 00:50:05.803253 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9d500d820d304ce3f1d720ab213a4d6a390561511c5b7cb07c191c0da5b4fae5" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 17 00:50:05 crc kubenswrapper[4755]: E0317 00:50:05.804563 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9d500d820d304ce3f1d720ab213a4d6a390561511c5b7cb07c191c0da5b4fae5" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 17 00:50:05 crc kubenswrapper[4755]: E0317 00:50:05.805643 
4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9d500d820d304ce3f1d720ab213a4d6a390561511c5b7cb07c191c0da5b4fae5" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 17 00:50:05 crc kubenswrapper[4755]: E0317 00:50:05.805719 4755 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-644dcb55b6-q7jd4" podUID="180ff5f7-b121-458f-b938-d06977e1f610" containerName="heat-engine" Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.170062 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-9fw58"] Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.178196 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-9fw58"] Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.291330 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da1f9758-4126-425e-863f-23dbf247cc32" path="/var/lib/kubelet/pods/da1f9758-4126-425e-863f-23dbf247cc32/volumes" Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.292193 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-q9ck4"] Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.296089 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-q9ck4" Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.297019 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-q9ck4"] Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.299080 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.376492 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr2km\" (UniqueName: \"kubernetes.io/projected/3329d5af-f897-4463-b17d-cbe601800d3c-kube-api-access-tr2km\") pod \"aodh-db-sync-q9ck4\" (UID: \"3329d5af-f897-4463-b17d-cbe601800d3c\") " pod="openstack/aodh-db-sync-q9ck4" Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.376827 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-scripts\") pod \"aodh-db-sync-q9ck4\" (UID: \"3329d5af-f897-4463-b17d-cbe601800d3c\") " pod="openstack/aodh-db-sync-q9ck4" Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.376875 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-config-data\") pod \"aodh-db-sync-q9ck4\" (UID: \"3329d5af-f897-4463-b17d-cbe601800d3c\") " pod="openstack/aodh-db-sync-q9ck4" Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.377129 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-combined-ca-bundle\") pod \"aodh-db-sync-q9ck4\" (UID: \"3329d5af-f897-4463-b17d-cbe601800d3c\") " pod="openstack/aodh-db-sync-q9ck4" Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.479625 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr2km\" (UniqueName: \"kubernetes.io/projected/3329d5af-f897-4463-b17d-cbe601800d3c-kube-api-access-tr2km\") pod \"aodh-db-sync-q9ck4\" (UID: \"3329d5af-f897-4463-b17d-cbe601800d3c\") " pod="openstack/aodh-db-sync-q9ck4" Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.479839 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-scripts\") pod \"aodh-db-sync-q9ck4\" (UID: \"3329d5af-f897-4463-b17d-cbe601800d3c\") " pod="openstack/aodh-db-sync-q9ck4" Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.479864 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-config-data\") pod \"aodh-db-sync-q9ck4\" (UID: \"3329d5af-f897-4463-b17d-cbe601800d3c\") " pod="openstack/aodh-db-sync-q9ck4" Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.481148 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-combined-ca-bundle\") pod \"aodh-db-sync-q9ck4\" (UID: \"3329d5af-f897-4463-b17d-cbe601800d3c\") " pod="openstack/aodh-db-sync-q9ck4" Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.488631 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-config-data\") pod \"aodh-db-sync-q9ck4\" (UID: \"3329d5af-f897-4463-b17d-cbe601800d3c\") " pod="openstack/aodh-db-sync-q9ck4" Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.495153 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-scripts\") pod 
\"aodh-db-sync-q9ck4\" (UID: \"3329d5af-f897-4463-b17d-cbe601800d3c\") " pod="openstack/aodh-db-sync-q9ck4" Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.502466 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-combined-ca-bundle\") pod \"aodh-db-sync-q9ck4\" (UID: \"3329d5af-f897-4463-b17d-cbe601800d3c\") " pod="openstack/aodh-db-sync-q9ck4" Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.505322 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr2km\" (UniqueName: \"kubernetes.io/projected/3329d5af-f897-4463-b17d-cbe601800d3c-kube-api-access-tr2km\") pod \"aodh-db-sync-q9ck4\" (UID: \"3329d5af-f897-4463-b17d-cbe601800d3c\") " pod="openstack/aodh-db-sync-q9ck4" Mar 17 00:50:08 crc kubenswrapper[4755]: I0317 00:50:08.624399 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-q9ck4" Mar 17 00:50:09 crc kubenswrapper[4755]: I0317 00:50:09.248757 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:50:09 crc kubenswrapper[4755]: E0317 00:50:09.249059 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:50:10 crc kubenswrapper[4755]: I0317 00:50:10.104240 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 00:50:10 crc kubenswrapper[4755]: I0317 00:50:10.224765 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29561810-tgzzp" event={"ID":"1c560869-7a4d-41f5-a9aa-3d57d3f0be2e","Type":"ContainerDied","Data":"7de848cfaff32647bf72fccf3cec6346301fb1968ae158e9ce420ab2658dd5f5"} Mar 17 00:50:10 crc kubenswrapper[4755]: I0317 00:50:10.225068 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7de848cfaff32647bf72fccf3cec6346301fb1968ae158e9ce420ab2658dd5f5" Mar 17 00:50:10 crc kubenswrapper[4755]: I0317 00:50:10.306763 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561810-tgzzp" Mar 17 00:50:10 crc kubenswrapper[4755]: I0317 00:50:10.447568 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl45d\" (UniqueName: \"kubernetes.io/projected/1c560869-7a4d-41f5-a9aa-3d57d3f0be2e-kube-api-access-gl45d\") pod \"1c560869-7a4d-41f5-a9aa-3d57d3f0be2e\" (UID: \"1c560869-7a4d-41f5-a9aa-3d57d3f0be2e\") " Mar 17 00:50:10 crc kubenswrapper[4755]: I0317 00:50:10.451602 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c560869-7a4d-41f5-a9aa-3d57d3f0be2e-kube-api-access-gl45d" (OuterVolumeSpecName: "kube-api-access-gl45d") pod "1c560869-7a4d-41f5-a9aa-3d57d3f0be2e" (UID: "1c560869-7a4d-41f5-a9aa-3d57d3f0be2e"). InnerVolumeSpecName "kube-api-access-gl45d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:50:10 crc kubenswrapper[4755]: I0317 00:50:10.506449 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-q9ck4"] Mar 17 00:50:10 crc kubenswrapper[4755]: W0317 00:50:10.521099 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3329d5af_f897_4463_b17d_cbe601800d3c.slice/crio-c043f08c9ea8c80b53f712599c01cf75913a120001a1b9c4a0dee422d586a76b WatchSource:0}: Error finding container c043f08c9ea8c80b53f712599c01cf75913a120001a1b9c4a0dee422d586a76b: Status 404 returned error can't find the container with id c043f08c9ea8c80b53f712599c01cf75913a120001a1b9c4a0dee422d586a76b Mar 17 00:50:10 crc kubenswrapper[4755]: I0317 00:50:10.549908 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl45d\" (UniqueName: \"kubernetes.io/projected/1c560869-7a4d-41f5-a9aa-3d57d3f0be2e-kube-api-access-gl45d\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:11 crc kubenswrapper[4755]: I0317 00:50:11.236594 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-q9ck4" event={"ID":"3329d5af-f897-4463-b17d-cbe601800d3c","Type":"ContainerStarted","Data":"c043f08c9ea8c80b53f712599c01cf75913a120001a1b9c4a0dee422d586a76b"} Mar 17 00:50:11 crc kubenswrapper[4755]: I0317 00:50:11.242780 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561810-tgzzp" Mar 17 00:50:11 crc kubenswrapper[4755]: I0317 00:50:11.242948 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" event={"ID":"e18a331f-c8bb-46a9-ae90-38ffc6104a4d","Type":"ContainerStarted","Data":"fb1fdf85cce1c6d66554fe1daa52304cdabefce08318c8ac6e59cd67c50b869f"} Mar 17 00:50:11 crc kubenswrapper[4755]: I0317 00:50:11.284483 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" podStartSLOduration=3.131345458 podStartE2EDuration="14.284462335s" podCreationTimestamp="2026-03-17 00:49:57 +0000 UTC" firstStartedPulling="2026-03-17 00:49:58.948680102 +0000 UTC m=+1673.708132395" lastFinishedPulling="2026-03-17 00:50:10.101796989 +0000 UTC m=+1684.861249272" observedRunningTime="2026-03-17 00:50:11.270921625 +0000 UTC m=+1686.030373908" watchObservedRunningTime="2026-03-17 00:50:11.284462335 +0000 UTC m=+1686.043914638" Mar 17 00:50:11 crc kubenswrapper[4755]: I0317 00:50:11.378093 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561804-jbzg7"] Mar 17 00:50:11 crc kubenswrapper[4755]: I0317 00:50:11.390510 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561804-jbzg7"] Mar 17 00:50:12 crc kubenswrapper[4755]: I0317 00:50:12.262397 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ab9a2f0-b327-4c8a-a7ec-97930918a1ac" path="/var/lib/kubelet/pods/6ab9a2f0-b327-4c8a-a7ec-97930918a1ac/volumes" Mar 17 00:50:15 crc kubenswrapper[4755]: E0317 00:50:15.802075 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9d500d820d304ce3f1d720ab213a4d6a390561511c5b7cb07c191c0da5b4fae5 is running failed: container process not found" 
containerID="9d500d820d304ce3f1d720ab213a4d6a390561511c5b7cb07c191c0da5b4fae5" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 17 00:50:15 crc kubenswrapper[4755]: E0317 00:50:15.803678 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9d500d820d304ce3f1d720ab213a4d6a390561511c5b7cb07c191c0da5b4fae5 is running failed: container process not found" containerID="9d500d820d304ce3f1d720ab213a4d6a390561511c5b7cb07c191c0da5b4fae5" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 17 00:50:15 crc kubenswrapper[4755]: E0317 00:50:15.804214 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9d500d820d304ce3f1d720ab213a4d6a390561511c5b7cb07c191c0da5b4fae5 is running failed: container process not found" containerID="9d500d820d304ce3f1d720ab213a4d6a390561511c5b7cb07c191c0da5b4fae5" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 17 00:50:15 crc kubenswrapper[4755]: E0317 00:50:15.804291 4755 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9d500d820d304ce3f1d720ab213a4d6a390561511c5b7cb07c191c0da5b4fae5 is running failed: container process not found" probeType="Readiness" pod="openstack/heat-engine-644dcb55b6-q7jd4" podUID="180ff5f7-b121-458f-b938-d06977e1f610" containerName="heat-engine" Mar 17 00:50:16 crc kubenswrapper[4755]: I0317 00:50:16.316772 4755 generic.go:334] "Generic (PLEG): container finished" podID="180ff5f7-b121-458f-b938-d06977e1f610" containerID="9d500d820d304ce3f1d720ab213a4d6a390561511c5b7cb07c191c0da5b4fae5" exitCode=0 Mar 17 00:50:16 crc kubenswrapper[4755]: I0317 00:50:16.317125 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-644dcb55b6-q7jd4" 
event={"ID":"180ff5f7-b121-458f-b938-d06977e1f610","Type":"ContainerDied","Data":"9d500d820d304ce3f1d720ab213a4d6a390561511c5b7cb07c191c0da5b4fae5"} Mar 17 00:50:16 crc kubenswrapper[4755]: I0317 00:50:16.691087 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 00:50:16 crc kubenswrapper[4755]: I0317 00:50:16.795783 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz5hr\" (UniqueName: \"kubernetes.io/projected/180ff5f7-b121-458f-b938-d06977e1f610-kube-api-access-gz5hr\") pod \"180ff5f7-b121-458f-b938-d06977e1f610\" (UID: \"180ff5f7-b121-458f-b938-d06977e1f610\") " Mar 17 00:50:16 crc kubenswrapper[4755]: I0317 00:50:16.795920 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-config-data\") pod \"180ff5f7-b121-458f-b938-d06977e1f610\" (UID: \"180ff5f7-b121-458f-b938-d06977e1f610\") " Mar 17 00:50:16 crc kubenswrapper[4755]: I0317 00:50:16.796135 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-config-data-custom\") pod \"180ff5f7-b121-458f-b938-d06977e1f610\" (UID: \"180ff5f7-b121-458f-b938-d06977e1f610\") " Mar 17 00:50:16 crc kubenswrapper[4755]: I0317 00:50:16.796620 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-combined-ca-bundle\") pod \"180ff5f7-b121-458f-b938-d06977e1f610\" (UID: \"180ff5f7-b121-458f-b938-d06977e1f610\") " Mar 17 00:50:16 crc kubenswrapper[4755]: I0317 00:50:16.801257 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/180ff5f7-b121-458f-b938-d06977e1f610-kube-api-access-gz5hr" 
(OuterVolumeSpecName: "kube-api-access-gz5hr") pod "180ff5f7-b121-458f-b938-d06977e1f610" (UID: "180ff5f7-b121-458f-b938-d06977e1f610"). InnerVolumeSpecName "kube-api-access-gz5hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:50:16 crc kubenswrapper[4755]: I0317 00:50:16.801424 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "180ff5f7-b121-458f-b938-d06977e1f610" (UID: "180ff5f7-b121-458f-b938-d06977e1f610"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:50:16 crc kubenswrapper[4755]: I0317 00:50:16.827568 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "180ff5f7-b121-458f-b938-d06977e1f610" (UID: "180ff5f7-b121-458f-b938-d06977e1f610"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:50:16 crc kubenswrapper[4755]: I0317 00:50:16.850146 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-config-data" (OuterVolumeSpecName: "config-data") pod "180ff5f7-b121-458f-b938-d06977e1f610" (UID: "180ff5f7-b121-458f-b938-d06977e1f610"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:50:16 crc kubenswrapper[4755]: I0317 00:50:16.899618 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:16 crc kubenswrapper[4755]: I0317 00:50:16.899681 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:16 crc kubenswrapper[4755]: I0317 00:50:16.899702 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gz5hr\" (UniqueName: \"kubernetes.io/projected/180ff5f7-b121-458f-b938-d06977e1f610-kube-api-access-gz5hr\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:16 crc kubenswrapper[4755]: I0317 00:50:16.899722 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180ff5f7-b121-458f-b938-d06977e1f610-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:17 crc kubenswrapper[4755]: I0317 00:50:17.331290 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-644dcb55b6-q7jd4" event={"ID":"180ff5f7-b121-458f-b938-d06977e1f610","Type":"ContainerDied","Data":"6d3fe709f3c796e95850e0206909d027c06bd02b294da028585ab72b30461eed"} Mar 17 00:50:17 crc kubenswrapper[4755]: I0317 00:50:17.331711 4755 scope.go:117] "RemoveContainer" containerID="9d500d820d304ce3f1d720ab213a4d6a390561511c5b7cb07c191c0da5b4fae5" Mar 17 00:50:17 crc kubenswrapper[4755]: I0317 00:50:17.331517 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-644dcb55b6-q7jd4" Mar 17 00:50:17 crc kubenswrapper[4755]: I0317 00:50:17.335127 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-q9ck4" event={"ID":"3329d5af-f897-4463-b17d-cbe601800d3c","Type":"ContainerStarted","Data":"9a654e3714e51944c58cac1513b653fbf3b4c1dad674e0d21acc4fc6a5054c7e"} Mar 17 00:50:17 crc kubenswrapper[4755]: I0317 00:50:17.366913 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-q9ck4" podStartSLOduration=3.7415534790000002 podStartE2EDuration="9.366893126s" podCreationTimestamp="2026-03-17 00:50:08 +0000 UTC" firstStartedPulling="2026-03-17 00:50:10.524187855 +0000 UTC m=+1685.283640138" lastFinishedPulling="2026-03-17 00:50:16.149527502 +0000 UTC m=+1690.908979785" observedRunningTime="2026-03-17 00:50:17.360294785 +0000 UTC m=+1692.119747068" watchObservedRunningTime="2026-03-17 00:50:17.366893126 +0000 UTC m=+1692.126345409" Mar 17 00:50:17 crc kubenswrapper[4755]: I0317 00:50:17.389696 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-644dcb55b6-q7jd4"] Mar 17 00:50:17 crc kubenswrapper[4755]: I0317 00:50:17.405115 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-644dcb55b6-q7jd4"] Mar 17 00:50:18 crc kubenswrapper[4755]: I0317 00:50:18.282185 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="180ff5f7-b121-458f-b938-d06977e1f610" path="/var/lib/kubelet/pods/180ff5f7-b121-458f-b938-d06977e1f610/volumes" Mar 17 00:50:20 crc kubenswrapper[4755]: I0317 00:50:19.373047 4755 generic.go:334] "Generic (PLEG): container finished" podID="3329d5af-f897-4463-b17d-cbe601800d3c" containerID="9a654e3714e51944c58cac1513b653fbf3b4c1dad674e0d21acc4fc6a5054c7e" exitCode=0 Mar 17 00:50:20 crc kubenswrapper[4755]: I0317 00:50:19.373093 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-q9ck4" 
event={"ID":"3329d5af-f897-4463-b17d-cbe601800d3c","Type":"ContainerDied","Data":"9a654e3714e51944c58cac1513b653fbf3b4c1dad674e0d21acc4fc6a5054c7e"} Mar 17 00:50:21 crc kubenswrapper[4755]: I0317 00:50:21.251752 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:50:21 crc kubenswrapper[4755]: E0317 00:50:21.252762 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:50:21 crc kubenswrapper[4755]: I0317 00:50:21.357888 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-q9ck4" Mar 17 00:50:21 crc kubenswrapper[4755]: I0317 00:50:21.401930 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-q9ck4" event={"ID":"3329d5af-f897-4463-b17d-cbe601800d3c","Type":"ContainerDied","Data":"c043f08c9ea8c80b53f712599c01cf75913a120001a1b9c4a0dee422d586a76b"} Mar 17 00:50:21 crc kubenswrapper[4755]: I0317 00:50:21.401972 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c043f08c9ea8c80b53f712599c01cf75913a120001a1b9c4a0dee422d586a76b" Mar 17 00:50:21 crc kubenswrapper[4755]: I0317 00:50:21.402509 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-q9ck4" Mar 17 00:50:21 crc kubenswrapper[4755]: I0317 00:50:21.462391 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-combined-ca-bundle\") pod \"3329d5af-f897-4463-b17d-cbe601800d3c\" (UID: \"3329d5af-f897-4463-b17d-cbe601800d3c\") " Mar 17 00:50:21 crc kubenswrapper[4755]: I0317 00:50:21.462457 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-config-data\") pod \"3329d5af-f897-4463-b17d-cbe601800d3c\" (UID: \"3329d5af-f897-4463-b17d-cbe601800d3c\") " Mar 17 00:50:21 crc kubenswrapper[4755]: I0317 00:50:21.462487 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr2km\" (UniqueName: \"kubernetes.io/projected/3329d5af-f897-4463-b17d-cbe601800d3c-kube-api-access-tr2km\") pod \"3329d5af-f897-4463-b17d-cbe601800d3c\" (UID: \"3329d5af-f897-4463-b17d-cbe601800d3c\") " Mar 17 00:50:21 crc kubenswrapper[4755]: I0317 00:50:21.462680 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-scripts\") pod \"3329d5af-f897-4463-b17d-cbe601800d3c\" (UID: \"3329d5af-f897-4463-b17d-cbe601800d3c\") " Mar 17 00:50:21 crc kubenswrapper[4755]: I0317 00:50:21.467845 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3329d5af-f897-4463-b17d-cbe601800d3c-kube-api-access-tr2km" (OuterVolumeSpecName: "kube-api-access-tr2km") pod "3329d5af-f897-4463-b17d-cbe601800d3c" (UID: "3329d5af-f897-4463-b17d-cbe601800d3c"). InnerVolumeSpecName "kube-api-access-tr2km". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:50:21 crc kubenswrapper[4755]: I0317 00:50:21.477736 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-scripts" (OuterVolumeSpecName: "scripts") pod "3329d5af-f897-4463-b17d-cbe601800d3c" (UID: "3329d5af-f897-4463-b17d-cbe601800d3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:50:21 crc kubenswrapper[4755]: I0317 00:50:21.491002 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3329d5af-f897-4463-b17d-cbe601800d3c" (UID: "3329d5af-f897-4463-b17d-cbe601800d3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:50:21 crc kubenswrapper[4755]: I0317 00:50:21.519389 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-config-data" (OuterVolumeSpecName: "config-data") pod "3329d5af-f897-4463-b17d-cbe601800d3c" (UID: "3329d5af-f897-4463-b17d-cbe601800d3c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:50:21 crc kubenswrapper[4755]: I0317 00:50:21.565416 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:21 crc kubenswrapper[4755]: I0317 00:50:21.565483 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:21 crc kubenswrapper[4755]: I0317 00:50:21.565496 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3329d5af-f897-4463-b17d-cbe601800d3c-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:21 crc kubenswrapper[4755]: I0317 00:50:21.565506 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr2km\" (UniqueName: \"kubernetes.io/projected/3329d5af-f897-4463-b17d-cbe601800d3c-kube-api-access-tr2km\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:22 crc kubenswrapper[4755]: I0317 00:50:22.416558 4755 generic.go:334] "Generic (PLEG): container finished" podID="e18a331f-c8bb-46a9-ae90-38ffc6104a4d" containerID="fb1fdf85cce1c6d66554fe1daa52304cdabefce08318c8ac6e59cd67c50b869f" exitCode=0 Mar 17 00:50:22 crc kubenswrapper[4755]: I0317 00:50:22.416647 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" event={"ID":"e18a331f-c8bb-46a9-ae90-38ffc6104a4d","Type":"ContainerDied","Data":"fb1fdf85cce1c6d66554fe1daa52304cdabefce08318c8ac6e59cd67c50b869f"} Mar 17 00:50:23 crc kubenswrapper[4755]: I0317 00:50:23.445944 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 17 00:50:23 crc kubenswrapper[4755]: I0317 00:50:23.446412 4755 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/aodh-0" podUID="ef911263-adb5-4a18-b726-888ec33cb66a" containerName="aodh-api" containerID="cri-o://70bddc077d3511cc9064594aef0f7e8a253e1aa1e2a8106a77471972eeea8e12" gracePeriod=30 Mar 17 00:50:23 crc kubenswrapper[4755]: I0317 00:50:23.446920 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ef911263-adb5-4a18-b726-888ec33cb66a" containerName="aodh-evaluator" containerID="cri-o://c47c6f774e1ffc33a220736fe324ab9a87ea0df635326870d923007284d26426" gracePeriod=30 Mar 17 00:50:23 crc kubenswrapper[4755]: I0317 00:50:23.446958 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ef911263-adb5-4a18-b726-888ec33cb66a" containerName="aodh-notifier" containerID="cri-o://0a5c23131230a386b5676ad6a14217859b8188793c81bcd68455cc986f58ec34" gracePeriod=30 Mar 17 00:50:23 crc kubenswrapper[4755]: I0317 00:50:23.447083 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ef911263-adb5-4a18-b726-888ec33cb66a" containerName="aodh-listener" containerID="cri-o://c30036fbb6c41d2c7dada4fcf02401c9a4dffe0167f192a9cce1907ad319ef2a" gracePeriod=30 Mar 17 00:50:23 crc kubenswrapper[4755]: I0317 00:50:23.986740 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.122900 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-ssh-key-openstack-edpm-ipam\") pod \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\" (UID: \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\") " Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.123025 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpknn\" (UniqueName: \"kubernetes.io/projected/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-kube-api-access-xpknn\") pod \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\" (UID: \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\") " Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.123132 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-repo-setup-combined-ca-bundle\") pod \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\" (UID: \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\") " Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.123203 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-inventory\") pod \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\" (UID: \"e18a331f-c8bb-46a9-ae90-38ffc6104a4d\") " Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.130768 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-kube-api-access-xpknn" (OuterVolumeSpecName: "kube-api-access-xpknn") pod "e18a331f-c8bb-46a9-ae90-38ffc6104a4d" (UID: "e18a331f-c8bb-46a9-ae90-38ffc6104a4d"). InnerVolumeSpecName "kube-api-access-xpknn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.132235 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e18a331f-c8bb-46a9-ae90-38ffc6104a4d" (UID: "e18a331f-c8bb-46a9-ae90-38ffc6104a4d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.157844 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-inventory" (OuterVolumeSpecName: "inventory") pod "e18a331f-c8bb-46a9-ae90-38ffc6104a4d" (UID: "e18a331f-c8bb-46a9-ae90-38ffc6104a4d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.175955 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e18a331f-c8bb-46a9-ae90-38ffc6104a4d" (UID: "e18a331f-c8bb-46a9-ae90-38ffc6104a4d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.225814 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.225848 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpknn\" (UniqueName: \"kubernetes.io/projected/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-kube-api-access-xpknn\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.225858 4755 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.225868 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e18a331f-c8bb-46a9-ae90-38ffc6104a4d-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.454261 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" event={"ID":"e18a331f-c8bb-46a9-ae90-38ffc6104a4d","Type":"ContainerDied","Data":"c2599c3839f92e72461f621809073773197256462d739e4239d8deed442a3a85"} Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.454304 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2599c3839f92e72461f621809073773197256462d739e4239d8deed442a3a85" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.454363 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.460008 4755 generic.go:334] "Generic (PLEG): container finished" podID="ef911263-adb5-4a18-b726-888ec33cb66a" containerID="c47c6f774e1ffc33a220736fe324ab9a87ea0df635326870d923007284d26426" exitCode=0 Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.460037 4755 generic.go:334] "Generic (PLEG): container finished" podID="ef911263-adb5-4a18-b726-888ec33cb66a" containerID="70bddc077d3511cc9064594aef0f7e8a253e1aa1e2a8106a77471972eeea8e12" exitCode=0 Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.460059 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ef911263-adb5-4a18-b726-888ec33cb66a","Type":"ContainerDied","Data":"c47c6f774e1ffc33a220736fe324ab9a87ea0df635326870d923007284d26426"} Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.460084 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ef911263-adb5-4a18-b726-888ec33cb66a","Type":"ContainerDied","Data":"70bddc077d3511cc9064594aef0f7e8a253e1aa1e2a8106a77471972eeea8e12"} Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.596620 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp"] Mar 17 00:50:24 crc kubenswrapper[4755]: E0317 00:50:24.597012 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180ff5f7-b121-458f-b938-d06977e1f610" containerName="heat-engine" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.597027 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="180ff5f7-b121-458f-b938-d06977e1f610" containerName="heat-engine" Mar 17 00:50:24 crc kubenswrapper[4755]: E0317 00:50:24.597050 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c560869-7a4d-41f5-a9aa-3d57d3f0be2e" containerName="oc" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 
00:50:24.597057 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c560869-7a4d-41f5-a9aa-3d57d3f0be2e" containerName="oc" Mar 17 00:50:24 crc kubenswrapper[4755]: E0317 00:50:24.597069 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e18a331f-c8bb-46a9-ae90-38ffc6104a4d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.597077 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18a331f-c8bb-46a9-ae90-38ffc6104a4d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 17 00:50:24 crc kubenswrapper[4755]: E0317 00:50:24.597090 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3329d5af-f897-4463-b17d-cbe601800d3c" containerName="aodh-db-sync" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.597096 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3329d5af-f897-4463-b17d-cbe601800d3c" containerName="aodh-db-sync" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.597285 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="180ff5f7-b121-458f-b938-d06977e1f610" containerName="heat-engine" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.597301 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c560869-7a4d-41f5-a9aa-3d57d3f0be2e" containerName="oc" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.597311 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e18a331f-c8bb-46a9-ae90-38ffc6104a4d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.597324 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3329d5af-f897-4463-b17d-cbe601800d3c" containerName="aodh-db-sync" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.597987 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.601549 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.605573 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.605901 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.606283 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.609784 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp"] Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.737600 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcw85\" (UniqueName: \"kubernetes.io/projected/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-kube-api-access-xcw85\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp\" (UID: \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.737927 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp\" (UID: \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.737956 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp\" (UID: \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.737984 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp\" (UID: \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.840197 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcw85\" (UniqueName: \"kubernetes.io/projected/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-kube-api-access-xcw85\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp\" (UID: \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.840314 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp\" (UID: \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.840359 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp\" (UID: \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.840414 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp\" (UID: \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.845919 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp\" (UID: \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.846147 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp\" (UID: \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.847427 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp\" (UID: \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.867954 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcw85\" (UniqueName: \"kubernetes.io/projected/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-kube-api-access-xcw85\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp\" (UID: \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" Mar 17 00:50:24 crc kubenswrapper[4755]: I0317 00:50:24.912251 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" Mar 17 00:50:25 crc kubenswrapper[4755]: I0317 00:50:25.547625 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp"] Mar 17 00:50:25 crc kubenswrapper[4755]: I0317 00:50:25.555046 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 00:50:26 crc kubenswrapper[4755]: I0317 00:50:26.511885 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" event={"ID":"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9","Type":"ContainerStarted","Data":"41f7a57d57a0d6adecf75db40817a16821640e3614f4d1b0e21ad7474962c0c0"} Mar 17 00:50:26 crc kubenswrapper[4755]: I0317 00:50:26.512544 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" event={"ID":"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9","Type":"ContainerStarted","Data":"bacad24bad05d9c6dfc13a26a982d52776ce4435738bc2fd8f26e5f5d5bae8f9"} Mar 17 00:50:26 crc kubenswrapper[4755]: I0317 00:50:26.540193 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" podStartSLOduration=2.129620898 
podStartE2EDuration="2.540172349s" podCreationTimestamp="2026-03-17 00:50:24 +0000 UTC" firstStartedPulling="2026-03-17 00:50:25.554661632 +0000 UTC m=+1700.314113925" lastFinishedPulling="2026-03-17 00:50:25.965213073 +0000 UTC m=+1700.724665376" observedRunningTime="2026-03-17 00:50:26.529346073 +0000 UTC m=+1701.288798386" watchObservedRunningTime="2026-03-17 00:50:26.540172349 +0000 UTC m=+1701.299624642" Mar 17 00:50:29 crc kubenswrapper[4755]: I0317 00:50:29.567846 4755 generic.go:334] "Generic (PLEG): container finished" podID="ef911263-adb5-4a18-b726-888ec33cb66a" containerID="c30036fbb6c41d2c7dada4fcf02401c9a4dffe0167f192a9cce1907ad319ef2a" exitCode=0 Mar 17 00:50:29 crc kubenswrapper[4755]: I0317 00:50:29.568284 4755 generic.go:334] "Generic (PLEG): container finished" podID="ef911263-adb5-4a18-b726-888ec33cb66a" containerID="0a5c23131230a386b5676ad6a14217859b8188793c81bcd68455cc986f58ec34" exitCode=0 Mar 17 00:50:29 crc kubenswrapper[4755]: I0317 00:50:29.567950 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ef911263-adb5-4a18-b726-888ec33cb66a","Type":"ContainerDied","Data":"c30036fbb6c41d2c7dada4fcf02401c9a4dffe0167f192a9cce1907ad319ef2a"} Mar 17 00:50:29 crc kubenswrapper[4755]: I0317 00:50:29.568328 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ef911263-adb5-4a18-b726-888ec33cb66a","Type":"ContainerDied","Data":"0a5c23131230a386b5676ad6a14217859b8188793c81bcd68455cc986f58ec34"} Mar 17 00:50:29 crc kubenswrapper[4755]: I0317 00:50:29.829338 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 17 00:50:29 crc kubenswrapper[4755]: I0317 00:50:29.969141 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-config-data\") pod \"ef911263-adb5-4a18-b726-888ec33cb66a\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " Mar 17 00:50:29 crc kubenswrapper[4755]: I0317 00:50:29.969284 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-public-tls-certs\") pod \"ef911263-adb5-4a18-b726-888ec33cb66a\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " Mar 17 00:50:29 crc kubenswrapper[4755]: I0317 00:50:29.969323 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-scripts\") pod \"ef911263-adb5-4a18-b726-888ec33cb66a\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " Mar 17 00:50:29 crc kubenswrapper[4755]: I0317 00:50:29.969392 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-internal-tls-certs\") pod \"ef911263-adb5-4a18-b726-888ec33cb66a\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " Mar 17 00:50:29 crc kubenswrapper[4755]: I0317 00:50:29.969502 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-combined-ca-bundle\") pod \"ef911263-adb5-4a18-b726-888ec33cb66a\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " Mar 17 00:50:29 crc kubenswrapper[4755]: I0317 00:50:29.969578 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdgrb\" (UniqueName: 
\"kubernetes.io/projected/ef911263-adb5-4a18-b726-888ec33cb66a-kube-api-access-rdgrb\") pod \"ef911263-adb5-4a18-b726-888ec33cb66a\" (UID: \"ef911263-adb5-4a18-b726-888ec33cb66a\") " Mar 17 00:50:29 crc kubenswrapper[4755]: I0317 00:50:29.984148 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-scripts" (OuterVolumeSpecName: "scripts") pod "ef911263-adb5-4a18-b726-888ec33cb66a" (UID: "ef911263-adb5-4a18-b726-888ec33cb66a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:50:29 crc kubenswrapper[4755]: I0317 00:50:29.986845 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef911263-adb5-4a18-b726-888ec33cb66a-kube-api-access-rdgrb" (OuterVolumeSpecName: "kube-api-access-rdgrb") pod "ef911263-adb5-4a18-b726-888ec33cb66a" (UID: "ef911263-adb5-4a18-b726-888ec33cb66a"). InnerVolumeSpecName "kube-api-access-rdgrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.044955 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ef911263-adb5-4a18-b726-888ec33cb66a" (UID: "ef911263-adb5-4a18-b726-888ec33cb66a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.055576 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ef911263-adb5-4a18-b726-888ec33cb66a" (UID: "ef911263-adb5-4a18-b726-888ec33cb66a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.072518 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdgrb\" (UniqueName: \"kubernetes.io/projected/ef911263-adb5-4a18-b726-888ec33cb66a-kube-api-access-rdgrb\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.072575 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.072584 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.072594 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.092212 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-config-data" (OuterVolumeSpecName: "config-data") pod "ef911263-adb5-4a18-b726-888ec33cb66a" (UID: "ef911263-adb5-4a18-b726-888ec33cb66a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.109576 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef911263-adb5-4a18-b726-888ec33cb66a" (UID: "ef911263-adb5-4a18-b726-888ec33cb66a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.174392 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.174716 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef911263-adb5-4a18-b726-888ec33cb66a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.585930 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ef911263-adb5-4a18-b726-888ec33cb66a","Type":"ContainerDied","Data":"8e5a09a16798d85166c4499888e6ae5b45022062a7c8b4a0fd1cc97552c6e408"} Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.586024 4755 scope.go:117] "RemoveContainer" containerID="c30036fbb6c41d2c7dada4fcf02401c9a4dffe0167f192a9cce1907ad319ef2a" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.586038 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.621082 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.629376 4755 scope.go:117] "RemoveContainer" containerID="0a5c23131230a386b5676ad6a14217859b8188793c81bcd68455cc986f58ec34" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.638515 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.655617 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 17 00:50:30 crc kubenswrapper[4755]: E0317 00:50:30.656251 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef911263-adb5-4a18-b726-888ec33cb66a" containerName="aodh-api" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.656277 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef911263-adb5-4a18-b726-888ec33cb66a" containerName="aodh-api" Mar 17 00:50:30 crc kubenswrapper[4755]: E0317 00:50:30.656313 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef911263-adb5-4a18-b726-888ec33cb66a" containerName="aodh-listener" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.656326 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef911263-adb5-4a18-b726-888ec33cb66a" containerName="aodh-listener" Mar 17 00:50:30 crc kubenswrapper[4755]: E0317 00:50:30.656348 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef911263-adb5-4a18-b726-888ec33cb66a" containerName="aodh-notifier" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.656361 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef911263-adb5-4a18-b726-888ec33cb66a" containerName="aodh-notifier" Mar 17 00:50:30 crc kubenswrapper[4755]: E0317 00:50:30.656385 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef911263-adb5-4a18-b726-888ec33cb66a" 
containerName="aodh-evaluator" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.656394 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef911263-adb5-4a18-b726-888ec33cb66a" containerName="aodh-evaluator" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.656702 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef911263-adb5-4a18-b726-888ec33cb66a" containerName="aodh-api" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.656748 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef911263-adb5-4a18-b726-888ec33cb66a" containerName="aodh-evaluator" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.656770 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef911263-adb5-4a18-b726-888ec33cb66a" containerName="aodh-notifier" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.656796 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef911263-adb5-4a18-b726-888ec33cb66a" containerName="aodh-listener" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.659313 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.664592 4755 scope.go:117] "RemoveContainer" containerID="c47c6f774e1ffc33a220736fe324ab9a87ea0df635326870d923007284d26426" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.664742 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.692681 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.693753 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-5g9jb" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.693912 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.694106 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.694226 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.715705 4755 scope.go:117] "RemoveContainer" containerID="70bddc077d3511cc9064594aef0f7e8a253e1aa1e2a8106a77471972eeea8e12" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.792242 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftgtj\" (UniqueName: \"kubernetes.io/projected/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-kube-api-access-ftgtj\") pod \"aodh-0\" (UID: \"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.792302 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-scripts\") pod \"aodh-0\" (UID: \"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.792412 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-public-tls-certs\") pod \"aodh-0\" (UID: \"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.792503 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.792534 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-config-data\") pod \"aodh-0\" (UID: \"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.792580 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-internal-tls-certs\") pod \"aodh-0\" (UID: \"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.894314 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:30 crc 
kubenswrapper[4755]: I0317 00:50:30.894604 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-config-data\") pod \"aodh-0\" (UID: \"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.894659 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-internal-tls-certs\") pod \"aodh-0\" (UID: \"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.894713 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftgtj\" (UniqueName: \"kubernetes.io/projected/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-kube-api-access-ftgtj\") pod \"aodh-0\" (UID: \"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.894734 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-scripts\") pod \"aodh-0\" (UID: \"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.894810 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-public-tls-certs\") pod \"aodh-0\" (UID: \"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.898262 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-internal-tls-certs\") pod \"aodh-0\" (UID: 
\"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.898421 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.899352 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-scripts\") pod \"aodh-0\" (UID: \"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.899321 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-config-data\") pod \"aodh-0\" (UID: \"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.911035 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-public-tls-certs\") pod \"aodh-0\" (UID: \"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:30 crc kubenswrapper[4755]: I0317 00:50:30.913129 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftgtj\" (UniqueName: \"kubernetes.io/projected/60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6-kube-api-access-ftgtj\") pod \"aodh-0\" (UID: \"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6\") " pod="openstack/aodh-0" Mar 17 00:50:31 crc kubenswrapper[4755]: I0317 00:50:31.005974 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 17 00:50:31 crc kubenswrapper[4755]: I0317 00:50:31.497258 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 17 00:50:31 crc kubenswrapper[4755]: I0317 00:50:31.599531 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6","Type":"ContainerStarted","Data":"f306c83f5c1c672d735ba333f3d3aab6b9d4ba2b3b6a9d9a602ad6cd821306a1"} Mar 17 00:50:32 crc kubenswrapper[4755]: I0317 00:50:32.264262 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef911263-adb5-4a18-b726-888ec33cb66a" path="/var/lib/kubelet/pods/ef911263-adb5-4a18-b726-888ec33cb66a/volumes" Mar 17 00:50:32 crc kubenswrapper[4755]: I0317 00:50:32.610768 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6","Type":"ContainerStarted","Data":"1a3b6603320e3a2ba8dee310e49cf2bf27cc3c3c4247c4a9c0338d462474a6bd"} Mar 17 00:50:33 crc kubenswrapper[4755]: I0317 00:50:33.249392 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:50:33 crc kubenswrapper[4755]: E0317 00:50:33.250179 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:50:33 crc kubenswrapper[4755]: I0317 00:50:33.624379 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6","Type":"ContainerStarted","Data":"d969b957802709e98b3b3fa9dfc0c4cb9b18f419d1a584c9e98e0820bf9740cc"} Mar 17 
00:50:34 crc kubenswrapper[4755]: I0317 00:50:34.640220 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6","Type":"ContainerStarted","Data":"72f083bbb74dca5f7ea21c81456c8adf8a9b1f94287722808fdf5080ec703dbc"} Mar 17 00:50:35 crc kubenswrapper[4755]: I0317 00:50:35.653252 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6","Type":"ContainerStarted","Data":"474447b1cf72c0c31b705c7f87fb277333891905bfb658ce99a79009b8982f02"} Mar 17 00:50:35 crc kubenswrapper[4755]: I0317 00:50:35.688137 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.056255719 podStartE2EDuration="5.688115198s" podCreationTimestamp="2026-03-17 00:50:30 +0000 UTC" firstStartedPulling="2026-03-17 00:50:31.491938165 +0000 UTC m=+1706.251390448" lastFinishedPulling="2026-03-17 00:50:35.123797624 +0000 UTC m=+1709.883249927" observedRunningTime="2026-03-17 00:50:35.674808075 +0000 UTC m=+1710.434260378" watchObservedRunningTime="2026-03-17 00:50:35.688115198 +0000 UTC m=+1710.447567491" Mar 17 00:50:48 crc kubenswrapper[4755]: I0317 00:50:48.248631 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:50:48 crc kubenswrapper[4755]: E0317 00:50:48.249799 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:50:55 crc kubenswrapper[4755]: I0317 00:50:55.408492 4755 scope.go:117] "RemoveContainer" 
containerID="eb9c88561ae24687302fdb4089df66605538ffbc96a8570b9ac9bd2667fec2ee" Mar 17 00:50:55 crc kubenswrapper[4755]: I0317 00:50:55.450181 4755 scope.go:117] "RemoveContainer" containerID="ae14e1774c8730637bf84f356775ffbb7884d44f92f657ceed5908d174039c72" Mar 17 00:50:55 crc kubenswrapper[4755]: I0317 00:50:55.499416 4755 scope.go:117] "RemoveContainer" containerID="4a4d975184340c035241f9956f12740eafe39b9e26831f9a1fc826c3cfe177e4" Mar 17 00:50:55 crc kubenswrapper[4755]: I0317 00:50:55.543495 4755 scope.go:117] "RemoveContainer" containerID="edf8e8d35128c3c0f643bc4f209edab536c9a45653b99c0ad5aa889e46b99df0" Mar 17 00:51:02 crc kubenswrapper[4755]: I0317 00:51:02.248821 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:51:02 crc kubenswrapper[4755]: E0317 00:51:02.249524 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:51:16 crc kubenswrapper[4755]: I0317 00:51:16.263083 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:51:16 crc kubenswrapper[4755]: E0317 00:51:16.264045 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:51:27 crc 
kubenswrapper[4755]: I0317 00:51:27.416257 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-5b9b5bb667-6pk7q" podUID="cfa93106-8e0c-4e7d-93cf-33d06c85d883" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 17 00:51:31 crc kubenswrapper[4755]: I0317 00:51:31.249275 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:51:31 crc kubenswrapper[4755]: E0317 00:51:31.250233 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:51:45 crc kubenswrapper[4755]: I0317 00:51:45.248521 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:51:45 crc kubenswrapper[4755]: E0317 00:51:45.249389 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:51:55 crc kubenswrapper[4755]: I0317 00:51:55.795973 4755 scope.go:117] "RemoveContainer" containerID="b776fe73d3b89f4bf8c97dbf6f45d851c37770fd56afbb03d8cca66eaa9c98c7" Mar 17 00:51:55 crc kubenswrapper[4755]: I0317 00:51:55.850942 4755 scope.go:117] "RemoveContainer" containerID="8c3d05915cb3297973724d78022ac4a3f5a664a2aa08273ed0be14ccdf79d710" Mar 17 
00:51:55 crc kubenswrapper[4755]: I0317 00:51:55.882817 4755 scope.go:117] "RemoveContainer" containerID="4a9841768d1c6c113d14885c2406467446cfa67fad227c72bd7b8429c681a5a0" Mar 17 00:51:55 crc kubenswrapper[4755]: I0317 00:51:55.958705 4755 scope.go:117] "RemoveContainer" containerID="74f0fc9e6a3128782b59a3edd43d8e081bcc3e5e9b9505e3efc8d69d114df5a5" Mar 17 00:51:55 crc kubenswrapper[4755]: I0317 00:51:55.982716 4755 scope.go:117] "RemoveContainer" containerID="3264ac7df8ac7f589b0f5ce9b34ff8e33cc1581c4a5ec895969c1227763c0a32" Mar 17 00:51:56 crc kubenswrapper[4755]: I0317 00:51:56.009364 4755 scope.go:117] "RemoveContainer" containerID="f0631b380191fd12a56a5aafc1976c097929edd1d24bc1fda940c77bb0b04afe" Mar 17 00:51:56 crc kubenswrapper[4755]: I0317 00:51:56.050760 4755 scope.go:117] "RemoveContainer" containerID="a6a20c3ca1008f78e35d0fdc421742a892627f6b4b6963dcfbe9e6abbefbd8cf" Mar 17 00:51:56 crc kubenswrapper[4755]: I0317 00:51:56.256344 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:51:56 crc kubenswrapper[4755]: E0317 00:51:56.257004 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:52:00 crc kubenswrapper[4755]: I0317 00:52:00.160505 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561812-fx979"] Mar 17 00:52:00 crc kubenswrapper[4755]: I0317 00:52:00.163021 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561812-fx979" Mar 17 00:52:00 crc kubenswrapper[4755]: I0317 00:52:00.180090 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561812-fx979"] Mar 17 00:52:00 crc kubenswrapper[4755]: I0317 00:52:00.214236 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bwg6\" (UniqueName: \"kubernetes.io/projected/5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3-kube-api-access-5bwg6\") pod \"auto-csr-approver-29561812-fx979\" (UID: \"5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3\") " pod="openshift-infra/auto-csr-approver-29561812-fx979" Mar 17 00:52:00 crc kubenswrapper[4755]: I0317 00:52:00.224888 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 00:52:00 crc kubenswrapper[4755]: I0317 00:52:00.225356 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 00:52:00 crc kubenswrapper[4755]: I0317 00:52:00.225475 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 00:52:00 crc kubenswrapper[4755]: I0317 00:52:00.317481 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bwg6\" (UniqueName: \"kubernetes.io/projected/5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3-kube-api-access-5bwg6\") pod \"auto-csr-approver-29561812-fx979\" (UID: \"5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3\") " pod="openshift-infra/auto-csr-approver-29561812-fx979" Mar 17 00:52:00 crc kubenswrapper[4755]: I0317 00:52:00.341151 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bwg6\" (UniqueName: \"kubernetes.io/projected/5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3-kube-api-access-5bwg6\") pod \"auto-csr-approver-29561812-fx979\" (UID: \"5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3\") " 
pod="openshift-infra/auto-csr-approver-29561812-fx979" Mar 17 00:52:00 crc kubenswrapper[4755]: I0317 00:52:00.555356 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561812-fx979" Mar 17 00:52:01 crc kubenswrapper[4755]: I0317 00:52:01.088091 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561812-fx979"] Mar 17 00:52:02 crc kubenswrapper[4755]: I0317 00:52:02.037273 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561812-fx979" event={"ID":"5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3","Type":"ContainerStarted","Data":"1af8494fd5c8ffd1d4feb8bcfdd43cda9cc2880609f051ac61b332345f5d1316"} Mar 17 00:52:03 crc kubenswrapper[4755]: I0317 00:52:03.052082 4755 generic.go:334] "Generic (PLEG): container finished" podID="5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3" containerID="8e88ed50bc220ab48095cae2a293f1c6ecf6d643fe25dc678eac66a657411f4e" exitCode=0 Mar 17 00:52:03 crc kubenswrapper[4755]: I0317 00:52:03.052251 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561812-fx979" event={"ID":"5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3","Type":"ContainerDied","Data":"8e88ed50bc220ab48095cae2a293f1c6ecf6d643fe25dc678eac66a657411f4e"} Mar 17 00:52:04 crc kubenswrapper[4755]: I0317 00:52:04.530062 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561812-fx979" Mar 17 00:52:04 crc kubenswrapper[4755]: I0317 00:52:04.627735 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bwg6\" (UniqueName: \"kubernetes.io/projected/5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3-kube-api-access-5bwg6\") pod \"5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3\" (UID: \"5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3\") " Mar 17 00:52:04 crc kubenswrapper[4755]: I0317 00:52:04.638789 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3-kube-api-access-5bwg6" (OuterVolumeSpecName: "kube-api-access-5bwg6") pod "5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3" (UID: "5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3"). InnerVolumeSpecName "kube-api-access-5bwg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:52:04 crc kubenswrapper[4755]: I0317 00:52:04.730267 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bwg6\" (UniqueName: \"kubernetes.io/projected/5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3-kube-api-access-5bwg6\") on node \"crc\" DevicePath \"\"" Mar 17 00:52:05 crc kubenswrapper[4755]: I0317 00:52:05.085101 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561812-fx979" event={"ID":"5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3","Type":"ContainerDied","Data":"1af8494fd5c8ffd1d4feb8bcfdd43cda9cc2880609f051ac61b332345f5d1316"} Mar 17 00:52:05 crc kubenswrapper[4755]: I0317 00:52:05.085180 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561812-fx979" Mar 17 00:52:05 crc kubenswrapper[4755]: I0317 00:52:05.085480 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1af8494fd5c8ffd1d4feb8bcfdd43cda9cc2880609f051ac61b332345f5d1316" Mar 17 00:52:05 crc kubenswrapper[4755]: I0317 00:52:05.631229 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561806-v78jd"] Mar 17 00:52:05 crc kubenswrapper[4755]: I0317 00:52:05.649898 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561806-v78jd"] Mar 17 00:52:06 crc kubenswrapper[4755]: I0317 00:52:06.270300 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f4650d5-3d6d-4437-9b7b-f585de970b8f" path="/var/lib/kubelet/pods/7f4650d5-3d6d-4437-9b7b-f585de970b8f/volumes" Mar 17 00:52:11 crc kubenswrapper[4755]: I0317 00:52:11.249588 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:52:11 crc kubenswrapper[4755]: E0317 00:52:11.250595 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:52:24 crc kubenswrapper[4755]: I0317 00:52:24.248132 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:52:24 crc kubenswrapper[4755]: E0317 00:52:24.249826 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:52:36 crc kubenswrapper[4755]: I0317 00:52:36.263913 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:52:36 crc kubenswrapper[4755]: E0317 00:52:36.265334 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:52:51 crc kubenswrapper[4755]: I0317 00:52:51.249242 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:52:51 crc kubenswrapper[4755]: E0317 00:52:51.250495 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:52:56 crc kubenswrapper[4755]: I0317 00:52:56.213197 4755 scope.go:117] "RemoveContainer" containerID="7e826332cbade48c3ab2ce25963b6b5a1fcbc88a4b180b4d18347b21e7c7470f" Mar 17 00:52:56 crc kubenswrapper[4755]: I0317 00:52:56.240711 4755 scope.go:117] "RemoveContainer" containerID="16c06b27ecad5ac2820ea4e275e020668e8365709c4d65eb8d6f32a69df45692" Mar 17 00:52:56 crc kubenswrapper[4755]: 
I0317 00:52:56.268673 4755 scope.go:117] "RemoveContainer" containerID="ba22763049125683ba70ae7ef7ea3db4980fbebcaf6172268c62d7c15428a34b" Mar 17 00:52:56 crc kubenswrapper[4755]: I0317 00:52:56.292568 4755 scope.go:117] "RemoveContainer" containerID="40b22a1015e98904677fbabe7e3e16b1a5b5c36f7445375b29470cb8aae12ffa" Mar 17 00:52:56 crc kubenswrapper[4755]: I0317 00:52:56.331900 4755 scope.go:117] "RemoveContainer" containerID="d21d7b76a7dfecf9494c4daf129fe0efe105351497e68db989487e56779ff6ac" Mar 17 00:52:56 crc kubenswrapper[4755]: I0317 00:52:56.396567 4755 scope.go:117] "RemoveContainer" containerID="c51a8662f7f952dd2bd18d497e3f4cf596be3c90d7089354b558373238542cc9" Mar 17 00:53:03 crc kubenswrapper[4755]: I0317 00:53:03.248695 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:53:03 crc kubenswrapper[4755]: E0317 00:53:03.249800 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:53:15 crc kubenswrapper[4755]: I0317 00:53:15.248824 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:53:15 crc kubenswrapper[4755]: E0317 00:53:15.249919 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" 
podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:53:28 crc kubenswrapper[4755]: I0317 00:53:28.249131 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:53:28 crc kubenswrapper[4755]: E0317 00:53:28.250214 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:53:31 crc kubenswrapper[4755]: I0317 00:53:31.284103 4755 generic.go:334] "Generic (PLEG): container finished" podID="93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9" containerID="41f7a57d57a0d6adecf75db40817a16821640e3614f4d1b0e21ad7474962c0c0" exitCode=0 Mar 17 00:53:31 crc kubenswrapper[4755]: I0317 00:53:31.284224 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" event={"ID":"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9","Type":"ContainerDied","Data":"41f7a57d57a0d6adecf75db40817a16821640e3614f4d1b0e21ad7474962c0c0"} Mar 17 00:53:32 crc kubenswrapper[4755]: I0317 00:53:32.849735 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" Mar 17 00:53:32 crc kubenswrapper[4755]: I0317 00:53:32.893015 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-bootstrap-combined-ca-bundle\") pod \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\" (UID: \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\") " Mar 17 00:53:32 crc kubenswrapper[4755]: I0317 00:53:32.893280 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcw85\" (UniqueName: \"kubernetes.io/projected/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-kube-api-access-xcw85\") pod \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\" (UID: \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\") " Mar 17 00:53:32 crc kubenswrapper[4755]: I0317 00:53:32.893334 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-ssh-key-openstack-edpm-ipam\") pod \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\" (UID: \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\") " Mar 17 00:53:32 crc kubenswrapper[4755]: I0317 00:53:32.893371 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-inventory\") pod \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\" (UID: \"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9\") " Mar 17 00:53:32 crc kubenswrapper[4755]: I0317 00:53:32.904061 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-kube-api-access-xcw85" (OuterVolumeSpecName: "kube-api-access-xcw85") pod "93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9" (UID: "93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9"). InnerVolumeSpecName "kube-api-access-xcw85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:53:32 crc kubenswrapper[4755]: I0317 00:53:32.908564 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9" (UID: "93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:53:32 crc kubenswrapper[4755]: I0317 00:53:32.937507 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-inventory" (OuterVolumeSpecName: "inventory") pod "93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9" (UID: "93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:53:32 crc kubenswrapper[4755]: I0317 00:53:32.944769 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9" (UID: "93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:53:32 crc kubenswrapper[4755]: I0317 00:53:32.996486 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcw85\" (UniqueName: \"kubernetes.io/projected/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-kube-api-access-xcw85\") on node \"crc\" DevicePath \"\"" Mar 17 00:53:32 crc kubenswrapper[4755]: I0317 00:53:32.996523 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 00:53:32 crc kubenswrapper[4755]: I0317 00:53:32.996539 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 00:53:32 crc kubenswrapper[4755]: I0317 00:53:32.996553 4755 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.311553 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" event={"ID":"93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9","Type":"ContainerDied","Data":"bacad24bad05d9c6dfc13a26a982d52776ce4435738bc2fd8f26e5f5d5bae8f9"} Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.311614 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bacad24bad05d9c6dfc13a26a982d52776ce4435738bc2fd8f26e5f5d5bae8f9" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.311705 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.434641 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz"] Mar 17 00:53:33 crc kubenswrapper[4755]: E0317 00:53:33.435256 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.435280 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 17 00:53:33 crc kubenswrapper[4755]: E0317 00:53:33.435299 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3" containerName="oc" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.435309 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3" containerName="oc" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.435596 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.435629 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3" containerName="oc" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.436453 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.441026 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.441615 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.441782 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.441932 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.448237 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz"] Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.507517 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e40f739-2496-4cd1-9d10-ecc61d250a1f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-49bxz\" (UID: \"4e40f739-2496-4cd1-9d10-ecc61d250a1f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.507636 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhqxs\" (UniqueName: \"kubernetes.io/projected/4e40f739-2496-4cd1-9d10-ecc61d250a1f-kube-api-access-vhqxs\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-49bxz\" (UID: \"4e40f739-2496-4cd1-9d10-ecc61d250a1f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" Mar 17 00:53:33 crc kubenswrapper[4755]: 
I0317 00:53:33.507797 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4e40f739-2496-4cd1-9d10-ecc61d250a1f-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-49bxz\" (UID: \"4e40f739-2496-4cd1-9d10-ecc61d250a1f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.609140 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhqxs\" (UniqueName: \"kubernetes.io/projected/4e40f739-2496-4cd1-9d10-ecc61d250a1f-kube-api-access-vhqxs\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-49bxz\" (UID: \"4e40f739-2496-4cd1-9d10-ecc61d250a1f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.609328 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4e40f739-2496-4cd1-9d10-ecc61d250a1f-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-49bxz\" (UID: \"4e40f739-2496-4cd1-9d10-ecc61d250a1f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.609420 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e40f739-2496-4cd1-9d10-ecc61d250a1f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-49bxz\" (UID: \"4e40f739-2496-4cd1-9d10-ecc61d250a1f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.612975 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/4e40f739-2496-4cd1-9d10-ecc61d250a1f-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-49bxz\" (UID: \"4e40f739-2496-4cd1-9d10-ecc61d250a1f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.617979 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e40f739-2496-4cd1-9d10-ecc61d250a1f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-49bxz\" (UID: \"4e40f739-2496-4cd1-9d10-ecc61d250a1f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.627259 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhqxs\" (UniqueName: \"kubernetes.io/projected/4e40f739-2496-4cd1-9d10-ecc61d250a1f-kube-api-access-vhqxs\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-49bxz\" (UID: \"4e40f739-2496-4cd1-9d10-ecc61d250a1f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" Mar 17 00:53:33 crc kubenswrapper[4755]: I0317 00:53:33.774175 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" Mar 17 00:53:34 crc kubenswrapper[4755]: I0317 00:53:34.319246 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz"] Mar 17 00:53:35 crc kubenswrapper[4755]: I0317 00:53:35.350120 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" event={"ID":"4e40f739-2496-4cd1-9d10-ecc61d250a1f","Type":"ContainerStarted","Data":"d782e16dd6d27e698e8123e6f52a7702660886aaec567ba7f16b095fc6615008"} Mar 17 00:53:35 crc kubenswrapper[4755]: I0317 00:53:35.351191 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" event={"ID":"4e40f739-2496-4cd1-9d10-ecc61d250a1f","Type":"ContainerStarted","Data":"3a361d60af84f8850f30362cb070ad6ae3028fb2a4afa15e406537b42d76ffc5"} Mar 17 00:53:35 crc kubenswrapper[4755]: I0317 00:53:35.373412 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" podStartSLOduration=1.8360452390000002 podStartE2EDuration="2.373391875s" podCreationTimestamp="2026-03-17 00:53:33 +0000 UTC" firstStartedPulling="2026-03-17 00:53:34.324426422 +0000 UTC m=+1889.083878745" lastFinishedPulling="2026-03-17 00:53:34.861773058 +0000 UTC m=+1889.621225381" observedRunningTime="2026-03-17 00:53:35.365982733 +0000 UTC m=+1890.125435036" watchObservedRunningTime="2026-03-17 00:53:35.373391875 +0000 UTC m=+1890.132844158" Mar 17 00:53:38 crc kubenswrapper[4755]: I0317 00:53:38.048478 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-374e-account-create-update-kv6bv"] Mar 17 00:53:38 crc kubenswrapper[4755]: I0317 00:53:38.061877 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-374e-account-create-update-kv6bv"] Mar 17 00:53:38 crc kubenswrapper[4755]: I0317 00:53:38.261903 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72d32524-c98c-4d9e-abbf-3231c1b18e44" path="/var/lib/kubelet/pods/72d32524-c98c-4d9e-abbf-3231c1b18e44/volumes" Mar 17 00:53:40 crc kubenswrapper[4755]: I0317 00:53:40.062869 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-nc5wg"] Mar 17 00:53:40 crc kubenswrapper[4755]: I0317 00:53:40.076963 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-9vkmp"] Mar 17 00:53:40 crc kubenswrapper[4755]: I0317 00:53:40.093338 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-ckpfn"] Mar 17 00:53:40 crc kubenswrapper[4755]: I0317 00:53:40.103258 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-nc5wg"] Mar 17 00:53:40 crc kubenswrapper[4755]: I0317 00:53:40.113113 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5f0e-account-create-update-pfs54"] Mar 17 00:53:40 crc kubenswrapper[4755]: I0317 00:53:40.124229 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-jmh2r"] Mar 17 00:53:40 crc kubenswrapper[4755]: I0317 00:53:40.136871 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-ckpfn"] Mar 17 00:53:40 crc kubenswrapper[4755]: I0317 00:53:40.148223 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-9vkmp"] Mar 17 00:53:40 crc kubenswrapper[4755]: I0317 00:53:40.159656 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5f0e-account-create-update-pfs54"] Mar 17 00:53:40 crc kubenswrapper[4755]: I0317 00:53:40.171366 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-jmh2r"] Mar 17 00:53:40 crc 
kubenswrapper[4755]: I0317 00:53:40.248412 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:53:40 crc kubenswrapper[4755]: E0317 00:53:40.248678 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:53:40 crc kubenswrapper[4755]: I0317 00:53:40.258499 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22aa45d9-d0a3-4dad-98a9-293f6c396229" path="/var/lib/kubelet/pods/22aa45d9-d0a3-4dad-98a9-293f6c396229/volumes" Mar 17 00:53:40 crc kubenswrapper[4755]: I0317 00:53:40.259047 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="554c2862-dfb9-4910-9d14-3fed242964ed" path="/var/lib/kubelet/pods/554c2862-dfb9-4910-9d14-3fed242964ed/volumes" Mar 17 00:53:40 crc kubenswrapper[4755]: I0317 00:53:40.259582 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98b1d827-2b15-4213-b59a-39e3ac08b962" path="/var/lib/kubelet/pods/98b1d827-2b15-4213-b59a-39e3ac08b962/volumes" Mar 17 00:53:40 crc kubenswrapper[4755]: I0317 00:53:40.260075 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e769064f-9469-4183-bd71-52ed11230e0e" path="/var/lib/kubelet/pods/e769064f-9469-4183-bd71-52ed11230e0e/volumes" Mar 17 00:53:40 crc kubenswrapper[4755]: I0317 00:53:40.261086 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec87d4bf-d241-4b94-b3f7-0f006e4ceb87" path="/var/lib/kubelet/pods/ec87d4bf-d241-4b94-b3f7-0f006e4ceb87/volumes" Mar 17 00:53:41 crc kubenswrapper[4755]: I0317 00:53:41.051317 4755 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/mysqld-exporter-e09f-account-create-update-ccmws"] Mar 17 00:53:41 crc kubenswrapper[4755]: I0317 00:53:41.067148 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-552a-account-create-update-ljkfk"] Mar 17 00:53:41 crc kubenswrapper[4755]: I0317 00:53:41.080971 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-552a-account-create-update-ljkfk"] Mar 17 00:53:41 crc kubenswrapper[4755]: I0317 00:53:41.091417 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-e09f-account-create-update-ccmws"] Mar 17 00:53:42 crc kubenswrapper[4755]: I0317 00:53:42.263020 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5efb20b9-cf6f-4c8e-9afc-92d6713630f2" path="/var/lib/kubelet/pods/5efb20b9-cf6f-4c8e-9afc-92d6713630f2/volumes" Mar 17 00:53:42 crc kubenswrapper[4755]: I0317 00:53:42.264170 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9964cd2d-4d04-4954-8ba1-0379d75a932f" path="/var/lib/kubelet/pods/9964cd2d-4d04-4954-8ba1-0379d75a932f/volumes" Mar 17 00:53:50 crc kubenswrapper[4755]: I0317 00:53:50.049139 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-07d1-account-create-update-bghhg"] Mar 17 00:53:50 crc kubenswrapper[4755]: I0317 00:53:50.068934 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9"] Mar 17 00:53:50 crc kubenswrapper[4755]: I0317 00:53:50.086826 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-hfnb9"] Mar 17 00:53:50 crc kubenswrapper[4755]: I0317 00:53:50.099363 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-07d1-account-create-update-bghhg"] Mar 17 00:53:50 crc kubenswrapper[4755]: I0317 00:53:50.288691 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b1636405-ed65-4deb-81e5-843ae69311f4" path="/var/lib/kubelet/pods/b1636405-ed65-4deb-81e5-843ae69311f4/volumes" Mar 17 00:53:50 crc kubenswrapper[4755]: I0317 00:53:50.293129 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae42150-0ad5-40f3-8ede-bd064e8284dc" path="/var/lib/kubelet/pods/cae42150-0ad5-40f3-8ede-bd064e8284dc/volumes" Mar 17 00:53:54 crc kubenswrapper[4755]: I0317 00:53:54.249809 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:53:54 crc kubenswrapper[4755]: E0317 00:53:54.250646 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:53:56 crc kubenswrapper[4755]: I0317 00:53:56.498201 4755 scope.go:117] "RemoveContainer" containerID="447451244cf67b8df101bd402f87b44e3b32be6df9e8fb35f9cf4361b50e7aec" Mar 17 00:53:56 crc kubenswrapper[4755]: I0317 00:53:56.557166 4755 scope.go:117] "RemoveContainer" containerID="ca6211fd25c7f6916638a97f101388596e4398c0641395f83d31cede23df832b" Mar 17 00:53:56 crc kubenswrapper[4755]: I0317 00:53:56.621538 4755 scope.go:117] "RemoveContainer" containerID="210c2f9b17f445ac8e69afc50d3b3edcc23f2f340b5905a0693cc2402895429d" Mar 17 00:53:56 crc kubenswrapper[4755]: I0317 00:53:56.681204 4755 scope.go:117] "RemoveContainer" containerID="e1894619324db4479d7e5db33bb1c8567bf7ef010167b45fd137b5f1b7392cd9" Mar 17 00:53:56 crc kubenswrapper[4755]: I0317 00:53:56.737847 4755 scope.go:117] "RemoveContainer" containerID="6d2f54b6949067d32222d9ad3ad0482fde08dc5f6ce7705b3555da19ad58f3bd" Mar 17 00:53:56 crc kubenswrapper[4755]: I0317 00:53:56.789915 
4755 scope.go:117] "RemoveContainer" containerID="744e7e2542f4159e1f0372413e155f63f4c426dd6d8f012bf4e573d0d8b972a9" Mar 17 00:53:56 crc kubenswrapper[4755]: I0317 00:53:56.814755 4755 scope.go:117] "RemoveContainer" containerID="dbc2372936211dd27bc5d87c1ad5c8d83bd63bca9fb57ba2f17291d9b27b04f5" Mar 17 00:53:56 crc kubenswrapper[4755]: I0317 00:53:56.861962 4755 scope.go:117] "RemoveContainer" containerID="c2711f1830278f18151dcf972b04eff00d4147e39fb8fc88ab04045e1b4f3e53" Mar 17 00:53:56 crc kubenswrapper[4755]: I0317 00:53:56.887036 4755 scope.go:117] "RemoveContainer" containerID="6b581c8d4169dd453c0fc861dae478b6bd95bcb61d9efc606ed88f70587aeb3c" Mar 17 00:53:56 crc kubenswrapper[4755]: I0317 00:53:56.904451 4755 scope.go:117] "RemoveContainer" containerID="3a30bad658ae849d03ce5cd0f2be34ef0661e0b1e2c81fff2437ef2eaa585e4b" Mar 17 00:53:56 crc kubenswrapper[4755]: I0317 00:53:56.925840 4755 scope.go:117] "RemoveContainer" containerID="b182061aca0aa0d8b49a809739b0906c20bb7461a84b8cb997cdb448123583d5" Mar 17 00:53:56 crc kubenswrapper[4755]: I0317 00:53:56.949931 4755 scope.go:117] "RemoveContainer" containerID="1515bcf36bcff06f3fff0db52c2d153113ff3217ef838de25508ab97cf56f9e0" Mar 17 00:53:56 crc kubenswrapper[4755]: I0317 00:53:56.980863 4755 scope.go:117] "RemoveContainer" containerID="89ff4548117393a9f67860b9151f3087f5dff8b28c23b018aa6cdfc267530f05" Mar 17 00:53:57 crc kubenswrapper[4755]: I0317 00:53:57.007182 4755 scope.go:117] "RemoveContainer" containerID="c6565263fbbcb465c616c5d6ae268234c0d18461a554fa7f0c3360d656b9b080" Mar 17 00:53:57 crc kubenswrapper[4755]: I0317 00:53:57.037530 4755 scope.go:117] "RemoveContainer" containerID="2ced09580f45a693b419eeafb685780d8f4153cc4b84ea27eef328853d9ee1f0" Mar 17 00:53:57 crc kubenswrapper[4755]: I0317 00:53:57.071318 4755 scope.go:117] "RemoveContainer" containerID="6201013dccb2192de443a73d8b09da2cee86976b26dbd98901fbab2bf95b5729" Mar 17 00:53:57 crc kubenswrapper[4755]: I0317 00:53:57.110837 4755 scope.go:117] 
"RemoveContainer" containerID="4c519ab6fc149df41954fce5d2a54b0b893ca88bb38c491148dc4eb0f6d8b0e6" Mar 17 00:54:00 crc kubenswrapper[4755]: I0317 00:54:00.192790 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561814-v79pm"] Mar 17 00:54:00 crc kubenswrapper[4755]: I0317 00:54:00.194837 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561814-v79pm" Mar 17 00:54:00 crc kubenswrapper[4755]: I0317 00:54:00.196688 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 00:54:00 crc kubenswrapper[4755]: I0317 00:54:00.197541 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 00:54:00 crc kubenswrapper[4755]: I0317 00:54:00.203025 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561814-v79pm"] Mar 17 00:54:00 crc kubenswrapper[4755]: I0317 00:54:00.230606 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 00:54:00 crc kubenswrapper[4755]: I0317 00:54:00.279452 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkp7z\" (UniqueName: \"kubernetes.io/projected/9477c448-33bb-4e17-8f19-098bc4e134e4-kube-api-access-zkp7z\") pod \"auto-csr-approver-29561814-v79pm\" (UID: \"9477c448-33bb-4e17-8f19-098bc4e134e4\") " pod="openshift-infra/auto-csr-approver-29561814-v79pm" Mar 17 00:54:00 crc kubenswrapper[4755]: I0317 00:54:00.381672 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkp7z\" (UniqueName: \"kubernetes.io/projected/9477c448-33bb-4e17-8f19-098bc4e134e4-kube-api-access-zkp7z\") pod \"auto-csr-approver-29561814-v79pm\" (UID: \"9477c448-33bb-4e17-8f19-098bc4e134e4\") " 
pod="openshift-infra/auto-csr-approver-29561814-v79pm" Mar 17 00:54:00 crc kubenswrapper[4755]: I0317 00:54:00.412829 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkp7z\" (UniqueName: \"kubernetes.io/projected/9477c448-33bb-4e17-8f19-098bc4e134e4-kube-api-access-zkp7z\") pod \"auto-csr-approver-29561814-v79pm\" (UID: \"9477c448-33bb-4e17-8f19-098bc4e134e4\") " pod="openshift-infra/auto-csr-approver-29561814-v79pm" Mar 17 00:54:00 crc kubenswrapper[4755]: I0317 00:54:00.555151 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561814-v79pm" Mar 17 00:54:01 crc kubenswrapper[4755]: I0317 00:54:01.055766 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561814-v79pm"] Mar 17 00:54:01 crc kubenswrapper[4755]: I0317 00:54:01.720907 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561814-v79pm" event={"ID":"9477c448-33bb-4e17-8f19-098bc4e134e4","Type":"ContainerStarted","Data":"7534972ddbebf7ad900b0843e6b8aacb02ae67174943adf8c95db56d39352e43"} Mar 17 00:54:03 crc kubenswrapper[4755]: I0317 00:54:03.751292 4755 generic.go:334] "Generic (PLEG): container finished" podID="9477c448-33bb-4e17-8f19-098bc4e134e4" containerID="d71b4e9571c10e4c34ea754155f6f4cc99d55b9dfd5691b348f19d0356ad3102" exitCode=0 Mar 17 00:54:03 crc kubenswrapper[4755]: I0317 00:54:03.751404 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561814-v79pm" event={"ID":"9477c448-33bb-4e17-8f19-098bc4e134e4","Type":"ContainerDied","Data":"d71b4e9571c10e4c34ea754155f6f4cc99d55b9dfd5691b348f19d0356ad3102"} Mar 17 00:54:05 crc kubenswrapper[4755]: I0317 00:54:05.205039 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561814-v79pm" Mar 17 00:54:05 crc kubenswrapper[4755]: I0317 00:54:05.249049 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:54:05 crc kubenswrapper[4755]: E0317 00:54:05.249265 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:54:05 crc kubenswrapper[4755]: I0317 00:54:05.297189 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkp7z\" (UniqueName: \"kubernetes.io/projected/9477c448-33bb-4e17-8f19-098bc4e134e4-kube-api-access-zkp7z\") pod \"9477c448-33bb-4e17-8f19-098bc4e134e4\" (UID: \"9477c448-33bb-4e17-8f19-098bc4e134e4\") " Mar 17 00:54:05 crc kubenswrapper[4755]: I0317 00:54:05.303275 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9477c448-33bb-4e17-8f19-098bc4e134e4-kube-api-access-zkp7z" (OuterVolumeSpecName: "kube-api-access-zkp7z") pod "9477c448-33bb-4e17-8f19-098bc4e134e4" (UID: "9477c448-33bb-4e17-8f19-098bc4e134e4"). InnerVolumeSpecName "kube-api-access-zkp7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:54:05 crc kubenswrapper[4755]: I0317 00:54:05.400134 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkp7z\" (UniqueName: \"kubernetes.io/projected/9477c448-33bb-4e17-8f19-098bc4e134e4-kube-api-access-zkp7z\") on node \"crc\" DevicePath \"\"" Mar 17 00:54:05 crc kubenswrapper[4755]: I0317 00:54:05.771970 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561814-v79pm" event={"ID":"9477c448-33bb-4e17-8f19-098bc4e134e4","Type":"ContainerDied","Data":"7534972ddbebf7ad900b0843e6b8aacb02ae67174943adf8c95db56d39352e43"} Mar 17 00:54:05 crc kubenswrapper[4755]: I0317 00:54:05.772023 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7534972ddbebf7ad900b0843e6b8aacb02ae67174943adf8c95db56d39352e43" Mar 17 00:54:05 crc kubenswrapper[4755]: I0317 00:54:05.772023 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561814-v79pm" Mar 17 00:54:06 crc kubenswrapper[4755]: I0317 00:54:06.288485 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561808-s48qq"] Mar 17 00:54:06 crc kubenswrapper[4755]: I0317 00:54:06.303241 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561808-s48qq"] Mar 17 00:54:08 crc kubenswrapper[4755]: I0317 00:54:08.267086 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12958939-163c-488e-9297-548add9a591b" path="/var/lib/kubelet/pods/12958939-163c-488e-9297-548add9a591b/volumes" Mar 17 00:54:12 crc kubenswrapper[4755]: I0317 00:54:12.082331 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-d1ca-account-create-update-wkcbb"] Mar 17 00:54:12 crc kubenswrapper[4755]: I0317 00:54:12.109637 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-db-create-28cfm"] Mar 17 00:54:12 crc kubenswrapper[4755]: I0317 00:54:12.124304 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-e661-account-create-update-kxqpp"] Mar 17 00:54:12 crc kubenswrapper[4755]: I0317 00:54:12.134668 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-28cfm"] Mar 17 00:54:12 crc kubenswrapper[4755]: I0317 00:54:12.150817 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-d1ca-account-create-update-wkcbb"] Mar 17 00:54:12 crc kubenswrapper[4755]: I0317 00:54:12.166414 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-e661-account-create-update-kxqpp"] Mar 17 00:54:12 crc kubenswrapper[4755]: I0317 00:54:12.262786 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a5673ab-42bb-4268-a723-b2df9c13904b" path="/var/lib/kubelet/pods/2a5673ab-42bb-4268-a723-b2df9c13904b/volumes" Mar 17 00:54:12 crc kubenswrapper[4755]: I0317 00:54:12.263507 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="722c192f-3110-4799-a25c-def078351bbc" path="/var/lib/kubelet/pods/722c192f-3110-4799-a25c-def078351bbc/volumes" Mar 17 00:54:12 crc kubenswrapper[4755]: I0317 00:54:12.264247 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f02541-9d51-424d-b558-15bb417ad5b2" path="/var/lib/kubelet/pods/e5f02541-9d51-424d-b558-15bb417ad5b2/volumes" Mar 17 00:54:13 crc kubenswrapper[4755]: I0317 00:54:13.047018 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-nn54s"] Mar 17 00:54:13 crc kubenswrapper[4755]: I0317 00:54:13.063536 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-w7h7z"] Mar 17 00:54:13 crc kubenswrapper[4755]: I0317 00:54:13.078539 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-nn54s"] Mar 17 00:54:13 crc kubenswrapper[4755]: I0317 
00:54:13.096481 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-qsd9p"] Mar 17 00:54:13 crc kubenswrapper[4755]: I0317 00:54:13.115372 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-35d1-account-create-update-tbps2"] Mar 17 00:54:13 crc kubenswrapper[4755]: I0317 00:54:13.131483 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4234-account-create-update-czqrb"] Mar 17 00:54:13 crc kubenswrapper[4755]: I0317 00:54:13.141471 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-w7h7z"] Mar 17 00:54:13 crc kubenswrapper[4755]: I0317 00:54:13.147528 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-35d1-account-create-update-tbps2"] Mar 17 00:54:13 crc kubenswrapper[4755]: I0317 00:54:13.156056 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-qsd9p"] Mar 17 00:54:13 crc kubenswrapper[4755]: I0317 00:54:13.165484 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4234-account-create-update-czqrb"] Mar 17 00:54:13 crc kubenswrapper[4755]: I0317 00:54:13.174501 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-znbjn"] Mar 17 00:54:13 crc kubenswrapper[4755]: I0317 00:54:13.183004 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-znbjn"] Mar 17 00:54:14 crc kubenswrapper[4755]: I0317 00:54:14.269224 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12fba3ab-a03b-40ab-8ed5-ce2b667003da" path="/var/lib/kubelet/pods/12fba3ab-a03b-40ab-8ed5-ce2b667003da/volumes" Mar 17 00:54:14 crc kubenswrapper[4755]: I0317 00:54:14.270981 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa0451d-3211-4eb1-86ab-ca7a573632fc" path="/var/lib/kubelet/pods/1aa0451d-3211-4eb1-86ab-ca7a573632fc/volumes" Mar 17 00:54:14 crc 
kubenswrapper[4755]: I0317 00:54:14.272136 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48efdbc2-4211-40f2-8f38-c7e2199852ba" path="/var/lib/kubelet/pods/48efdbc2-4211-40f2-8f38-c7e2199852ba/volumes" Mar 17 00:54:14 crc kubenswrapper[4755]: I0317 00:54:14.273620 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a46d626-29c2-42a4-88a0-e01284c086fa" path="/var/lib/kubelet/pods/7a46d626-29c2-42a4-88a0-e01284c086fa/volumes" Mar 17 00:54:14 crc kubenswrapper[4755]: I0317 00:54:14.275964 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab11e25f-07a9-431a-bdbd-bafb6d673e5c" path="/var/lib/kubelet/pods/ab11e25f-07a9-431a-bdbd-bafb6d673e5c/volumes" Mar 17 00:54:14 crc kubenswrapper[4755]: I0317 00:54:14.277516 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f10d407a-c50c-4f3e-955a-92b2f75d2fd6" path="/var/lib/kubelet/pods/f10d407a-c50c-4f3e-955a-92b2f75d2fd6/volumes" Mar 17 00:54:15 crc kubenswrapper[4755]: I0317 00:54:15.042965 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-vjbjp"] Mar 17 00:54:15 crc kubenswrapper[4755]: I0317 00:54:15.054060 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-vjbjp"] Mar 17 00:54:16 crc kubenswrapper[4755]: I0317 00:54:16.272563 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0991527c-bb4b-498c-86b3-d303cee4eeb1" path="/var/lib/kubelet/pods/0991527c-bb4b-498c-86b3-d303cee4eeb1/volumes" Mar 17 00:54:17 crc kubenswrapper[4755]: I0317 00:54:17.032206 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-hnc9f"] Mar 17 00:54:17 crc kubenswrapper[4755]: I0317 00:54:17.040241 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-hnc9f"] Mar 17 00:54:18 crc kubenswrapper[4755]: I0317 00:54:18.284291 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ca2dc7f6-f91c-4e3c-a360-a464608fd8ca" path="/var/lib/kubelet/pods/ca2dc7f6-f91c-4e3c-a360-a464608fd8ca/volumes" Mar 17 00:54:20 crc kubenswrapper[4755]: I0317 00:54:20.248149 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:54:20 crc kubenswrapper[4755]: E0317 00:54:20.248738 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:54:34 crc kubenswrapper[4755]: I0317 00:54:34.249232 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:54:34 crc kubenswrapper[4755]: E0317 00:54:34.249975 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:54:44 crc kubenswrapper[4755]: I0317 00:54:44.257054 4755 generic.go:334] "Generic (PLEG): container finished" podID="4e40f739-2496-4cd1-9d10-ecc61d250a1f" containerID="d782e16dd6d27e698e8123e6f52a7702660886aaec567ba7f16b095fc6615008" exitCode=0 Mar 17 00:54:44 crc kubenswrapper[4755]: I0317 00:54:44.260363 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" 
event={"ID":"4e40f739-2496-4cd1-9d10-ecc61d250a1f","Type":"ContainerDied","Data":"d782e16dd6d27e698e8123e6f52a7702660886aaec567ba7f16b095fc6615008"} Mar 17 00:54:45 crc kubenswrapper[4755]: I0317 00:54:45.760784 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" Mar 17 00:54:45 crc kubenswrapper[4755]: I0317 00:54:45.862007 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4e40f739-2496-4cd1-9d10-ecc61d250a1f-ssh-key-openstack-edpm-ipam\") pod \"4e40f739-2496-4cd1-9d10-ecc61d250a1f\" (UID: \"4e40f739-2496-4cd1-9d10-ecc61d250a1f\") " Mar 17 00:54:45 crc kubenswrapper[4755]: I0317 00:54:45.862219 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e40f739-2496-4cd1-9d10-ecc61d250a1f-inventory\") pod \"4e40f739-2496-4cd1-9d10-ecc61d250a1f\" (UID: \"4e40f739-2496-4cd1-9d10-ecc61d250a1f\") " Mar 17 00:54:45 crc kubenswrapper[4755]: I0317 00:54:45.862258 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhqxs\" (UniqueName: \"kubernetes.io/projected/4e40f739-2496-4cd1-9d10-ecc61d250a1f-kube-api-access-vhqxs\") pod \"4e40f739-2496-4cd1-9d10-ecc61d250a1f\" (UID: \"4e40f739-2496-4cd1-9d10-ecc61d250a1f\") " Mar 17 00:54:45 crc kubenswrapper[4755]: I0317 00:54:45.869575 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e40f739-2496-4cd1-9d10-ecc61d250a1f-kube-api-access-vhqxs" (OuterVolumeSpecName: "kube-api-access-vhqxs") pod "4e40f739-2496-4cd1-9d10-ecc61d250a1f" (UID: "4e40f739-2496-4cd1-9d10-ecc61d250a1f"). InnerVolumeSpecName "kube-api-access-vhqxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:54:45 crc kubenswrapper[4755]: I0317 00:54:45.899375 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e40f739-2496-4cd1-9d10-ecc61d250a1f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4e40f739-2496-4cd1-9d10-ecc61d250a1f" (UID: "4e40f739-2496-4cd1-9d10-ecc61d250a1f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:54:45 crc kubenswrapper[4755]: I0317 00:54:45.904923 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e40f739-2496-4cd1-9d10-ecc61d250a1f-inventory" (OuterVolumeSpecName: "inventory") pod "4e40f739-2496-4cd1-9d10-ecc61d250a1f" (UID: "4e40f739-2496-4cd1-9d10-ecc61d250a1f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:54:45 crc kubenswrapper[4755]: I0317 00:54:45.965090 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4e40f739-2496-4cd1-9d10-ecc61d250a1f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 00:54:45 crc kubenswrapper[4755]: I0317 00:54:45.965148 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e40f739-2496-4cd1-9d10-ecc61d250a1f-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 00:54:45 crc kubenswrapper[4755]: I0317 00:54:45.965166 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhqxs\" (UniqueName: \"kubernetes.io/projected/4e40f739-2496-4cd1-9d10-ecc61d250a1f-kube-api-access-vhqxs\") on node \"crc\" DevicePath \"\"" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.287011 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" 
event={"ID":"4e40f739-2496-4cd1-9d10-ecc61d250a1f","Type":"ContainerDied","Data":"3a361d60af84f8850f30362cb070ad6ae3028fb2a4afa15e406537b42d76ffc5"} Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.287050 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a361d60af84f8850f30362cb070ad6ae3028fb2a4afa15e406537b42d76ffc5" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.287073 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.412159 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8"] Mar 17 00:54:46 crc kubenswrapper[4755]: E0317 00:54:46.412844 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e40f739-2496-4cd1-9d10-ecc61d250a1f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.412865 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e40f739-2496-4cd1-9d10-ecc61d250a1f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 17 00:54:46 crc kubenswrapper[4755]: E0317 00:54:46.412885 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9477c448-33bb-4e17-8f19-098bc4e134e4" containerName="oc" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.412893 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9477c448-33bb-4e17-8f19-098bc4e134e4" containerName="oc" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.413167 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e40f739-2496-4cd1-9d10-ecc61d250a1f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.413190 4755 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9477c448-33bb-4e17-8f19-098bc4e134e4" containerName="oc" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.414131 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.417377 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.417608 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.417740 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.417847 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.422282 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8"] Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.477320 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5db8\" (UniqueName: \"kubernetes.io/projected/88469a2b-b923-4a0c-ade2-fe4649316da7-kube-api-access-f5db8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-npjh8\" (UID: \"88469a2b-b923-4a0c-ade2-fe4649316da7\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.477400 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88469a2b-b923-4a0c-ade2-fe4649316da7-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-npjh8\" (UID: \"88469a2b-b923-4a0c-ade2-fe4649316da7\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.477636 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/88469a2b-b923-4a0c-ade2-fe4649316da7-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-npjh8\" (UID: \"88469a2b-b923-4a0c-ade2-fe4649316da7\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.579929 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/88469a2b-b923-4a0c-ade2-fe4649316da7-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-npjh8\" (UID: \"88469a2b-b923-4a0c-ade2-fe4649316da7\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.580044 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5db8\" (UniqueName: \"kubernetes.io/projected/88469a2b-b923-4a0c-ade2-fe4649316da7-kube-api-access-f5db8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-npjh8\" (UID: \"88469a2b-b923-4a0c-ade2-fe4649316da7\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.580075 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88469a2b-b923-4a0c-ade2-fe4649316da7-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-npjh8\" (UID: \"88469a2b-b923-4a0c-ade2-fe4649316da7\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.583591 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88469a2b-b923-4a0c-ade2-fe4649316da7-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-npjh8\" (UID: \"88469a2b-b923-4a0c-ade2-fe4649316da7\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.587749 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/88469a2b-b923-4a0c-ade2-fe4649316da7-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-npjh8\" (UID: \"88469a2b-b923-4a0c-ade2-fe4649316da7\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.606566 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5db8\" (UniqueName: \"kubernetes.io/projected/88469a2b-b923-4a0c-ade2-fe4649316da7-kube-api-access-f5db8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-npjh8\" (UID: \"88469a2b-b923-4a0c-ade2-fe4649316da7\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" Mar 17 00:54:46 crc kubenswrapper[4755]: I0317 00:54:46.776006 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" Mar 17 00:54:47 crc kubenswrapper[4755]: I0317 00:54:47.381986 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8"] Mar 17 00:54:47 crc kubenswrapper[4755]: W0317 00:54:47.392014 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88469a2b_b923_4a0c_ade2_fe4649316da7.slice/crio-810d33aa770eb8585a125877b29a629008f376cd709d5aae02b3ac195c771a68 WatchSource:0}: Error finding container 810d33aa770eb8585a125877b29a629008f376cd709d5aae02b3ac195c771a68: Status 404 returned error can't find the container with id 810d33aa770eb8585a125877b29a629008f376cd709d5aae02b3ac195c771a68 Mar 17 00:54:48 crc kubenswrapper[4755]: I0317 00:54:48.313233 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" event={"ID":"88469a2b-b923-4a0c-ade2-fe4649316da7","Type":"ContainerStarted","Data":"866e087d43d05d588012b8e6376af6b4d5ad3069951f70a635c041b007202924"} Mar 17 00:54:48 crc kubenswrapper[4755]: I0317 00:54:48.313908 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" event={"ID":"88469a2b-b923-4a0c-ade2-fe4649316da7","Type":"ContainerStarted","Data":"810d33aa770eb8585a125877b29a629008f376cd709d5aae02b3ac195c771a68"} Mar 17 00:54:48 crc kubenswrapper[4755]: I0317 00:54:48.348897 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" podStartSLOduration=1.791838776 podStartE2EDuration="2.348867298s" podCreationTimestamp="2026-03-17 00:54:46 +0000 UTC" firstStartedPulling="2026-03-17 00:54:47.394430756 +0000 UTC m=+1962.153883039" lastFinishedPulling="2026-03-17 00:54:47.951459248 +0000 UTC 
m=+1962.710911561" observedRunningTime="2026-03-17 00:54:48.331332081 +0000 UTC m=+1963.090784414" watchObservedRunningTime="2026-03-17 00:54:48.348867298 +0000 UTC m=+1963.108319621" Mar 17 00:54:49 crc kubenswrapper[4755]: I0317 00:54:49.247872 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:54:49 crc kubenswrapper[4755]: E0317 00:54:49.248637 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 00:54:50 crc kubenswrapper[4755]: I0317 00:54:50.052310 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-4xd8r"] Mar 17 00:54:50 crc kubenswrapper[4755]: I0317 00:54:50.069541 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-4xd8r"] Mar 17 00:54:50 crc kubenswrapper[4755]: I0317 00:54:50.263143 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24bf6c90-0673-42ed-b463-d0510425117d" path="/var/lib/kubelet/pods/24bf6c90-0673-42ed-b463-d0510425117d/volumes" Mar 17 00:54:53 crc kubenswrapper[4755]: I0317 00:54:53.389131 4755 generic.go:334] "Generic (PLEG): container finished" podID="88469a2b-b923-4a0c-ade2-fe4649316da7" containerID="866e087d43d05d588012b8e6376af6b4d5ad3069951f70a635c041b007202924" exitCode=0 Mar 17 00:54:53 crc kubenswrapper[4755]: I0317 00:54:53.389220 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" 
event={"ID":"88469a2b-b923-4a0c-ade2-fe4649316da7","Type":"ContainerDied","Data":"866e087d43d05d588012b8e6376af6b4d5ad3069951f70a635c041b007202924"} Mar 17 00:54:54 crc kubenswrapper[4755]: I0317 00:54:54.973759 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.103405 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/88469a2b-b923-4a0c-ade2-fe4649316da7-ssh-key-openstack-edpm-ipam\") pod \"88469a2b-b923-4a0c-ade2-fe4649316da7\" (UID: \"88469a2b-b923-4a0c-ade2-fe4649316da7\") " Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.103827 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5db8\" (UniqueName: \"kubernetes.io/projected/88469a2b-b923-4a0c-ade2-fe4649316da7-kube-api-access-f5db8\") pod \"88469a2b-b923-4a0c-ade2-fe4649316da7\" (UID: \"88469a2b-b923-4a0c-ade2-fe4649316da7\") " Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.103900 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88469a2b-b923-4a0c-ade2-fe4649316da7-inventory\") pod \"88469a2b-b923-4a0c-ade2-fe4649316da7\" (UID: \"88469a2b-b923-4a0c-ade2-fe4649316da7\") " Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.110515 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88469a2b-b923-4a0c-ade2-fe4649316da7-kube-api-access-f5db8" (OuterVolumeSpecName: "kube-api-access-f5db8") pod "88469a2b-b923-4a0c-ade2-fe4649316da7" (UID: "88469a2b-b923-4a0c-ade2-fe4649316da7"). InnerVolumeSpecName "kube-api-access-f5db8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.137183 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88469a2b-b923-4a0c-ade2-fe4649316da7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "88469a2b-b923-4a0c-ade2-fe4649316da7" (UID: "88469a2b-b923-4a0c-ade2-fe4649316da7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.148297 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88469a2b-b923-4a0c-ade2-fe4649316da7-inventory" (OuterVolumeSpecName: "inventory") pod "88469a2b-b923-4a0c-ade2-fe4649316da7" (UID: "88469a2b-b923-4a0c-ade2-fe4649316da7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.208500 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5db8\" (UniqueName: \"kubernetes.io/projected/88469a2b-b923-4a0c-ade2-fe4649316da7-kube-api-access-f5db8\") on node \"crc\" DevicePath \"\"" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.208536 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88469a2b-b923-4a0c-ade2-fe4649316da7-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.208552 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/88469a2b-b923-4a0c-ade2-fe4649316da7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.416159 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" 
event={"ID":"88469a2b-b923-4a0c-ade2-fe4649316da7","Type":"ContainerDied","Data":"810d33aa770eb8585a125877b29a629008f376cd709d5aae02b3ac195c771a68"} Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.416210 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="810d33aa770eb8585a125877b29a629008f376cd709d5aae02b3ac195c771a68" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.416237 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.623027 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l"] Mar 17 00:54:55 crc kubenswrapper[4755]: E0317 00:54:55.624267 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88469a2b-b923-4a0c-ade2-fe4649316da7" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.624306 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="88469a2b-b923-4a0c-ade2-fe4649316da7" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.624762 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="88469a2b-b923-4a0c-ade2-fe4649316da7" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.626474 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.629269 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.631489 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.632383 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.632516 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l"] Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.634325 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.720501 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6294dd29-7825-45f8-a9e2-f72f0b56ead8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2sx2l\" (UID: \"6294dd29-7825-45f8-a9e2-f72f0b56ead8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.720794 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6294dd29-7825-45f8-a9e2-f72f0b56ead8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2sx2l\" (UID: \"6294dd29-7825-45f8-a9e2-f72f0b56ead8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.720860 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9glfs\" (UniqueName: \"kubernetes.io/projected/6294dd29-7825-45f8-a9e2-f72f0b56ead8-kube-api-access-9glfs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2sx2l\" (UID: \"6294dd29-7825-45f8-a9e2-f72f0b56ead8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.823495 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6294dd29-7825-45f8-a9e2-f72f0b56ead8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2sx2l\" (UID: \"6294dd29-7825-45f8-a9e2-f72f0b56ead8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.823682 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6294dd29-7825-45f8-a9e2-f72f0b56ead8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2sx2l\" (UID: \"6294dd29-7825-45f8-a9e2-f72f0b56ead8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.823734 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9glfs\" (UniqueName: \"kubernetes.io/projected/6294dd29-7825-45f8-a9e2-f72f0b56ead8-kube-api-access-9glfs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2sx2l\" (UID: \"6294dd29-7825-45f8-a9e2-f72f0b56ead8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.832180 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6294dd29-7825-45f8-a9e2-f72f0b56ead8-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-2sx2l\" (UID: \"6294dd29-7825-45f8-a9e2-f72f0b56ead8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.843046 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6294dd29-7825-45f8-a9e2-f72f0b56ead8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2sx2l\" (UID: \"6294dd29-7825-45f8-a9e2-f72f0b56ead8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.854556 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9glfs\" (UniqueName: \"kubernetes.io/projected/6294dd29-7825-45f8-a9e2-f72f0b56ead8-kube-api-access-9glfs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2sx2l\" (UID: \"6294dd29-7825-45f8-a9e2-f72f0b56ead8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" Mar 17 00:54:55 crc kubenswrapper[4755]: I0317 00:54:55.947712 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" Mar 17 00:54:56 crc kubenswrapper[4755]: I0317 00:54:56.585880 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l"] Mar 17 00:54:57 crc kubenswrapper[4755]: I0317 00:54:57.439420 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" event={"ID":"6294dd29-7825-45f8-a9e2-f72f0b56ead8","Type":"ContainerStarted","Data":"e72c4be739b5cdaa2bc03a9f8148bdf0a0b9cb52906340109b9e1749c06beedc"} Mar 17 00:54:57 crc kubenswrapper[4755]: I0317 00:54:57.464613 4755 scope.go:117] "RemoveContainer" containerID="48ac9b2cd5f688088f4bf4c0c38a31fb4e48e4d147b948f0e0c6967341c23560" Mar 17 00:54:57 crc kubenswrapper[4755]: I0317 00:54:57.507925 4755 scope.go:117] "RemoveContainer" containerID="8597b5a97df6787d62e4199a6a1b84371d448cef522ecec88bb44f1883acd6cd" Mar 17 00:54:57 crc kubenswrapper[4755]: I0317 00:54:57.577523 4755 scope.go:117] "RemoveContainer" containerID="b3e1540ba80b3dbf83910ead5a6a8ce7235bf895fd62eee767db77e469e91421" Mar 17 00:54:57 crc kubenswrapper[4755]: I0317 00:54:57.603587 4755 scope.go:117] "RemoveContainer" containerID="5c2ca20c1adbcd47f53c61b151002dd6fcfa14ce84372a1e1c2c02eb52aead30" Mar 17 00:54:57 crc kubenswrapper[4755]: I0317 00:54:57.636420 4755 scope.go:117] "RemoveContainer" containerID="d0d0576b3e1aeedb7930827df19c198f8002948195c4b3ee0d07e1908cab2095" Mar 17 00:54:57 crc kubenswrapper[4755]: I0317 00:54:57.665621 4755 scope.go:117] "RemoveContainer" containerID="c76851ae019f38a6ec96b8b057c29ae227371bd82e63fd9f695aa2e7be2291be" Mar 17 00:54:57 crc kubenswrapper[4755]: I0317 00:54:57.685216 4755 scope.go:117] "RemoveContainer" containerID="e841319fb482c895ce7406c93f46dab7d9a57025a6650e3cd6bc3cccf7dfa458" Mar 17 00:54:57 crc kubenswrapper[4755]: I0317 00:54:57.715275 4755 scope.go:117] "RemoveContainer" 
containerID="48e4f39385e61999e762d4bb7949f3dc1469a7a0344c13b46474f658f18e93eb" Mar 17 00:54:57 crc kubenswrapper[4755]: I0317 00:54:57.740047 4755 scope.go:117] "RemoveContainer" containerID="d6bb708814cc468246420939d8e13273cca9282f63af238e2fba0887d65641b9" Mar 17 00:54:57 crc kubenswrapper[4755]: I0317 00:54:57.777706 4755 scope.go:117] "RemoveContainer" containerID="a96f4d3c3591fb53aeeeee5aca7f4fb83c7d2a1d7084b45fc6a7a7e573d11d39" Mar 17 00:54:57 crc kubenswrapper[4755]: I0317 00:54:57.806788 4755 scope.go:117] "RemoveContainer" containerID="88d14a489b5e70cb05fb7d650deb22168ba06563c3275b85289e89aba141bd6a" Mar 17 00:54:57 crc kubenswrapper[4755]: I0317 00:54:57.830345 4755 scope.go:117] "RemoveContainer" containerID="96573cdd374881bd73ad187aad6bb082632929bb78e3a62ee3b389968af611e7" Mar 17 00:54:57 crc kubenswrapper[4755]: I0317 00:54:57.849808 4755 scope.go:117] "RemoveContainer" containerID="8d2decf981321ce435c9a278695392e690d26970478d9859a22c8e7bdb116afd" Mar 17 00:54:58 crc kubenswrapper[4755]: I0317 00:54:58.450308 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" event={"ID":"6294dd29-7825-45f8-a9e2-f72f0b56ead8","Type":"ContainerStarted","Data":"b0836e19b8d58cf1fe73b1c06ab90fdad4a0cd76c6939e5225388f26d90e6398"} Mar 17 00:54:58 crc kubenswrapper[4755]: I0317 00:54:58.468088 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" podStartSLOduration=2.8245248160000003 podStartE2EDuration="3.46806929s" podCreationTimestamp="2026-03-17 00:54:55 +0000 UTC" firstStartedPulling="2026-03-17 00:54:56.594198019 +0000 UTC m=+1971.353650302" lastFinishedPulling="2026-03-17 00:54:57.237742473 +0000 UTC m=+1971.997194776" observedRunningTime="2026-03-17 00:54:58.466800985 +0000 UTC m=+1973.226253268" watchObservedRunningTime="2026-03-17 00:54:58.46806929 +0000 UTC m=+1973.227521583" Mar 17 00:55:01 crc 
kubenswrapper[4755]: I0317 00:55:01.062417 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6jk7m"] Mar 17 00:55:01 crc kubenswrapper[4755]: I0317 00:55:01.075888 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-xksp4"] Mar 17 00:55:01 crc kubenswrapper[4755]: I0317 00:55:01.089776 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-bdnb6"] Mar 17 00:55:01 crc kubenswrapper[4755]: I0317 00:55:01.100986 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-xksp4"] Mar 17 00:55:01 crc kubenswrapper[4755]: I0317 00:55:01.111576 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6jk7m"] Mar 17 00:55:01 crc kubenswrapper[4755]: I0317 00:55:01.119814 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-bdnb6"] Mar 17 00:55:02 crc kubenswrapper[4755]: I0317 00:55:02.266789 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbb8013b-627b-4894-945e-178871516870" path="/var/lib/kubelet/pods/bbb8013b-627b-4894-945e-178871516870/volumes" Mar 17 00:55:02 crc kubenswrapper[4755]: I0317 00:55:02.268874 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5b76ade-1f20-43e7-bca3-cc0c70a05d4f" path="/var/lib/kubelet/pods/c5b76ade-1f20-43e7-bca3-cc0c70a05d4f/volumes" Mar 17 00:55:02 crc kubenswrapper[4755]: I0317 00:55:02.270431 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc260978-d229-43c0-b836-d1bb5a308c48" path="/var/lib/kubelet/pods/cc260978-d229-43c0-b836-d1bb5a308c48/volumes" Mar 17 00:55:04 crc kubenswrapper[4755]: I0317 00:55:04.248962 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:55:04 crc kubenswrapper[4755]: I0317 00:55:04.527978 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"26c70aeb78d7c43cc9f41f95cdbae738bd0d561c87ae8d0e246e95277b78f86b"} Mar 17 00:55:14 crc kubenswrapper[4755]: I0317 00:55:14.053714 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-v7jsc"] Mar 17 00:55:14 crc kubenswrapper[4755]: I0317 00:55:14.067035 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-v7jsc"] Mar 17 00:55:14 crc kubenswrapper[4755]: I0317 00:55:14.265976 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="404d7b5a-9c59-4c63-b3be-740554b83374" path="/var/lib/kubelet/pods/404d7b5a-9c59-4c63-b3be-740554b83374/volumes" Mar 17 00:55:37 crc kubenswrapper[4755]: I0317 00:55:37.954925 4755 generic.go:334] "Generic (PLEG): container finished" podID="6294dd29-7825-45f8-a9e2-f72f0b56ead8" containerID="b0836e19b8d58cf1fe73b1c06ab90fdad4a0cd76c6939e5225388f26d90e6398" exitCode=0 Mar 17 00:55:37 crc kubenswrapper[4755]: I0317 00:55:37.955017 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" event={"ID":"6294dd29-7825-45f8-a9e2-f72f0b56ead8","Type":"ContainerDied","Data":"b0836e19b8d58cf1fe73b1c06ab90fdad4a0cd76c6939e5225388f26d90e6398"} Mar 17 00:55:39 crc kubenswrapper[4755]: I0317 00:55:39.442209 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" Mar 17 00:55:39 crc kubenswrapper[4755]: I0317 00:55:39.542489 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6294dd29-7825-45f8-a9e2-f72f0b56ead8-ssh-key-openstack-edpm-ipam\") pod \"6294dd29-7825-45f8-a9e2-f72f0b56ead8\" (UID: \"6294dd29-7825-45f8-a9e2-f72f0b56ead8\") " Mar 17 00:55:39 crc kubenswrapper[4755]: I0317 00:55:39.542582 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6294dd29-7825-45f8-a9e2-f72f0b56ead8-inventory\") pod \"6294dd29-7825-45f8-a9e2-f72f0b56ead8\" (UID: \"6294dd29-7825-45f8-a9e2-f72f0b56ead8\") " Mar 17 00:55:39 crc kubenswrapper[4755]: I0317 00:55:39.542625 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9glfs\" (UniqueName: \"kubernetes.io/projected/6294dd29-7825-45f8-a9e2-f72f0b56ead8-kube-api-access-9glfs\") pod \"6294dd29-7825-45f8-a9e2-f72f0b56ead8\" (UID: \"6294dd29-7825-45f8-a9e2-f72f0b56ead8\") " Mar 17 00:55:39 crc kubenswrapper[4755]: I0317 00:55:39.547319 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6294dd29-7825-45f8-a9e2-f72f0b56ead8-kube-api-access-9glfs" (OuterVolumeSpecName: "kube-api-access-9glfs") pod "6294dd29-7825-45f8-a9e2-f72f0b56ead8" (UID: "6294dd29-7825-45f8-a9e2-f72f0b56ead8"). InnerVolumeSpecName "kube-api-access-9glfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:55:39 crc kubenswrapper[4755]: I0317 00:55:39.580857 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6294dd29-7825-45f8-a9e2-f72f0b56ead8-inventory" (OuterVolumeSpecName: "inventory") pod "6294dd29-7825-45f8-a9e2-f72f0b56ead8" (UID: "6294dd29-7825-45f8-a9e2-f72f0b56ead8"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:55:39 crc kubenswrapper[4755]: I0317 00:55:39.595582 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6294dd29-7825-45f8-a9e2-f72f0b56ead8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6294dd29-7825-45f8-a9e2-f72f0b56ead8" (UID: "6294dd29-7825-45f8-a9e2-f72f0b56ead8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:55:39 crc kubenswrapper[4755]: I0317 00:55:39.645422 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6294dd29-7825-45f8-a9e2-f72f0b56ead8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 00:55:39 crc kubenswrapper[4755]: I0317 00:55:39.645564 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6294dd29-7825-45f8-a9e2-f72f0b56ead8-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 00:55:39 crc kubenswrapper[4755]: I0317 00:55:39.645588 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9glfs\" (UniqueName: \"kubernetes.io/projected/6294dd29-7825-45f8-a9e2-f72f0b56ead8-kube-api-access-9glfs\") on node \"crc\" DevicePath \"\"" Mar 17 00:55:39 crc kubenswrapper[4755]: I0317 00:55:39.976673 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" event={"ID":"6294dd29-7825-45f8-a9e2-f72f0b56ead8","Type":"ContainerDied","Data":"e72c4be739b5cdaa2bc03a9f8148bdf0a0b9cb52906340109b9e1749c06beedc"} Mar 17 00:55:39 crc kubenswrapper[4755]: I0317 00:55:39.977218 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e72c4be739b5cdaa2bc03a9f8148bdf0a0b9cb52906340109b9e1749c06beedc" Mar 17 00:55:39 crc kubenswrapper[4755]: I0317 
00:55:39.977174 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l" Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.105247 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk"] Mar 17 00:55:40 crc kubenswrapper[4755]: E0317 00:55:40.105803 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6294dd29-7825-45f8-a9e2-f72f0b56ead8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.105824 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6294dd29-7825-45f8-a9e2-f72f0b56ead8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.106115 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6294dd29-7825-45f8-a9e2-f72f0b56ead8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.107021 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.109989 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.109989 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.112779 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.112796 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.124729 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk"] Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.155064 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/633d8f49-d4ce-475d-841e-c5ca7261f61a-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk\" (UID: \"633d8f49-d4ce-475d-841e-c5ca7261f61a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.155116 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fsdn\" (UniqueName: \"kubernetes.io/projected/633d8f49-d4ce-475d-841e-c5ca7261f61a-kube-api-access-6fsdn\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk\" (UID: \"633d8f49-d4ce-475d-841e-c5ca7261f61a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" Mar 17 00:55:40 crc 
kubenswrapper[4755]: I0317 00:55:40.155580 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/633d8f49-d4ce-475d-841e-c5ca7261f61a-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk\" (UID: \"633d8f49-d4ce-475d-841e-c5ca7261f61a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.257342 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/633d8f49-d4ce-475d-841e-c5ca7261f61a-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk\" (UID: \"633d8f49-d4ce-475d-841e-c5ca7261f61a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.257706 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/633d8f49-d4ce-475d-841e-c5ca7261f61a-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk\" (UID: \"633d8f49-d4ce-475d-841e-c5ca7261f61a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.257751 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fsdn\" (UniqueName: \"kubernetes.io/projected/633d8f49-d4ce-475d-841e-c5ca7261f61a-kube-api-access-6fsdn\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk\" (UID: \"633d8f49-d4ce-475d-841e-c5ca7261f61a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.264056 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/633d8f49-d4ce-475d-841e-c5ca7261f61a-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk\" (UID: \"633d8f49-d4ce-475d-841e-c5ca7261f61a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.268718 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/633d8f49-d4ce-475d-841e-c5ca7261f61a-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk\" (UID: \"633d8f49-d4ce-475d-841e-c5ca7261f61a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.288161 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fsdn\" (UniqueName: \"kubernetes.io/projected/633d8f49-d4ce-475d-841e-c5ca7261f61a-kube-api-access-6fsdn\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk\" (UID: \"633d8f49-d4ce-475d-841e-c5ca7261f61a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" Mar 17 00:55:40 crc kubenswrapper[4755]: I0317 00:55:40.442397 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" Mar 17 00:55:41 crc kubenswrapper[4755]: W0317 00:55:41.090713 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod633d8f49_d4ce_475d_841e_c5ca7261f61a.slice/crio-4f5c57cc256e272eb8cd1f43a6e5544e99edb5cff21b9f97a64b5cb7b62e22f3 WatchSource:0}: Error finding container 4f5c57cc256e272eb8cd1f43a6e5544e99edb5cff21b9f97a64b5cb7b62e22f3: Status 404 returned error can't find the container with id 4f5c57cc256e272eb8cd1f43a6e5544e99edb5cff21b9f97a64b5cb7b62e22f3 Mar 17 00:55:41 crc kubenswrapper[4755]: I0317 00:55:41.095287 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 00:55:41 crc kubenswrapper[4755]: I0317 00:55:41.097168 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk"] Mar 17 00:55:42 crc kubenswrapper[4755]: I0317 00:55:42.011025 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" event={"ID":"633d8f49-d4ce-475d-841e-c5ca7261f61a","Type":"ContainerStarted","Data":"b4a18d48b0a0f1a14180cceee5940b187e1df76f2cc4fe30702a630f9fd21c54"} Mar 17 00:55:42 crc kubenswrapper[4755]: I0317 00:55:42.011378 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" event={"ID":"633d8f49-d4ce-475d-841e-c5ca7261f61a","Type":"ContainerStarted","Data":"4f5c57cc256e272eb8cd1f43a6e5544e99edb5cff21b9f97a64b5cb7b62e22f3"} Mar 17 00:55:42 crc kubenswrapper[4755]: I0317 00:55:42.054675 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" podStartSLOduration=1.5838751 podStartE2EDuration="2.054644746s" podCreationTimestamp="2026-03-17 00:55:40 +0000 UTC" 
firstStartedPulling="2026-03-17 00:55:41.095021342 +0000 UTC m=+2015.854473635" lastFinishedPulling="2026-03-17 00:55:41.565790958 +0000 UTC m=+2016.325243281" observedRunningTime="2026-03-17 00:55:42.03751924 +0000 UTC m=+2016.796971583" watchObservedRunningTime="2026-03-17 00:55:42.054644746 +0000 UTC m=+2016.814097069" Mar 17 00:55:47 crc kubenswrapper[4755]: I0317 00:55:47.077653 4755 generic.go:334] "Generic (PLEG): container finished" podID="633d8f49-d4ce-475d-841e-c5ca7261f61a" containerID="b4a18d48b0a0f1a14180cceee5940b187e1df76f2cc4fe30702a630f9fd21c54" exitCode=0 Mar 17 00:55:47 crc kubenswrapper[4755]: I0317 00:55:47.077801 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" event={"ID":"633d8f49-d4ce-475d-841e-c5ca7261f61a","Type":"ContainerDied","Data":"b4a18d48b0a0f1a14180cceee5940b187e1df76f2cc4fe30702a630f9fd21c54"} Mar 17 00:55:48 crc kubenswrapper[4755]: I0317 00:55:48.571138 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" Mar 17 00:55:48 crc kubenswrapper[4755]: I0317 00:55:48.664028 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/633d8f49-d4ce-475d-841e-c5ca7261f61a-inventory\") pod \"633d8f49-d4ce-475d-841e-c5ca7261f61a\" (UID: \"633d8f49-d4ce-475d-841e-c5ca7261f61a\") " Mar 17 00:55:48 crc kubenswrapper[4755]: I0317 00:55:48.664154 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/633d8f49-d4ce-475d-841e-c5ca7261f61a-ssh-key-openstack-edpm-ipam\") pod \"633d8f49-d4ce-475d-841e-c5ca7261f61a\" (UID: \"633d8f49-d4ce-475d-841e-c5ca7261f61a\") " Mar 17 00:55:48 crc kubenswrapper[4755]: I0317 00:55:48.664230 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fsdn\" (UniqueName: \"kubernetes.io/projected/633d8f49-d4ce-475d-841e-c5ca7261f61a-kube-api-access-6fsdn\") pod \"633d8f49-d4ce-475d-841e-c5ca7261f61a\" (UID: \"633d8f49-d4ce-475d-841e-c5ca7261f61a\") " Mar 17 00:55:48 crc kubenswrapper[4755]: I0317 00:55:48.672083 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/633d8f49-d4ce-475d-841e-c5ca7261f61a-kube-api-access-6fsdn" (OuterVolumeSpecName: "kube-api-access-6fsdn") pod "633d8f49-d4ce-475d-841e-c5ca7261f61a" (UID: "633d8f49-d4ce-475d-841e-c5ca7261f61a"). InnerVolumeSpecName "kube-api-access-6fsdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:55:48 crc kubenswrapper[4755]: I0317 00:55:48.698850 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/633d8f49-d4ce-475d-841e-c5ca7261f61a-inventory" (OuterVolumeSpecName: "inventory") pod "633d8f49-d4ce-475d-841e-c5ca7261f61a" (UID: "633d8f49-d4ce-475d-841e-c5ca7261f61a"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:55:48 crc kubenswrapper[4755]: I0317 00:55:48.699280 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/633d8f49-d4ce-475d-841e-c5ca7261f61a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "633d8f49-d4ce-475d-841e-c5ca7261f61a" (UID: "633d8f49-d4ce-475d-841e-c5ca7261f61a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:55:48 crc kubenswrapper[4755]: I0317 00:55:48.766540 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/633d8f49-d4ce-475d-841e-c5ca7261f61a-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 00:55:48 crc kubenswrapper[4755]: I0317 00:55:48.766586 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/633d8f49-d4ce-475d-841e-c5ca7261f61a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 00:55:48 crc kubenswrapper[4755]: I0317 00:55:48.766604 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fsdn\" (UniqueName: \"kubernetes.io/projected/633d8f49-d4ce-475d-841e-c5ca7261f61a-kube-api-access-6fsdn\") on node \"crc\" DevicePath \"\"" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.107297 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" event={"ID":"633d8f49-d4ce-475d-841e-c5ca7261f61a","Type":"ContainerDied","Data":"4f5c57cc256e272eb8cd1f43a6e5544e99edb5cff21b9f97a64b5cb7b62e22f3"} Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.107745 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f5c57cc256e272eb8cd1f43a6e5544e99edb5cff21b9f97a64b5cb7b62e22f3" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 
00:55:49.107413 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.196412 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g"] Mar 17 00:55:49 crc kubenswrapper[4755]: E0317 00:55:49.200324 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="633d8f49-d4ce-475d-841e-c5ca7261f61a" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.200361 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="633d8f49-d4ce-475d-841e-c5ca7261f61a" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.200744 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="633d8f49-d4ce-475d-841e-c5ca7261f61a" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.201978 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.204923 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.206903 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.206903 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.206914 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.211174 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g"] Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.394111 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffc27979-4c48-4516-a6ab-f78e066dcc17-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g\" (UID: \"ffc27979-4c48-4516-a6ab-f78e066dcc17\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.394211 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ffc27979-4c48-4516-a6ab-f78e066dcc17-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g\" (UID: \"ffc27979-4c48-4516-a6ab-f78e066dcc17\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.394302 
4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q7qn\" (UniqueName: \"kubernetes.io/projected/ffc27979-4c48-4516-a6ab-f78e066dcc17-kube-api-access-8q7qn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g\" (UID: \"ffc27979-4c48-4516-a6ab-f78e066dcc17\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.496316 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffc27979-4c48-4516-a6ab-f78e066dcc17-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g\" (UID: \"ffc27979-4c48-4516-a6ab-f78e066dcc17\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.496628 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ffc27979-4c48-4516-a6ab-f78e066dcc17-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g\" (UID: \"ffc27979-4c48-4516-a6ab-f78e066dcc17\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.496862 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q7qn\" (UniqueName: \"kubernetes.io/projected/ffc27979-4c48-4516-a6ab-f78e066dcc17-kube-api-access-8q7qn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g\" (UID: \"ffc27979-4c48-4516-a6ab-f78e066dcc17\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.504880 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ffc27979-4c48-4516-a6ab-f78e066dcc17-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g\" (UID: \"ffc27979-4c48-4516-a6ab-f78e066dcc17\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.515794 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffc27979-4c48-4516-a6ab-f78e066dcc17-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g\" (UID: \"ffc27979-4c48-4516-a6ab-f78e066dcc17\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.516404 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q7qn\" (UniqueName: \"kubernetes.io/projected/ffc27979-4c48-4516-a6ab-f78e066dcc17-kube-api-access-8q7qn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g\" (UID: \"ffc27979-4c48-4516-a6ab-f78e066dcc17\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" Mar 17 00:55:49 crc kubenswrapper[4755]: I0317 00:55:49.522500 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" Mar 17 00:55:50 crc kubenswrapper[4755]: I0317 00:55:50.137596 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g"] Mar 17 00:55:51 crc kubenswrapper[4755]: I0317 00:55:51.135752 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" event={"ID":"ffc27979-4c48-4516-a6ab-f78e066dcc17","Type":"ContainerStarted","Data":"2fcb85a0d27bb802961885c8232dcebfb4bb33e80a312cd5f49697b248b9d89f"} Mar 17 00:55:51 crc kubenswrapper[4755]: I0317 00:55:51.136668 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" event={"ID":"ffc27979-4c48-4516-a6ab-f78e066dcc17","Type":"ContainerStarted","Data":"56d46c0270ed20bea6f60537fd5fe9e61c9e05f42cd4d3568a0a3df896db0974"} Mar 17 00:55:51 crc kubenswrapper[4755]: I0317 00:55:51.166259 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" podStartSLOduration=1.651212709 podStartE2EDuration="2.166235229s" podCreationTimestamp="2026-03-17 00:55:49 +0000 UTC" firstStartedPulling="2026-03-17 00:55:50.148789861 +0000 UTC m=+2024.908242154" lastFinishedPulling="2026-03-17 00:55:50.663812351 +0000 UTC m=+2025.423264674" observedRunningTime="2026-03-17 00:55:51.156835843 +0000 UTC m=+2025.916288156" watchObservedRunningTime="2026-03-17 00:55:51.166235229 +0000 UTC m=+2025.925687522" Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.087363 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-hrkzh"] Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.104098 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-8bgbn"] Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 
00:55:58.120476 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-h9477"] Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.129185 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-571c-account-create-update-vrrc6"] Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.137722 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-8bgbn"] Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.146040 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-hrkzh"] Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.153977 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-h9477"] Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.161724 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-571c-account-create-update-vrrc6"] Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.258064 4755 scope.go:117] "RemoveContainer" containerID="d57445204ca1a1169e05f3848feb0d6df6d59be07dea604f588b48c49fc4ff21" Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.262545 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c5ef244-d2dc-4cc0-bc15-8f542eb8a586" path="/var/lib/kubelet/pods/2c5ef244-d2dc-4cc0-bc15-8f542eb8a586/volumes" Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.264245 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3811ee16-75c3-4ca3-9648-d3c9d5f8b028" path="/var/lib/kubelet/pods/3811ee16-75c3-4ca3-9648-d3c9d5f8b028/volumes" Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.266860 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6" path="/var/lib/kubelet/pods/3c0288c3-d1a9-42ce-a561-b6a9ecb82dd6/volumes" Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.268654 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="c3a41a76-23dc-404d-b304-19ad4221ce3d" path="/var/lib/kubelet/pods/c3a41a76-23dc-404d-b304-19ad4221ce3d/volumes" Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.313914 4755 scope.go:117] "RemoveContainer" containerID="4d8ca082cc7aa1159aaf9d718f07d36e1c7d56fa3230e8a6ed9391c74f9263fe" Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.387596 4755 scope.go:117] "RemoveContainer" containerID="c00af56ff437bc2e7211e5a43b1ad2aa87fb9cf18c59cc2b9790b5a32c076e2b" Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.452758 4755 scope.go:117] "RemoveContainer" containerID="f14028f30cce353568585bc000ef4e43117603a9a88c8b3842b745a1a48ec067" Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.518394 4755 scope.go:117] "RemoveContainer" containerID="e5fe31b2f16c24183bd88dee6944e309af9190e1fbb6423f2b4e81797f8d3670" Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.563033 4755 scope.go:117] "RemoveContainer" containerID="690cced1a5a0dbfb02077ba065f80fc5822805a01aceb0b9a44613b0512fb953" Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.610010 4755 scope.go:117] "RemoveContainer" containerID="354c32f219635da720beb2dcc98b1faf3ad99b7ef375f78c3d4fac7e53ba25eb" Mar 17 00:55:58 crc kubenswrapper[4755]: I0317 00:55:58.646906 4755 scope.go:117] "RemoveContainer" containerID="7aa93298a9343f5eaae438f3edc1e11b0ec7c5c9ee4aded5418255f992555c03" Mar 17 00:55:59 crc kubenswrapper[4755]: I0317 00:55:59.032957 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-16ab-account-create-update-jd78k"] Mar 17 00:55:59 crc kubenswrapper[4755]: I0317 00:55:59.052278 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-3bde-account-create-update-4xt5d"] Mar 17 00:55:59 crc kubenswrapper[4755]: I0317 00:55:59.068979 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-16ab-account-create-update-jd78k"] Mar 17 00:55:59 crc kubenswrapper[4755]: I0317 00:55:59.083114 4755 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-3bde-account-create-update-4xt5d"] Mar 17 00:56:00 crc kubenswrapper[4755]: I0317 00:56:00.159676 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561816-vqpcs"] Mar 17 00:56:00 crc kubenswrapper[4755]: I0317 00:56:00.162629 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561816-vqpcs" Mar 17 00:56:00 crc kubenswrapper[4755]: I0317 00:56:00.167546 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 00:56:00 crc kubenswrapper[4755]: I0317 00:56:00.167846 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 00:56:00 crc kubenswrapper[4755]: I0317 00:56:00.168165 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 00:56:00 crc kubenswrapper[4755]: I0317 00:56:00.178807 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561816-vqpcs"] Mar 17 00:56:00 crc kubenswrapper[4755]: I0317 00:56:00.274847 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e" path="/var/lib/kubelet/pods/2cb4a3c9-fbf9-49fd-ae0b-e4f49c1f423e/volumes" Mar 17 00:56:00 crc kubenswrapper[4755]: I0317 00:56:00.276393 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f802ad4b-3ce3-451c-a5a3-99ca8a050644" path="/var/lib/kubelet/pods/f802ad4b-3ce3-451c-a5a3-99ca8a050644/volumes" Mar 17 00:56:00 crc kubenswrapper[4755]: I0317 00:56:00.285670 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swqf5\" (UniqueName: \"kubernetes.io/projected/af91a237-9ed9-4565-95e7-f8238b07acb6-kube-api-access-swqf5\") pod 
\"auto-csr-approver-29561816-vqpcs\" (UID: \"af91a237-9ed9-4565-95e7-f8238b07acb6\") " pod="openshift-infra/auto-csr-approver-29561816-vqpcs" Mar 17 00:56:00 crc kubenswrapper[4755]: I0317 00:56:00.389560 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swqf5\" (UniqueName: \"kubernetes.io/projected/af91a237-9ed9-4565-95e7-f8238b07acb6-kube-api-access-swqf5\") pod \"auto-csr-approver-29561816-vqpcs\" (UID: \"af91a237-9ed9-4565-95e7-f8238b07acb6\") " pod="openshift-infra/auto-csr-approver-29561816-vqpcs" Mar 17 00:56:00 crc kubenswrapper[4755]: I0317 00:56:00.411251 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swqf5\" (UniqueName: \"kubernetes.io/projected/af91a237-9ed9-4565-95e7-f8238b07acb6-kube-api-access-swqf5\") pod \"auto-csr-approver-29561816-vqpcs\" (UID: \"af91a237-9ed9-4565-95e7-f8238b07acb6\") " pod="openshift-infra/auto-csr-approver-29561816-vqpcs" Mar 17 00:56:00 crc kubenswrapper[4755]: I0317 00:56:00.528741 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561816-vqpcs" Mar 17 00:56:01 crc kubenswrapper[4755]: W0317 00:56:01.072330 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf91a237_9ed9_4565_95e7_f8238b07acb6.slice/crio-3aa3818a831b562fc77cd1ccb7d2a032b8c34d0126ef514ab0385b70b0fbee0c WatchSource:0}: Error finding container 3aa3818a831b562fc77cd1ccb7d2a032b8c34d0126ef514ab0385b70b0fbee0c: Status 404 returned error can't find the container with id 3aa3818a831b562fc77cd1ccb7d2a032b8c34d0126ef514ab0385b70b0fbee0c Mar 17 00:56:01 crc kubenswrapper[4755]: I0317 00:56:01.074722 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561816-vqpcs"] Mar 17 00:56:01 crc kubenswrapper[4755]: I0317 00:56:01.260144 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561816-vqpcs" event={"ID":"af91a237-9ed9-4565-95e7-f8238b07acb6","Type":"ContainerStarted","Data":"3aa3818a831b562fc77cd1ccb7d2a032b8c34d0126ef514ab0385b70b0fbee0c"} Mar 17 00:56:03 crc kubenswrapper[4755]: I0317 00:56:03.285376 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561816-vqpcs" event={"ID":"af91a237-9ed9-4565-95e7-f8238b07acb6","Type":"ContainerStarted","Data":"7a4522abdf56efa94c8540dfec50df83e7fdcdd6ae4c8818ff3e2a581c0ad9bb"} Mar 17 00:56:03 crc kubenswrapper[4755]: I0317 00:56:03.303959 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561816-vqpcs" podStartSLOduration=1.690031007 podStartE2EDuration="3.30394112s" podCreationTimestamp="2026-03-17 00:56:00 +0000 UTC" firstStartedPulling="2026-03-17 00:56:01.075951922 +0000 UTC m=+2035.835404205" lastFinishedPulling="2026-03-17 00:56:02.689862035 +0000 UTC m=+2037.449314318" observedRunningTime="2026-03-17 00:56:03.299312884 +0000 UTC m=+2038.058765177" 
watchObservedRunningTime="2026-03-17 00:56:03.30394112 +0000 UTC m=+2038.063393413" Mar 17 00:56:04 crc kubenswrapper[4755]: I0317 00:56:04.300128 4755 generic.go:334] "Generic (PLEG): container finished" podID="af91a237-9ed9-4565-95e7-f8238b07acb6" containerID="7a4522abdf56efa94c8540dfec50df83e7fdcdd6ae4c8818ff3e2a581c0ad9bb" exitCode=0 Mar 17 00:56:04 crc kubenswrapper[4755]: I0317 00:56:04.300391 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561816-vqpcs" event={"ID":"af91a237-9ed9-4565-95e7-f8238b07acb6","Type":"ContainerDied","Data":"7a4522abdf56efa94c8540dfec50df83e7fdcdd6ae4c8818ff3e2a581c0ad9bb"} Mar 17 00:56:05 crc kubenswrapper[4755]: I0317 00:56:05.770932 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561816-vqpcs" Mar 17 00:56:05 crc kubenswrapper[4755]: I0317 00:56:05.937783 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swqf5\" (UniqueName: \"kubernetes.io/projected/af91a237-9ed9-4565-95e7-f8238b07acb6-kube-api-access-swqf5\") pod \"af91a237-9ed9-4565-95e7-f8238b07acb6\" (UID: \"af91a237-9ed9-4565-95e7-f8238b07acb6\") " Mar 17 00:56:05 crc kubenswrapper[4755]: I0317 00:56:05.944425 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af91a237-9ed9-4565-95e7-f8238b07acb6-kube-api-access-swqf5" (OuterVolumeSpecName: "kube-api-access-swqf5") pod "af91a237-9ed9-4565-95e7-f8238b07acb6" (UID: "af91a237-9ed9-4565-95e7-f8238b07acb6"). InnerVolumeSpecName "kube-api-access-swqf5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:56:06 crc kubenswrapper[4755]: I0317 00:56:06.042103 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swqf5\" (UniqueName: \"kubernetes.io/projected/af91a237-9ed9-4565-95e7-f8238b07acb6-kube-api-access-swqf5\") on node \"crc\" DevicePath \"\"" Mar 17 00:56:06 crc kubenswrapper[4755]: I0317 00:56:06.326705 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561816-vqpcs" event={"ID":"af91a237-9ed9-4565-95e7-f8238b07acb6","Type":"ContainerDied","Data":"3aa3818a831b562fc77cd1ccb7d2a032b8c34d0126ef514ab0385b70b0fbee0c"} Mar 17 00:56:06 crc kubenswrapper[4755]: I0317 00:56:06.327123 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3aa3818a831b562fc77cd1ccb7d2a032b8c34d0126ef514ab0385b70b0fbee0c" Mar 17 00:56:06 crc kubenswrapper[4755]: I0317 00:56:06.326959 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561816-vqpcs" Mar 17 00:56:06 crc kubenswrapper[4755]: I0317 00:56:06.365987 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561810-tgzzp"] Mar 17 00:56:06 crc kubenswrapper[4755]: I0317 00:56:06.375910 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561810-tgzzp"] Mar 17 00:56:08 crc kubenswrapper[4755]: I0317 00:56:08.264150 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c560869-7a4d-41f5-a9aa-3d57d3f0be2e" path="/var/lib/kubelet/pods/1c560869-7a4d-41f5-a9aa-3d57d3f0be2e/volumes" Mar 17 00:56:31 crc kubenswrapper[4755]: I0317 00:56:31.066021 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r6dbj"] Mar 17 00:56:31 crc kubenswrapper[4755]: I0317 00:56:31.076075 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-conductor-db-sync-r6dbj"] Mar 17 00:56:32 crc kubenswrapper[4755]: I0317 00:56:32.270946 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cce90939-ded2-4efa-90c9-c74df06b5bcd" path="/var/lib/kubelet/pods/cce90939-ded2-4efa-90c9-c74df06b5bcd/volumes" Mar 17 00:56:43 crc kubenswrapper[4755]: I0317 00:56:43.923488 4755 generic.go:334] "Generic (PLEG): container finished" podID="ffc27979-4c48-4516-a6ab-f78e066dcc17" containerID="2fcb85a0d27bb802961885c8232dcebfb4bb33e80a312cd5f49697b248b9d89f" exitCode=0 Mar 17 00:56:43 crc kubenswrapper[4755]: I0317 00:56:43.923630 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" event={"ID":"ffc27979-4c48-4516-a6ab-f78e066dcc17","Type":"ContainerDied","Data":"2fcb85a0d27bb802961885c8232dcebfb4bb33e80a312cd5f49697b248b9d89f"} Mar 17 00:56:45 crc kubenswrapper[4755]: I0317 00:56:45.487477 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" Mar 17 00:56:45 crc kubenswrapper[4755]: I0317 00:56:45.543636 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ffc27979-4c48-4516-a6ab-f78e066dcc17-ssh-key-openstack-edpm-ipam\") pod \"ffc27979-4c48-4516-a6ab-f78e066dcc17\" (UID: \"ffc27979-4c48-4516-a6ab-f78e066dcc17\") " Mar 17 00:56:45 crc kubenswrapper[4755]: I0317 00:56:45.543707 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffc27979-4c48-4516-a6ab-f78e066dcc17-inventory\") pod \"ffc27979-4c48-4516-a6ab-f78e066dcc17\" (UID: \"ffc27979-4c48-4516-a6ab-f78e066dcc17\") " Mar 17 00:56:45 crc kubenswrapper[4755]: I0317 00:56:45.543729 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q7qn\" (UniqueName: \"kubernetes.io/projected/ffc27979-4c48-4516-a6ab-f78e066dcc17-kube-api-access-8q7qn\") pod \"ffc27979-4c48-4516-a6ab-f78e066dcc17\" (UID: \"ffc27979-4c48-4516-a6ab-f78e066dcc17\") " Mar 17 00:56:45 crc kubenswrapper[4755]: I0317 00:56:45.549949 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffc27979-4c48-4516-a6ab-f78e066dcc17-kube-api-access-8q7qn" (OuterVolumeSpecName: "kube-api-access-8q7qn") pod "ffc27979-4c48-4516-a6ab-f78e066dcc17" (UID: "ffc27979-4c48-4516-a6ab-f78e066dcc17"). InnerVolumeSpecName "kube-api-access-8q7qn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:56:45 crc kubenswrapper[4755]: I0317 00:56:45.579344 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc27979-4c48-4516-a6ab-f78e066dcc17-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ffc27979-4c48-4516-a6ab-f78e066dcc17" (UID: "ffc27979-4c48-4516-a6ab-f78e066dcc17"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:56:45 crc kubenswrapper[4755]: I0317 00:56:45.581851 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc27979-4c48-4516-a6ab-f78e066dcc17-inventory" (OuterVolumeSpecName: "inventory") pod "ffc27979-4c48-4516-a6ab-f78e066dcc17" (UID: "ffc27979-4c48-4516-a6ab-f78e066dcc17"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:56:45 crc kubenswrapper[4755]: I0317 00:56:45.645839 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ffc27979-4c48-4516-a6ab-f78e066dcc17-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 00:56:45 crc kubenswrapper[4755]: I0317 00:56:45.645870 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffc27979-4c48-4516-a6ab-f78e066dcc17-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 00:56:45 crc kubenswrapper[4755]: I0317 00:56:45.645879 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q7qn\" (UniqueName: \"kubernetes.io/projected/ffc27979-4c48-4516-a6ab-f78e066dcc17-kube-api-access-8q7qn\") on node \"crc\" DevicePath \"\"" Mar 17 00:56:45 crc kubenswrapper[4755]: I0317 00:56:45.957398 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" 
event={"ID":"ffc27979-4c48-4516-a6ab-f78e066dcc17","Type":"ContainerDied","Data":"56d46c0270ed20bea6f60537fd5fe9e61c9e05f42cd4d3568a0a3df896db0974"} Mar 17 00:56:45 crc kubenswrapper[4755]: I0317 00:56:45.957484 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56d46c0270ed20bea6f60537fd5fe9e61c9e05f42cd4d3568a0a3df896db0974" Mar 17 00:56:45 crc kubenswrapper[4755]: I0317 00:56:45.957535 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.098603 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5lm87"] Mar 17 00:56:46 crc kubenswrapper[4755]: E0317 00:56:46.099234 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af91a237-9ed9-4565-95e7-f8238b07acb6" containerName="oc" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.099255 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="af91a237-9ed9-4565-95e7-f8238b07acb6" containerName="oc" Mar 17 00:56:46 crc kubenswrapper[4755]: E0317 00:56:46.099281 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc27979-4c48-4516-a6ab-f78e066dcc17" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.099292 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc27979-4c48-4516-a6ab-f78e066dcc17" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.099674 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="af91a237-9ed9-4565-95e7-f8238b07acb6" containerName="oc" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.099710 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffc27979-4c48-4516-a6ab-f78e066dcc17" 
containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.100664 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.103484 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.104842 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.105028 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.105381 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.110350 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5lm87"] Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.158240 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/407f6cc9-ea7c-455d-90bb-78266b1e6783-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5lm87\" (UID: \"407f6cc9-ea7c-455d-90bb-78266b1e6783\") " pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.158396 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zqd9\" (UniqueName: \"kubernetes.io/projected/407f6cc9-ea7c-455d-90bb-78266b1e6783-kube-api-access-2zqd9\") pod \"ssh-known-hosts-edpm-deployment-5lm87\" (UID: \"407f6cc9-ea7c-455d-90bb-78266b1e6783\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.158519 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/407f6cc9-ea7c-455d-90bb-78266b1e6783-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5lm87\" (UID: \"407f6cc9-ea7c-455d-90bb-78266b1e6783\") " pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.260822 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/407f6cc9-ea7c-455d-90bb-78266b1e6783-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5lm87\" (UID: \"407f6cc9-ea7c-455d-90bb-78266b1e6783\") " pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.260890 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zqd9\" (UniqueName: \"kubernetes.io/projected/407f6cc9-ea7c-455d-90bb-78266b1e6783-kube-api-access-2zqd9\") pod \"ssh-known-hosts-edpm-deployment-5lm87\" (UID: \"407f6cc9-ea7c-455d-90bb-78266b1e6783\") " pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.260926 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/407f6cc9-ea7c-455d-90bb-78266b1e6783-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5lm87\" (UID: \"407f6cc9-ea7c-455d-90bb-78266b1e6783\") " pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.266091 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/407f6cc9-ea7c-455d-90bb-78266b1e6783-ssh-key-openstack-edpm-ipam\") pod 
\"ssh-known-hosts-edpm-deployment-5lm87\" (UID: \"407f6cc9-ea7c-455d-90bb-78266b1e6783\") " pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.267040 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/407f6cc9-ea7c-455d-90bb-78266b1e6783-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5lm87\" (UID: \"407f6cc9-ea7c-455d-90bb-78266b1e6783\") " pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.280230 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zqd9\" (UniqueName: \"kubernetes.io/projected/407f6cc9-ea7c-455d-90bb-78266b1e6783-kube-api-access-2zqd9\") pod \"ssh-known-hosts-edpm-deployment-5lm87\" (UID: \"407f6cc9-ea7c-455d-90bb-78266b1e6783\") " pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" Mar 17 00:56:46 crc kubenswrapper[4755]: I0317 00:56:46.442091 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" Mar 17 00:56:47 crc kubenswrapper[4755]: I0317 00:56:47.092699 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5lm87"] Mar 17 00:56:47 crc kubenswrapper[4755]: W0317 00:56:47.094930 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod407f6cc9_ea7c_455d_90bb_78266b1e6783.slice/crio-9f1f02f2271331514770891347a97abf4c746a7eedfa2c7e8264ab34b0378170 WatchSource:0}: Error finding container 9f1f02f2271331514770891347a97abf4c746a7eedfa2c7e8264ab34b0378170: Status 404 returned error can't find the container with id 9f1f02f2271331514770891347a97abf4c746a7eedfa2c7e8264ab34b0378170 Mar 17 00:56:47 crc kubenswrapper[4755]: I0317 00:56:47.984039 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" event={"ID":"407f6cc9-ea7c-455d-90bb-78266b1e6783","Type":"ContainerStarted","Data":"345fb61befa7824b0566f4f85a41736d42d11b440c3edab7342208a2cb2a1d01"} Mar 17 00:56:47 crc kubenswrapper[4755]: I0317 00:56:47.984518 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" event={"ID":"407f6cc9-ea7c-455d-90bb-78266b1e6783","Type":"ContainerStarted","Data":"9f1f02f2271331514770891347a97abf4c746a7eedfa2c7e8264ab34b0378170"} Mar 17 00:56:48 crc kubenswrapper[4755]: I0317 00:56:48.024857 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" podStartSLOduration=1.5829351489999999 podStartE2EDuration="2.02483333s" podCreationTimestamp="2026-03-17 00:56:46 +0000 UTC" firstStartedPulling="2026-03-17 00:56:47.099625142 +0000 UTC m=+2081.859077455" lastFinishedPulling="2026-03-17 00:56:47.541523313 +0000 UTC m=+2082.300975636" observedRunningTime="2026-03-17 00:56:48.008190627 +0000 UTC m=+2082.767642930" 
watchObservedRunningTime="2026-03-17 00:56:48.02483333 +0000 UTC m=+2082.784285623" Mar 17 00:56:48 crc kubenswrapper[4755]: I0317 00:56:48.047735 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-7chjf"] Mar 17 00:56:48 crc kubenswrapper[4755]: I0317 00:56:48.077614 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-7chjf"] Mar 17 00:56:48 crc kubenswrapper[4755]: I0317 00:56:48.268684 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7db40418-904e-4974-b8d3-f23a2cb94080" path="/var/lib/kubelet/pods/7db40418-904e-4974-b8d3-f23a2cb94080/volumes" Mar 17 00:56:52 crc kubenswrapper[4755]: I0317 00:56:52.056169 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-48hw5"] Mar 17 00:56:52 crc kubenswrapper[4755]: I0317 00:56:52.070738 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-48hw5"] Mar 17 00:56:52 crc kubenswrapper[4755]: I0317 00:56:52.282855 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a98bfe-6430-4b2c-9cc4-4287439401b5" path="/var/lib/kubelet/pods/65a98bfe-6430-4b2c-9cc4-4287439401b5/volumes" Mar 17 00:56:55 crc kubenswrapper[4755]: I0317 00:56:55.086156 4755 generic.go:334] "Generic (PLEG): container finished" podID="407f6cc9-ea7c-455d-90bb-78266b1e6783" containerID="345fb61befa7824b0566f4f85a41736d42d11b440c3edab7342208a2cb2a1d01" exitCode=0 Mar 17 00:56:55 crc kubenswrapper[4755]: I0317 00:56:55.086250 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" event={"ID":"407f6cc9-ea7c-455d-90bb-78266b1e6783","Type":"ContainerDied","Data":"345fb61befa7824b0566f4f85a41736d42d11b440c3edab7342208a2cb2a1d01"} Mar 17 00:56:56 crc kubenswrapper[4755]: I0317 00:56:56.689177 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" Mar 17 00:56:56 crc kubenswrapper[4755]: I0317 00:56:56.825585 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zqd9\" (UniqueName: \"kubernetes.io/projected/407f6cc9-ea7c-455d-90bb-78266b1e6783-kube-api-access-2zqd9\") pod \"407f6cc9-ea7c-455d-90bb-78266b1e6783\" (UID: \"407f6cc9-ea7c-455d-90bb-78266b1e6783\") " Mar 17 00:56:56 crc kubenswrapper[4755]: I0317 00:56:56.825889 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/407f6cc9-ea7c-455d-90bb-78266b1e6783-ssh-key-openstack-edpm-ipam\") pod \"407f6cc9-ea7c-455d-90bb-78266b1e6783\" (UID: \"407f6cc9-ea7c-455d-90bb-78266b1e6783\") " Mar 17 00:56:56 crc kubenswrapper[4755]: I0317 00:56:56.826010 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/407f6cc9-ea7c-455d-90bb-78266b1e6783-inventory-0\") pod \"407f6cc9-ea7c-455d-90bb-78266b1e6783\" (UID: \"407f6cc9-ea7c-455d-90bb-78266b1e6783\") " Mar 17 00:56:56 crc kubenswrapper[4755]: I0317 00:56:56.833913 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/407f6cc9-ea7c-455d-90bb-78266b1e6783-kube-api-access-2zqd9" (OuterVolumeSpecName: "kube-api-access-2zqd9") pod "407f6cc9-ea7c-455d-90bb-78266b1e6783" (UID: "407f6cc9-ea7c-455d-90bb-78266b1e6783"). InnerVolumeSpecName "kube-api-access-2zqd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:56:56 crc kubenswrapper[4755]: I0317 00:56:56.856553 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/407f6cc9-ea7c-455d-90bb-78266b1e6783-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "407f6cc9-ea7c-455d-90bb-78266b1e6783" (UID: "407f6cc9-ea7c-455d-90bb-78266b1e6783"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:56:56 crc kubenswrapper[4755]: I0317 00:56:56.881730 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/407f6cc9-ea7c-455d-90bb-78266b1e6783-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "407f6cc9-ea7c-455d-90bb-78266b1e6783" (UID: "407f6cc9-ea7c-455d-90bb-78266b1e6783"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:56:56 crc kubenswrapper[4755]: I0317 00:56:56.929168 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zqd9\" (UniqueName: \"kubernetes.io/projected/407f6cc9-ea7c-455d-90bb-78266b1e6783-kube-api-access-2zqd9\") on node \"crc\" DevicePath \"\"" Mar 17 00:56:56 crc kubenswrapper[4755]: I0317 00:56:56.929488 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/407f6cc9-ea7c-455d-90bb-78266b1e6783-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 00:56:56 crc kubenswrapper[4755]: I0317 00:56:56.929657 4755 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/407f6cc9-ea7c-455d-90bb-78266b1e6783-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.117953 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" 
event={"ID":"407f6cc9-ea7c-455d-90bb-78266b1e6783","Type":"ContainerDied","Data":"9f1f02f2271331514770891347a97abf4c746a7eedfa2c7e8264ab34b0378170"} Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.117992 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f1f02f2271331514770891347a97abf4c746a7eedfa2c7e8264ab34b0378170" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.118100 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5lm87" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.241266 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws"] Mar 17 00:56:57 crc kubenswrapper[4755]: E0317 00:56:57.242250 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="407f6cc9-ea7c-455d-90bb-78266b1e6783" containerName="ssh-known-hosts-edpm-deployment" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.242269 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="407f6cc9-ea7c-455d-90bb-78266b1e6783" containerName="ssh-known-hosts-edpm-deployment" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.242658 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="407f6cc9-ea7c-455d-90bb-78266b1e6783" containerName="ssh-known-hosts-edpm-deployment" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.243704 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.256006 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws"] Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.256403 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.256593 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.257652 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.258233 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.378477 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b682c2e-a6e2-478b-9679-5c2aaf416857-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xwws\" (UID: \"1b682c2e-a6e2-478b-9679-5c2aaf416857\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.378600 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx9bt\" (UniqueName: \"kubernetes.io/projected/1b682c2e-a6e2-478b-9679-5c2aaf416857-kube-api-access-dx9bt\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xwws\" (UID: \"1b682c2e-a6e2-478b-9679-5c2aaf416857\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.378686 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b682c2e-a6e2-478b-9679-5c2aaf416857-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xwws\" (UID: \"1b682c2e-a6e2-478b-9679-5c2aaf416857\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.480625 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx9bt\" (UniqueName: \"kubernetes.io/projected/1b682c2e-a6e2-478b-9679-5c2aaf416857-kube-api-access-dx9bt\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xwws\" (UID: \"1b682c2e-a6e2-478b-9679-5c2aaf416857\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.480718 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b682c2e-a6e2-478b-9679-5c2aaf416857-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xwws\" (UID: \"1b682c2e-a6e2-478b-9679-5c2aaf416857\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.480801 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b682c2e-a6e2-478b-9679-5c2aaf416857-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xwws\" (UID: \"1b682c2e-a6e2-478b-9679-5c2aaf416857\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.484380 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b682c2e-a6e2-478b-9679-5c2aaf416857-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-9xwws\" (UID: \"1b682c2e-a6e2-478b-9679-5c2aaf416857\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.487695 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b682c2e-a6e2-478b-9679-5c2aaf416857-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xwws\" (UID: \"1b682c2e-a6e2-478b-9679-5c2aaf416857\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.507281 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx9bt\" (UniqueName: \"kubernetes.io/projected/1b682c2e-a6e2-478b-9679-5c2aaf416857-kube-api-access-dx9bt\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xwws\" (UID: \"1b682c2e-a6e2-478b-9679-5c2aaf416857\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" Mar 17 00:56:57 crc kubenswrapper[4755]: I0317 00:56:57.576733 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" Mar 17 00:56:58 crc kubenswrapper[4755]: I0317 00:56:58.182149 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws"] Mar 17 00:56:58 crc kubenswrapper[4755]: I0317 00:56:58.849735 4755 scope.go:117] "RemoveContainer" containerID="eb81201a2d160a3ff493a22cc87b2550a2f6e68915360659a932d17c4afdc5b8" Mar 17 00:56:58 crc kubenswrapper[4755]: I0317 00:56:58.986361 4755 scope.go:117] "RemoveContainer" containerID="aba64e1a808044f0a24814eda28fd0b2184327c9dd563b80e858e77591d387e5" Mar 17 00:56:59 crc kubenswrapper[4755]: I0317 00:56:59.031851 4755 scope.go:117] "RemoveContainer" containerID="4c717171c7f61c51db6c1bfc2b3955f86a6fb1638a0d4e1689a6d097e29d1521" Mar 17 00:56:59 crc kubenswrapper[4755]: I0317 00:56:59.033809 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-b6rgr"] Mar 17 00:56:59 crc kubenswrapper[4755]: I0317 00:56:59.042352 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-bfd0-account-create-update-85zvr"] Mar 17 00:56:59 crc kubenswrapper[4755]: I0317 00:56:59.051857 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-bfd0-account-create-update-85zvr"] Mar 17 00:56:59 crc kubenswrapper[4755]: I0317 00:56:59.060276 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-b6rgr"] Mar 17 00:56:59 crc kubenswrapper[4755]: I0317 00:56:59.088322 4755 scope.go:117] "RemoveContainer" containerID="50c512cdaee121322ce9a340513e51b6d250a091830bb1e38cb410c551420d75" Mar 17 00:56:59 crc kubenswrapper[4755]: I0317 00:56:59.131433 4755 scope.go:117] "RemoveContainer" containerID="5a806caaf12ad40335ad10217e927d56da16c4dc530402dc6ae6eedcb44a2ce1" Mar 17 00:56:59 crc kubenswrapper[4755]: I0317 00:56:59.152954 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" event={"ID":"1b682c2e-a6e2-478b-9679-5c2aaf416857","Type":"ContainerStarted","Data":"1cba464c62b67c2b8db0cd83e6d9ae1064bdb85f2cf05f2390a003292d6f12b1"} Mar 17 00:56:59 crc kubenswrapper[4755]: I0317 00:56:59.153015 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" event={"ID":"1b682c2e-a6e2-478b-9679-5c2aaf416857","Type":"ContainerStarted","Data":"d001f028d40c32b2eb68d4e54cba3c5f76699575f8fa5dba68cff5c3720b862f"} Mar 17 00:56:59 crc kubenswrapper[4755]: I0317 00:56:59.169980 4755 scope.go:117] "RemoveContainer" containerID="17408614c84f9a3809d69c3d47eba304724c756bf6d51f7772b56e073b2636bd" Mar 17 00:56:59 crc kubenswrapper[4755]: I0317 00:56:59.182627 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" podStartSLOduration=1.6231550559999999 podStartE2EDuration="2.182604934s" podCreationTimestamp="2026-03-17 00:56:57 +0000 UTC" firstStartedPulling="2026-03-17 00:56:58.205275118 +0000 UTC m=+2092.964727401" lastFinishedPulling="2026-03-17 00:56:58.764724956 +0000 UTC m=+2093.524177279" observedRunningTime="2026-03-17 00:56:59.170122475 +0000 UTC m=+2093.929574798" watchObservedRunningTime="2026-03-17 00:56:59.182604934 +0000 UTC m=+2093.942057227" Mar 17 00:57:00 crc kubenswrapper[4755]: I0317 00:57:00.265556 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecff6c47-8752-4ea4-9f9e-a6c4c2723181" path="/var/lib/kubelet/pods/ecff6c47-8752-4ea4-9f9e-a6c4c2723181/volumes" Mar 17 00:57:00 crc kubenswrapper[4755]: I0317 00:57:00.266737 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f389a9bf-dbdd-4a73-ab7c-dc25609792a2" path="/var/lib/kubelet/pods/f389a9bf-dbdd-4a73-ab7c-dc25609792a2/volumes" Mar 17 00:57:07 crc kubenswrapper[4755]: I0317 00:57:07.276147 4755 generic.go:334] "Generic (PLEG): container 
finished" podID="1b682c2e-a6e2-478b-9679-5c2aaf416857" containerID="1cba464c62b67c2b8db0cd83e6d9ae1064bdb85f2cf05f2390a003292d6f12b1" exitCode=0 Mar 17 00:57:07 crc kubenswrapper[4755]: I0317 00:57:07.276306 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" event={"ID":"1b682c2e-a6e2-478b-9679-5c2aaf416857","Type":"ContainerDied","Data":"1cba464c62b67c2b8db0cd83e6d9ae1064bdb85f2cf05f2390a003292d6f12b1"} Mar 17 00:57:08 crc kubenswrapper[4755]: I0317 00:57:08.807634 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" Mar 17 00:57:08 crc kubenswrapper[4755]: I0317 00:57:08.973835 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx9bt\" (UniqueName: \"kubernetes.io/projected/1b682c2e-a6e2-478b-9679-5c2aaf416857-kube-api-access-dx9bt\") pod \"1b682c2e-a6e2-478b-9679-5c2aaf416857\" (UID: \"1b682c2e-a6e2-478b-9679-5c2aaf416857\") " Mar 17 00:57:08 crc kubenswrapper[4755]: I0317 00:57:08.974038 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b682c2e-a6e2-478b-9679-5c2aaf416857-inventory\") pod \"1b682c2e-a6e2-478b-9679-5c2aaf416857\" (UID: \"1b682c2e-a6e2-478b-9679-5c2aaf416857\") " Mar 17 00:57:08 crc kubenswrapper[4755]: I0317 00:57:08.974103 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b682c2e-a6e2-478b-9679-5c2aaf416857-ssh-key-openstack-edpm-ipam\") pod \"1b682c2e-a6e2-478b-9679-5c2aaf416857\" (UID: \"1b682c2e-a6e2-478b-9679-5c2aaf416857\") " Mar 17 00:57:08 crc kubenswrapper[4755]: I0317 00:57:08.981034 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b682c2e-a6e2-478b-9679-5c2aaf416857-kube-api-access-dx9bt" 
(OuterVolumeSpecName: "kube-api-access-dx9bt") pod "1b682c2e-a6e2-478b-9679-5c2aaf416857" (UID: "1b682c2e-a6e2-478b-9679-5c2aaf416857"). InnerVolumeSpecName "kube-api-access-dx9bt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.012422 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b682c2e-a6e2-478b-9679-5c2aaf416857-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1b682c2e-a6e2-478b-9679-5c2aaf416857" (UID: "1b682c2e-a6e2-478b-9679-5c2aaf416857"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.012773 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b682c2e-a6e2-478b-9679-5c2aaf416857-inventory" (OuterVolumeSpecName: "inventory") pod "1b682c2e-a6e2-478b-9679-5c2aaf416857" (UID: "1b682c2e-a6e2-478b-9679-5c2aaf416857"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.077166 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx9bt\" (UniqueName: \"kubernetes.io/projected/1b682c2e-a6e2-478b-9679-5c2aaf416857-kube-api-access-dx9bt\") on node \"crc\" DevicePath \"\"" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.077212 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b682c2e-a6e2-478b-9679-5c2aaf416857-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.077224 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b682c2e-a6e2-478b-9679-5c2aaf416857-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.303830 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" event={"ID":"1b682c2e-a6e2-478b-9679-5c2aaf416857","Type":"ContainerDied","Data":"d001f028d40c32b2eb68d4e54cba3c5f76699575f8fa5dba68cff5c3720b862f"} Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.303878 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d001f028d40c32b2eb68d4e54cba3c5f76699575f8fa5dba68cff5c3720b862f" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.303893 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.377640 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb"] Mar 17 00:57:09 crc kubenswrapper[4755]: E0317 00:57:09.378066 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b682c2e-a6e2-478b-9679-5c2aaf416857" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.378083 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b682c2e-a6e2-478b-9679-5c2aaf416857" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.378310 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b682c2e-a6e2-478b-9679-5c2aaf416857" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.379180 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.383626 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.384296 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.384567 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.385216 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.395002 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb"] Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.485245 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eecc6a07-bab0-487c-8948-3ed46324a72f-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb\" (UID: \"eecc6a07-bab0-487c-8948-3ed46324a72f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.485313 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eecc6a07-bab0-487c-8948-3ed46324a72f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb\" (UID: \"eecc6a07-bab0-487c-8948-3ed46324a72f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.485588 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w4df\" (UniqueName: \"kubernetes.io/projected/eecc6a07-bab0-487c-8948-3ed46324a72f-kube-api-access-7w4df\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb\" (UID: \"eecc6a07-bab0-487c-8948-3ed46324a72f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.587231 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w4df\" (UniqueName: \"kubernetes.io/projected/eecc6a07-bab0-487c-8948-3ed46324a72f-kube-api-access-7w4df\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb\" (UID: \"eecc6a07-bab0-487c-8948-3ed46324a72f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.587361 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eecc6a07-bab0-487c-8948-3ed46324a72f-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb\" (UID: \"eecc6a07-bab0-487c-8948-3ed46324a72f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.587400 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eecc6a07-bab0-487c-8948-3ed46324a72f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb\" (UID: \"eecc6a07-bab0-487c-8948-3ed46324a72f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.592139 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/eecc6a07-bab0-487c-8948-3ed46324a72f-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb\" (UID: \"eecc6a07-bab0-487c-8948-3ed46324a72f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.592980 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eecc6a07-bab0-487c-8948-3ed46324a72f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb\" (UID: \"eecc6a07-bab0-487c-8948-3ed46324a72f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.614379 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w4df\" (UniqueName: \"kubernetes.io/projected/eecc6a07-bab0-487c-8948-3ed46324a72f-kube-api-access-7w4df\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb\" (UID: \"eecc6a07-bab0-487c-8948-3ed46324a72f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" Mar 17 00:57:09 crc kubenswrapper[4755]: I0317 00:57:09.714795 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" Mar 17 00:57:10 crc kubenswrapper[4755]: I0317 00:57:10.404211 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb"] Mar 17 00:57:11 crc kubenswrapper[4755]: I0317 00:57:11.331525 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" event={"ID":"eecc6a07-bab0-487c-8948-3ed46324a72f","Type":"ContainerStarted","Data":"52eab38fda4ad018fbd1b698102709d31c39c1b7e40a0fcd138bd88507a90303"} Mar 17 00:57:11 crc kubenswrapper[4755]: I0317 00:57:11.331928 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" event={"ID":"eecc6a07-bab0-487c-8948-3ed46324a72f","Type":"ContainerStarted","Data":"b6e6611cf02f588c083f9ba99eb8d760dfb968b6bb118605c0fcd5371f7ae218"} Mar 17 00:57:11 crc kubenswrapper[4755]: I0317 00:57:11.364726 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" podStartSLOduration=1.9627633690000001 podStartE2EDuration="2.364699293s" podCreationTimestamp="2026-03-17 00:57:09 +0000 UTC" firstStartedPulling="2026-03-17 00:57:10.415790481 +0000 UTC m=+2105.175242814" lastFinishedPulling="2026-03-17 00:57:10.817726455 +0000 UTC m=+2105.577178738" observedRunningTime="2026-03-17 00:57:11.35060383 +0000 UTC m=+2106.110056133" watchObservedRunningTime="2026-03-17 00:57:11.364699293 +0000 UTC m=+2106.124151616" Mar 17 00:57:20 crc kubenswrapper[4755]: I0317 00:57:20.480410 4755 generic.go:334] "Generic (PLEG): container finished" podID="eecc6a07-bab0-487c-8948-3ed46324a72f" containerID="52eab38fda4ad018fbd1b698102709d31c39c1b7e40a0fcd138bd88507a90303" exitCode=0 Mar 17 00:57:20 crc kubenswrapper[4755]: I0317 00:57:20.480488 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" event={"ID":"eecc6a07-bab0-487c-8948-3ed46324a72f","Type":"ContainerDied","Data":"52eab38fda4ad018fbd1b698102709d31c39c1b7e40a0fcd138bd88507a90303"} Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.326512 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.371366 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eecc6a07-bab0-487c-8948-3ed46324a72f-ssh-key-openstack-edpm-ipam\") pod \"eecc6a07-bab0-487c-8948-3ed46324a72f\" (UID: \"eecc6a07-bab0-487c-8948-3ed46324a72f\") " Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.371504 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eecc6a07-bab0-487c-8948-3ed46324a72f-inventory\") pod \"eecc6a07-bab0-487c-8948-3ed46324a72f\" (UID: \"eecc6a07-bab0-487c-8948-3ed46324a72f\") " Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.371661 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w4df\" (UniqueName: \"kubernetes.io/projected/eecc6a07-bab0-487c-8948-3ed46324a72f-kube-api-access-7w4df\") pod \"eecc6a07-bab0-487c-8948-3ed46324a72f\" (UID: \"eecc6a07-bab0-487c-8948-3ed46324a72f\") " Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.385746 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eecc6a07-bab0-487c-8948-3ed46324a72f-kube-api-access-7w4df" (OuterVolumeSpecName: "kube-api-access-7w4df") pod "eecc6a07-bab0-487c-8948-3ed46324a72f" (UID: "eecc6a07-bab0-487c-8948-3ed46324a72f"). InnerVolumeSpecName "kube-api-access-7w4df". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.409285 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eecc6a07-bab0-487c-8948-3ed46324a72f-inventory" (OuterVolumeSpecName: "inventory") pod "eecc6a07-bab0-487c-8948-3ed46324a72f" (UID: "eecc6a07-bab0-487c-8948-3ed46324a72f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.410940 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eecc6a07-bab0-487c-8948-3ed46324a72f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eecc6a07-bab0-487c-8948-3ed46324a72f" (UID: "eecc6a07-bab0-487c-8948-3ed46324a72f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.474627 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eecc6a07-bab0-487c-8948-3ed46324a72f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.474664 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eecc6a07-bab0-487c-8948-3ed46324a72f-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.474678 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w4df\" (UniqueName: \"kubernetes.io/projected/eecc6a07-bab0-487c-8948-3ed46324a72f-kube-api-access-7w4df\") on node \"crc\" DevicePath \"\"" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.508142 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" 
event={"ID":"eecc6a07-bab0-487c-8948-3ed46324a72f","Type":"ContainerDied","Data":"b6e6611cf02f588c083f9ba99eb8d760dfb968b6bb118605c0fcd5371f7ae218"} Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.508190 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6e6611cf02f588c083f9ba99eb8d760dfb968b6bb118605c0fcd5371f7ae218" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.508196 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.591494 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs"] Mar 17 00:57:22 crc kubenswrapper[4755]: E0317 00:57:22.592069 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eecc6a07-bab0-487c-8948-3ed46324a72f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.592095 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="eecc6a07-bab0-487c-8948-3ed46324a72f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.592395 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="eecc6a07-bab0-487c-8948-3ed46324a72f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.593396 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.596766 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.596791 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.596891 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.596915 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.597189 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.597375 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.599214 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.607020 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.607635 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs"] Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.677725 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.677984 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.678068 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.678101 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.678127 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.678329 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.678377 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.678464 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.678533 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8qptv\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-kube-api-access-8qptv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.678602 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.678690 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.678710 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.678739 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.780281 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.780321 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.780354 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.780386 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qptv\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-kube-api-access-8qptv\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.780415 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.780464 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.780483 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.780503 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.780525 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.780617 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.780650 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.781321 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.781346 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.786812 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.787271 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.788292 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc 
kubenswrapper[4755]: I0317 00:57:22.790969 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.791843 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.791943 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.791982 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.792275 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.792742 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.794160 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.794428 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.800226 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.802660 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qptv\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-kube-api-access-8qptv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:22 crc kubenswrapper[4755]: I0317 00:57:22.912703 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:57:23 crc kubenswrapper[4755]: I0317 00:57:23.624273 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs"] Mar 17 00:57:24 crc kubenswrapper[4755]: I0317 00:57:24.539359 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" event={"ID":"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c","Type":"ContainerStarted","Data":"2416256bb82ee86e5f695036809e20960838ad5000395eccaf09ee8c3bf17784"} Mar 17 00:57:24 crc kubenswrapper[4755]: I0317 00:57:24.540124 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" event={"ID":"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c","Type":"ContainerStarted","Data":"01745ab83069e0ded45aec367edb25064d3efbcbf6e1b59d0d354a3ef652e9cf"} Mar 17 00:57:24 crc kubenswrapper[4755]: I0317 00:57:24.567414 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" podStartSLOduration=2.095647554 podStartE2EDuration="2.567390276s" podCreationTimestamp="2026-03-17 00:57:22 +0000 UTC" firstStartedPulling="2026-03-17 00:57:23.634002955 +0000 UTC m=+2118.393455278" lastFinishedPulling="2026-03-17 00:57:24.105745687 +0000 UTC m=+2118.865198000" observedRunningTime="2026-03-17 00:57:24.558203505 +0000 UTC m=+2119.317655788" watchObservedRunningTime="2026-03-17 00:57:24.567390276 +0000 UTC m=+2119.326842599" Mar 17 00:57:28 crc kubenswrapper[4755]: I0317 00:57:28.665728 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:57:28 crc kubenswrapper[4755]: I0317 00:57:28.667638 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:57:35 crc kubenswrapper[4755]: I0317 00:57:35.055610 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-znj9l"] Mar 17 00:57:35 crc kubenswrapper[4755]: I0317 00:57:35.066497 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-znj9l"] Mar 17 00:57:36 crc kubenswrapper[4755]: I0317 00:57:36.264026 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47ccacd0-c90a-4280-a907-a5b43b82744d" path="/var/lib/kubelet/pods/47ccacd0-c90a-4280-a907-a5b43b82744d/volumes" Mar 17 00:57:48 crc kubenswrapper[4755]: I0317 00:57:48.909692 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-6vg7d"] Mar 17 00:57:48 crc kubenswrapper[4755]: I0317 00:57:48.916628 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vg7d" Mar 17 00:57:48 crc kubenswrapper[4755]: I0317 00:57:48.936342 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vg7d"] Mar 17 00:57:49 crc kubenswrapper[4755]: I0317 00:57:49.095026 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29a2d03a-f95d-4414-8f6a-eee0acd4e297-utilities\") pod \"redhat-marketplace-6vg7d\" (UID: \"29a2d03a-f95d-4414-8f6a-eee0acd4e297\") " pod="openshift-marketplace/redhat-marketplace-6vg7d" Mar 17 00:57:49 crc kubenswrapper[4755]: I0317 00:57:49.095117 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29a2d03a-f95d-4414-8f6a-eee0acd4e297-catalog-content\") pod \"redhat-marketplace-6vg7d\" (UID: \"29a2d03a-f95d-4414-8f6a-eee0acd4e297\") " pod="openshift-marketplace/redhat-marketplace-6vg7d" Mar 17 00:57:49 crc kubenswrapper[4755]: I0317 00:57:49.095309 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pkgd\" (UniqueName: \"kubernetes.io/projected/29a2d03a-f95d-4414-8f6a-eee0acd4e297-kube-api-access-8pkgd\") pod \"redhat-marketplace-6vg7d\" (UID: \"29a2d03a-f95d-4414-8f6a-eee0acd4e297\") " pod="openshift-marketplace/redhat-marketplace-6vg7d" Mar 17 00:57:49 crc kubenswrapper[4755]: I0317 00:57:49.196987 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29a2d03a-f95d-4414-8f6a-eee0acd4e297-catalog-content\") pod \"redhat-marketplace-6vg7d\" (UID: 
\"29a2d03a-f95d-4414-8f6a-eee0acd4e297\") " pod="openshift-marketplace/redhat-marketplace-6vg7d" Mar 17 00:57:49 crc kubenswrapper[4755]: I0317 00:57:49.197086 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pkgd\" (UniqueName: \"kubernetes.io/projected/29a2d03a-f95d-4414-8f6a-eee0acd4e297-kube-api-access-8pkgd\") pod \"redhat-marketplace-6vg7d\" (UID: \"29a2d03a-f95d-4414-8f6a-eee0acd4e297\") " pod="openshift-marketplace/redhat-marketplace-6vg7d" Mar 17 00:57:49 crc kubenswrapper[4755]: I0317 00:57:49.197229 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29a2d03a-f95d-4414-8f6a-eee0acd4e297-utilities\") pod \"redhat-marketplace-6vg7d\" (UID: \"29a2d03a-f95d-4414-8f6a-eee0acd4e297\") " pod="openshift-marketplace/redhat-marketplace-6vg7d" Mar 17 00:57:49 crc kubenswrapper[4755]: I0317 00:57:49.198096 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29a2d03a-f95d-4414-8f6a-eee0acd4e297-catalog-content\") pod \"redhat-marketplace-6vg7d\" (UID: \"29a2d03a-f95d-4414-8f6a-eee0acd4e297\") " pod="openshift-marketplace/redhat-marketplace-6vg7d" Mar 17 00:57:49 crc kubenswrapper[4755]: I0317 00:57:49.198103 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29a2d03a-f95d-4414-8f6a-eee0acd4e297-utilities\") pod \"redhat-marketplace-6vg7d\" (UID: \"29a2d03a-f95d-4414-8f6a-eee0acd4e297\") " pod="openshift-marketplace/redhat-marketplace-6vg7d" Mar 17 00:57:49 crc kubenswrapper[4755]: I0317 00:57:49.228499 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pkgd\" (UniqueName: \"kubernetes.io/projected/29a2d03a-f95d-4414-8f6a-eee0acd4e297-kube-api-access-8pkgd\") pod \"redhat-marketplace-6vg7d\" (UID: \"29a2d03a-f95d-4414-8f6a-eee0acd4e297\") " 
pod="openshift-marketplace/redhat-marketplace-6vg7d" Mar 17 00:57:49 crc kubenswrapper[4755]: I0317 00:57:49.255552 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vg7d" Mar 17 00:57:49 crc kubenswrapper[4755]: I0317 00:57:49.762217 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vg7d"] Mar 17 00:57:49 crc kubenswrapper[4755]: I0317 00:57:49.917434 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vg7d" event={"ID":"29a2d03a-f95d-4414-8f6a-eee0acd4e297","Type":"ContainerStarted","Data":"2dae4f8cf0eb31c0b30efd94c41a11cdcf5de315b04c93efddcbf064aff2bce7"} Mar 17 00:57:50 crc kubenswrapper[4755]: I0317 00:57:50.930504 4755 generic.go:334] "Generic (PLEG): container finished" podID="29a2d03a-f95d-4414-8f6a-eee0acd4e297" containerID="e336963b076f5adb44d883397390f1a7e9b41a07fc4852af5739e0c7db785013" exitCode=0 Mar 17 00:57:50 crc kubenswrapper[4755]: I0317 00:57:50.930635 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vg7d" event={"ID":"29a2d03a-f95d-4414-8f6a-eee0acd4e297","Type":"ContainerDied","Data":"e336963b076f5adb44d883397390f1a7e9b41a07fc4852af5739e0c7db785013"} Mar 17 00:57:52 crc kubenswrapper[4755]: I0317 00:57:52.951817 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vg7d" event={"ID":"29a2d03a-f95d-4414-8f6a-eee0acd4e297","Type":"ContainerStarted","Data":"5c8afe40fd73d91142bdadb449c0c681c12007fdfa0d95c351118520bc897def"} Mar 17 00:57:53 crc kubenswrapper[4755]: I0317 00:57:53.962818 4755 generic.go:334] "Generic (PLEG): container finished" podID="29a2d03a-f95d-4414-8f6a-eee0acd4e297" containerID="5c8afe40fd73d91142bdadb449c0c681c12007fdfa0d95c351118520bc897def" exitCode=0 Mar 17 00:57:53 crc kubenswrapper[4755]: I0317 00:57:53.963018 4755 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-6vg7d" event={"ID":"29a2d03a-f95d-4414-8f6a-eee0acd4e297","Type":"ContainerDied","Data":"5c8afe40fd73d91142bdadb449c0c681c12007fdfa0d95c351118520bc897def"} Mar 17 00:57:54 crc kubenswrapper[4755]: I0317 00:57:54.975536 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vg7d" event={"ID":"29a2d03a-f95d-4414-8f6a-eee0acd4e297","Type":"ContainerStarted","Data":"b5462f334b4830f9bb21e0a2560c8677254fb7c3d7e6e16bfbf91b68a35838ca"} Mar 17 00:57:55 crc kubenswrapper[4755]: I0317 00:57:55.002556 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6vg7d" podStartSLOduration=3.529602059 podStartE2EDuration="7.002534354s" podCreationTimestamp="2026-03-17 00:57:48 +0000 UTC" firstStartedPulling="2026-03-17 00:57:50.93263983 +0000 UTC m=+2145.692092153" lastFinishedPulling="2026-03-17 00:57:54.405572165 +0000 UTC m=+2149.165024448" observedRunningTime="2026-03-17 00:57:54.999372818 +0000 UTC m=+2149.758825101" watchObservedRunningTime="2026-03-17 00:57:55.002534354 +0000 UTC m=+2149.761986647" Mar 17 00:57:58 crc kubenswrapper[4755]: I0317 00:57:58.664967 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:57:58 crc kubenswrapper[4755]: I0317 00:57:58.665399 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:57:59 crc kubenswrapper[4755]: I0317 00:57:59.255785 4755 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6vg7d" Mar 17 00:57:59 crc kubenswrapper[4755]: I0317 00:57:59.255866 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6vg7d" Mar 17 00:57:59 crc kubenswrapper[4755]: I0317 00:57:59.335254 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6vg7d" Mar 17 00:57:59 crc kubenswrapper[4755]: I0317 00:57:59.357313 4755 scope.go:117] "RemoveContainer" containerID="6fbab48bac90b935818c345d316cbcf4b4684392c5d17f98a0c16de25a6b47fb" Mar 17 00:57:59 crc kubenswrapper[4755]: I0317 00:57:59.396622 4755 scope.go:117] "RemoveContainer" containerID="200d1dc69bb32849dcf63b582c5b869b6a870c2e32711dffc06f46007f10389d" Mar 17 00:57:59 crc kubenswrapper[4755]: I0317 00:57:59.492079 4755 scope.go:117] "RemoveContainer" containerID="f710b331bc4538c477848fbbdaf31b75703cf2cc9bea9ddd89c8af91ca14b0f1" Mar 17 00:58:00 crc kubenswrapper[4755]: I0317 00:58:00.091744 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6vg7d" Mar 17 00:58:00 crc kubenswrapper[4755]: I0317 00:58:00.169046 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vg7d"] Mar 17 00:58:00 crc kubenswrapper[4755]: I0317 00:58:00.182475 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561818-l6l4d"] Mar 17 00:58:00 crc kubenswrapper[4755]: I0317 00:58:00.184763 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561818-l6l4d" Mar 17 00:58:00 crc kubenswrapper[4755]: I0317 00:58:00.192532 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 00:58:00 crc kubenswrapper[4755]: I0317 00:58:00.192701 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 00:58:00 crc kubenswrapper[4755]: I0317 00:58:00.192753 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 00:58:00 crc kubenswrapper[4755]: I0317 00:58:00.193196 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561818-l6l4d"] Mar 17 00:58:00 crc kubenswrapper[4755]: I0317 00:58:00.265397 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zbnd\" (UniqueName: \"kubernetes.io/projected/bec04782-7ac5-4e8c-92a6-43843b88db96-kube-api-access-2zbnd\") pod \"auto-csr-approver-29561818-l6l4d\" (UID: \"bec04782-7ac5-4e8c-92a6-43843b88db96\") " pod="openshift-infra/auto-csr-approver-29561818-l6l4d" Mar 17 00:58:00 crc kubenswrapper[4755]: I0317 00:58:00.367206 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zbnd\" (UniqueName: \"kubernetes.io/projected/bec04782-7ac5-4e8c-92a6-43843b88db96-kube-api-access-2zbnd\") pod \"auto-csr-approver-29561818-l6l4d\" (UID: \"bec04782-7ac5-4e8c-92a6-43843b88db96\") " pod="openshift-infra/auto-csr-approver-29561818-l6l4d" Mar 17 00:58:00 crc kubenswrapper[4755]: I0317 00:58:00.386327 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zbnd\" (UniqueName: \"kubernetes.io/projected/bec04782-7ac5-4e8c-92a6-43843b88db96-kube-api-access-2zbnd\") pod \"auto-csr-approver-29561818-l6l4d\" (UID: \"bec04782-7ac5-4e8c-92a6-43843b88db96\") " 
pod="openshift-infra/auto-csr-approver-29561818-l6l4d" Mar 17 00:58:00 crc kubenswrapper[4755]: I0317 00:58:00.512279 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561818-l6l4d" Mar 17 00:58:01 crc kubenswrapper[4755]: I0317 00:58:01.053244 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561818-l6l4d"] Mar 17 00:58:01 crc kubenswrapper[4755]: I0317 00:58:01.055076 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561818-l6l4d" event={"ID":"bec04782-7ac5-4e8c-92a6-43843b88db96","Type":"ContainerStarted","Data":"d60c04368ca1f7271c3d9b332f331e4a974d4c2545c0b801496cada1b2bf8491"} Mar 17 00:58:02 crc kubenswrapper[4755]: I0317 00:58:02.070888 4755 generic.go:334] "Generic (PLEG): container finished" podID="7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" containerID="2416256bb82ee86e5f695036809e20960838ad5000395eccaf09ee8c3bf17784" exitCode=0 Mar 17 00:58:02 crc kubenswrapper[4755]: I0317 00:58:02.072739 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6vg7d" podUID="29a2d03a-f95d-4414-8f6a-eee0acd4e297" containerName="registry-server" containerID="cri-o://b5462f334b4830f9bb21e0a2560c8677254fb7c3d7e6e16bfbf91b68a35838ca" gracePeriod=2 Mar 17 00:58:02 crc kubenswrapper[4755]: I0317 00:58:02.071139 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" event={"ID":"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c","Type":"ContainerDied","Data":"2416256bb82ee86e5f695036809e20960838ad5000395eccaf09ee8c3bf17784"} Mar 17 00:58:02 crc kubenswrapper[4755]: I0317 00:58:02.673041 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vg7d" Mar 17 00:58:02 crc kubenswrapper[4755]: I0317 00:58:02.720253 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29a2d03a-f95d-4414-8f6a-eee0acd4e297-utilities\") pod \"29a2d03a-f95d-4414-8f6a-eee0acd4e297\" (UID: \"29a2d03a-f95d-4414-8f6a-eee0acd4e297\") " Mar 17 00:58:02 crc kubenswrapper[4755]: I0317 00:58:02.720400 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29a2d03a-f95d-4414-8f6a-eee0acd4e297-catalog-content\") pod \"29a2d03a-f95d-4414-8f6a-eee0acd4e297\" (UID: \"29a2d03a-f95d-4414-8f6a-eee0acd4e297\") " Mar 17 00:58:02 crc kubenswrapper[4755]: I0317 00:58:02.720430 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pkgd\" (UniqueName: \"kubernetes.io/projected/29a2d03a-f95d-4414-8f6a-eee0acd4e297-kube-api-access-8pkgd\") pod \"29a2d03a-f95d-4414-8f6a-eee0acd4e297\" (UID: \"29a2d03a-f95d-4414-8f6a-eee0acd4e297\") " Mar 17 00:58:02 crc kubenswrapper[4755]: I0317 00:58:02.720994 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29a2d03a-f95d-4414-8f6a-eee0acd4e297-utilities" (OuterVolumeSpecName: "utilities") pod "29a2d03a-f95d-4414-8f6a-eee0acd4e297" (UID: "29a2d03a-f95d-4414-8f6a-eee0acd4e297"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:58:02 crc kubenswrapper[4755]: I0317 00:58:02.721850 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29a2d03a-f95d-4414-8f6a-eee0acd4e297-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:02 crc kubenswrapper[4755]: I0317 00:58:02.725473 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a2d03a-f95d-4414-8f6a-eee0acd4e297-kube-api-access-8pkgd" (OuterVolumeSpecName: "kube-api-access-8pkgd") pod "29a2d03a-f95d-4414-8f6a-eee0acd4e297" (UID: "29a2d03a-f95d-4414-8f6a-eee0acd4e297"). InnerVolumeSpecName "kube-api-access-8pkgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:58:02 crc kubenswrapper[4755]: I0317 00:58:02.755002 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29a2d03a-f95d-4414-8f6a-eee0acd4e297-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29a2d03a-f95d-4414-8f6a-eee0acd4e297" (UID: "29a2d03a-f95d-4414-8f6a-eee0acd4e297"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:58:02 crc kubenswrapper[4755]: I0317 00:58:02.824294 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29a2d03a-f95d-4414-8f6a-eee0acd4e297-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:02 crc kubenswrapper[4755]: I0317 00:58:02.824335 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pkgd\" (UniqueName: \"kubernetes.io/projected/29a2d03a-f95d-4414-8f6a-eee0acd4e297-kube-api-access-8pkgd\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.082975 4755 generic.go:334] "Generic (PLEG): container finished" podID="bec04782-7ac5-4e8c-92a6-43843b88db96" containerID="5adb7843fbe74ca66aaf66bc112ba9cb867f0db097eeefa0d053a43f6fb883c9" exitCode=0 Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.083103 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561818-l6l4d" event={"ID":"bec04782-7ac5-4e8c-92a6-43843b88db96","Type":"ContainerDied","Data":"5adb7843fbe74ca66aaf66bc112ba9cb867f0db097eeefa0d053a43f6fb883c9"} Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.086099 4755 generic.go:334] "Generic (PLEG): container finished" podID="29a2d03a-f95d-4414-8f6a-eee0acd4e297" containerID="b5462f334b4830f9bb21e0a2560c8677254fb7c3d7e6e16bfbf91b68a35838ca" exitCode=0 Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.086233 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vg7d" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.086302 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vg7d" event={"ID":"29a2d03a-f95d-4414-8f6a-eee0acd4e297","Type":"ContainerDied","Data":"b5462f334b4830f9bb21e0a2560c8677254fb7c3d7e6e16bfbf91b68a35838ca"} Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.086393 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vg7d" event={"ID":"29a2d03a-f95d-4414-8f6a-eee0acd4e297","Type":"ContainerDied","Data":"2dae4f8cf0eb31c0b30efd94c41a11cdcf5de315b04c93efddcbf064aff2bce7"} Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.086456 4755 scope.go:117] "RemoveContainer" containerID="b5462f334b4830f9bb21e0a2560c8677254fb7c3d7e6e16bfbf91b68a35838ca" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.112428 4755 scope.go:117] "RemoveContainer" containerID="5c8afe40fd73d91142bdadb449c0c681c12007fdfa0d95c351118520bc897def" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.155040 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vg7d"] Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.164143 4755 scope.go:117] "RemoveContainer" containerID="e336963b076f5adb44d883397390f1a7e9b41a07fc4852af5739e0c7db785013" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.168752 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vg7d"] Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.198312 4755 scope.go:117] "RemoveContainer" containerID="b5462f334b4830f9bb21e0a2560c8677254fb7c3d7e6e16bfbf91b68a35838ca" Mar 17 00:58:03 crc kubenswrapper[4755]: E0317 00:58:03.216726 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b5462f334b4830f9bb21e0a2560c8677254fb7c3d7e6e16bfbf91b68a35838ca\": container with ID starting with b5462f334b4830f9bb21e0a2560c8677254fb7c3d7e6e16bfbf91b68a35838ca not found: ID does not exist" containerID="b5462f334b4830f9bb21e0a2560c8677254fb7c3d7e6e16bfbf91b68a35838ca" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.216786 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5462f334b4830f9bb21e0a2560c8677254fb7c3d7e6e16bfbf91b68a35838ca"} err="failed to get container status \"b5462f334b4830f9bb21e0a2560c8677254fb7c3d7e6e16bfbf91b68a35838ca\": rpc error: code = NotFound desc = could not find container \"b5462f334b4830f9bb21e0a2560c8677254fb7c3d7e6e16bfbf91b68a35838ca\": container with ID starting with b5462f334b4830f9bb21e0a2560c8677254fb7c3d7e6e16bfbf91b68a35838ca not found: ID does not exist" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.216820 4755 scope.go:117] "RemoveContainer" containerID="5c8afe40fd73d91142bdadb449c0c681c12007fdfa0d95c351118520bc897def" Mar 17 00:58:03 crc kubenswrapper[4755]: E0317 00:58:03.220809 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c8afe40fd73d91142bdadb449c0c681c12007fdfa0d95c351118520bc897def\": container with ID starting with 5c8afe40fd73d91142bdadb449c0c681c12007fdfa0d95c351118520bc897def not found: ID does not exist" containerID="5c8afe40fd73d91142bdadb449c0c681c12007fdfa0d95c351118520bc897def" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.221116 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c8afe40fd73d91142bdadb449c0c681c12007fdfa0d95c351118520bc897def"} err="failed to get container status \"5c8afe40fd73d91142bdadb449c0c681c12007fdfa0d95c351118520bc897def\": rpc error: code = NotFound desc = could not find container \"5c8afe40fd73d91142bdadb449c0c681c12007fdfa0d95c351118520bc897def\": container with ID 
starting with 5c8afe40fd73d91142bdadb449c0c681c12007fdfa0d95c351118520bc897def not found: ID does not exist" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.221154 4755 scope.go:117] "RemoveContainer" containerID="e336963b076f5adb44d883397390f1a7e9b41a07fc4852af5739e0c7db785013" Mar 17 00:58:03 crc kubenswrapper[4755]: E0317 00:58:03.222948 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e336963b076f5adb44d883397390f1a7e9b41a07fc4852af5739e0c7db785013\": container with ID starting with e336963b076f5adb44d883397390f1a7e9b41a07fc4852af5739e0c7db785013 not found: ID does not exist" containerID="e336963b076f5adb44d883397390f1a7e9b41a07fc4852af5739e0c7db785013" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.222987 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e336963b076f5adb44d883397390f1a7e9b41a07fc4852af5739e0c7db785013"} err="failed to get container status \"e336963b076f5adb44d883397390f1a7e9b41a07fc4852af5739e0c7db785013\": rpc error: code = NotFound desc = could not find container \"e336963b076f5adb44d883397390f1a7e9b41a07fc4852af5739e0c7db785013\": container with ID starting with e336963b076f5adb44d883397390f1a7e9b41a07fc4852af5739e0c7db785013 not found: ID does not exist" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.596947 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.655892 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-bootstrap-combined-ca-bundle\") pod \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.655955 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-telemetry-power-monitoring-combined-ca-bundle\") pod \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.656014 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.656056 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.656125 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.656149 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.656171 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qptv\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-kube-api-access-8qptv\") pod \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.656194 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-ovn-combined-ca-bundle\") pod \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.656212 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-repo-setup-combined-ca-bundle\") pod \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.656231 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-inventory\") pod 
\"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.656245 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-telemetry-combined-ca-bundle\") pod \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.656345 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-ssh-key-openstack-edpm-ipam\") pod \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.656377 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-libvirt-combined-ca-bundle\") pod \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\" (UID: \"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c\") " Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.663277 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" (UID: "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.663331 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-kube-api-access-8qptv" (OuterVolumeSpecName: "kube-api-access-8qptv") pod "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" (UID: "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c"). InnerVolumeSpecName "kube-api-access-8qptv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.665834 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" (UID: "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.665810 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" (UID: "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.666086 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" (UID: "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.666308 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" (UID: "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.666876 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" (UID: "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.666976 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" (UID: "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.667794 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" (UID: "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.669353 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" (UID: "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.675607 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" (UID: "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.694961 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" (UID: "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.696067 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-inventory" (OuterVolumeSpecName: "inventory") pod "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" (UID: "7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.759646 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.759701 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.759723 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.759743 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.759764 4755 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-8qptv\" (UniqueName: \"kubernetes.io/projected/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-kube-api-access-8qptv\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.759783 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.759801 4755 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.759818 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.759837 4755 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.759856 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.759873 4755 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.759890 4755 reconciler_common.go:293] "Volume detached for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:03 crc kubenswrapper[4755]: I0317 00:58:03.759908 4755 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.102794 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" event={"ID":"7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c","Type":"ContainerDied","Data":"01745ab83069e0ded45aec367edb25064d3efbcbf6e1b59d0d354a3ef652e9cf"} Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.102881 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01745ab83069e0ded45aec367edb25064d3efbcbf6e1b59d0d354a3ef652e9cf" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.102828 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.211220 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92"] Mar 17 00:58:04 crc kubenswrapper[4755]: E0317 00:58:04.212193 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a2d03a-f95d-4414-8f6a-eee0acd4e297" containerName="registry-server" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.212423 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a2d03a-f95d-4414-8f6a-eee0acd4e297" containerName="registry-server" Mar 17 00:58:04 crc kubenswrapper[4755]: E0317 00:58:04.212509 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a2d03a-f95d-4414-8f6a-eee0acd4e297" containerName="extract-content" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.212556 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a2d03a-f95d-4414-8f6a-eee0acd4e297" containerName="extract-content" Mar 17 00:58:04 crc kubenswrapper[4755]: E0317 00:58:04.212657 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.212707 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 17 00:58:04 crc kubenswrapper[4755]: E0317 00:58:04.212772 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a2d03a-f95d-4414-8f6a-eee0acd4e297" containerName="extract-utilities" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.212817 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a2d03a-f95d-4414-8f6a-eee0acd4e297" containerName="extract-utilities" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.213293 
4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a2d03a-f95d-4414-8f6a-eee0acd4e297" containerName="registry-server" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.213355 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.214422 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.217214 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.217221 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.217319 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.217406 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.217710 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.240600 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92"] Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.264365 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29a2d03a-f95d-4414-8f6a-eee0acd4e297" path="/var/lib/kubelet/pods/29a2d03a-f95d-4414-8f6a-eee0acd4e297/volumes" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.273100 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz5w5\" (UniqueName: \"kubernetes.io/projected/5e8b58f4-3072-450a-afae-2d18d9f34848-kube-api-access-hz5w5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2dm92\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.273420 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2dm92\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.273509 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5e8b58f4-3072-450a-afae-2d18d9f34848-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2dm92\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.273583 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2dm92\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.273925 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2dm92\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.376385 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz5w5\" (UniqueName: \"kubernetes.io/projected/5e8b58f4-3072-450a-afae-2d18d9f34848-kube-api-access-hz5w5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2dm92\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.376962 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2dm92\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.377007 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5e8b58f4-3072-450a-afae-2d18d9f34848-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2dm92\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.377083 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2dm92\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" 
Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.377327 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2dm92\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.378013 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5e8b58f4-3072-450a-afae-2d18d9f34848-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2dm92\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.382945 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2dm92\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.387276 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2dm92\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.394583 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2dm92\" 
(UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.400733 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz5w5\" (UniqueName: \"kubernetes.io/projected/5e8b58f4-3072-450a-afae-2d18d9f34848-kube-api-access-hz5w5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2dm92\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.538102 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.555945 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561818-l6l4d" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.682616 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zbnd\" (UniqueName: \"kubernetes.io/projected/bec04782-7ac5-4e8c-92a6-43843b88db96-kube-api-access-2zbnd\") pod \"bec04782-7ac5-4e8c-92a6-43843b88db96\" (UID: \"bec04782-7ac5-4e8c-92a6-43843b88db96\") " Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.689090 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bec04782-7ac5-4e8c-92a6-43843b88db96-kube-api-access-2zbnd" (OuterVolumeSpecName: "kube-api-access-2zbnd") pod "bec04782-7ac5-4e8c-92a6-43843b88db96" (UID: "bec04782-7ac5-4e8c-92a6-43843b88db96"). InnerVolumeSpecName "kube-api-access-2zbnd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.786134 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zbnd\" (UniqueName: \"kubernetes.io/projected/bec04782-7ac5-4e8c-92a6-43843b88db96-kube-api-access-2zbnd\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:04 crc kubenswrapper[4755]: W0317 00:58:04.916790 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e8b58f4_3072_450a_afae_2d18d9f34848.slice/crio-802b748880f518686cafa0be9f387b2498b781fe44c740594e10df71186aa0f0 WatchSource:0}: Error finding container 802b748880f518686cafa0be9f387b2498b781fe44c740594e10df71186aa0f0: Status 404 returned error can't find the container with id 802b748880f518686cafa0be9f387b2498b781fe44c740594e10df71186aa0f0 Mar 17 00:58:04 crc kubenswrapper[4755]: I0317 00:58:04.916981 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92"] Mar 17 00:58:05 crc kubenswrapper[4755]: I0317 00:58:05.119886 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561818-l6l4d" event={"ID":"bec04782-7ac5-4e8c-92a6-43843b88db96","Type":"ContainerDied","Data":"d60c04368ca1f7271c3d9b332f331e4a974d4c2545c0b801496cada1b2bf8491"} Mar 17 00:58:05 crc kubenswrapper[4755]: I0317 00:58:05.120557 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d60c04368ca1f7271c3d9b332f331e4a974d4c2545c0b801496cada1b2bf8491" Mar 17 00:58:05 crc kubenswrapper[4755]: I0317 00:58:05.119951 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561818-l6l4d" Mar 17 00:58:05 crc kubenswrapper[4755]: I0317 00:58:05.122053 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" event={"ID":"5e8b58f4-3072-450a-afae-2d18d9f34848","Type":"ContainerStarted","Data":"802b748880f518686cafa0be9f387b2498b781fe44c740594e10df71186aa0f0"} Mar 17 00:58:05 crc kubenswrapper[4755]: I0317 00:58:05.635207 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561812-fx979"] Mar 17 00:58:05 crc kubenswrapper[4755]: I0317 00:58:05.643757 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561812-fx979"] Mar 17 00:58:06 crc kubenswrapper[4755]: I0317 00:58:06.135488 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" event={"ID":"5e8b58f4-3072-450a-afae-2d18d9f34848","Type":"ContainerStarted","Data":"4ffb41207329bc493c506d401a34a469e551a8f2e864aa30c6cfd892d7c7b349"} Mar 17 00:58:06 crc kubenswrapper[4755]: I0317 00:58:06.167109 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" podStartSLOduration=1.5939117999999999 podStartE2EDuration="2.167089602s" podCreationTimestamp="2026-03-17 00:58:04 +0000 UTC" firstStartedPulling="2026-03-17 00:58:04.919878334 +0000 UTC m=+2159.679330617" lastFinishedPulling="2026-03-17 00:58:05.493056126 +0000 UTC m=+2160.252508419" observedRunningTime="2026-03-17 00:58:06.153450211 +0000 UTC m=+2160.912902494" watchObservedRunningTime="2026-03-17 00:58:06.167089602 +0000 UTC m=+2160.926541895" Mar 17 00:58:06 crc kubenswrapper[4755]: I0317 00:58:06.273131 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3" path="/var/lib/kubelet/pods/5d74dc4e-2bce-4c53-b3ce-c1cfce3937c3/volumes" Mar 17 
00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.203090 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jnvwx"] Mar 17 00:58:28 crc kubenswrapper[4755]: E0317 00:58:28.204192 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec04782-7ac5-4e8c-92a6-43843b88db96" containerName="oc" Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.204210 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec04782-7ac5-4e8c-92a6-43843b88db96" containerName="oc" Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.204473 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="bec04782-7ac5-4e8c-92a6-43843b88db96" containerName="oc" Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.206568 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jnvwx" Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.215422 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jnvwx"] Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.375108 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81a3d0e5-026e-4c27-a489-8b4d83ace906-catalog-content\") pod \"certified-operators-jnvwx\" (UID: \"81a3d0e5-026e-4c27-a489-8b4d83ace906\") " pod="openshift-marketplace/certified-operators-jnvwx" Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.375204 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81a3d0e5-026e-4c27-a489-8b4d83ace906-utilities\") pod \"certified-operators-jnvwx\" (UID: \"81a3d0e5-026e-4c27-a489-8b4d83ace906\") " pod="openshift-marketplace/certified-operators-jnvwx" Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.375495 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7bk5\" (UniqueName: \"kubernetes.io/projected/81a3d0e5-026e-4c27-a489-8b4d83ace906-kube-api-access-h7bk5\") pod \"certified-operators-jnvwx\" (UID: \"81a3d0e5-026e-4c27-a489-8b4d83ace906\") " pod="openshift-marketplace/certified-operators-jnvwx" Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.477349 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81a3d0e5-026e-4c27-a489-8b4d83ace906-utilities\") pod \"certified-operators-jnvwx\" (UID: \"81a3d0e5-026e-4c27-a489-8b4d83ace906\") " pod="openshift-marketplace/certified-operators-jnvwx" Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.477427 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7bk5\" (UniqueName: \"kubernetes.io/projected/81a3d0e5-026e-4c27-a489-8b4d83ace906-kube-api-access-h7bk5\") pod \"certified-operators-jnvwx\" (UID: \"81a3d0e5-026e-4c27-a489-8b4d83ace906\") " pod="openshift-marketplace/certified-operators-jnvwx" Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.477557 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81a3d0e5-026e-4c27-a489-8b4d83ace906-catalog-content\") pod \"certified-operators-jnvwx\" (UID: \"81a3d0e5-026e-4c27-a489-8b4d83ace906\") " pod="openshift-marketplace/certified-operators-jnvwx" Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.478004 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81a3d0e5-026e-4c27-a489-8b4d83ace906-utilities\") pod \"certified-operators-jnvwx\" (UID: \"81a3d0e5-026e-4c27-a489-8b4d83ace906\") " pod="openshift-marketplace/certified-operators-jnvwx" Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.478043 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81a3d0e5-026e-4c27-a489-8b4d83ace906-catalog-content\") pod \"certified-operators-jnvwx\" (UID: \"81a3d0e5-026e-4c27-a489-8b4d83ace906\") " pod="openshift-marketplace/certified-operators-jnvwx" Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.507845 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7bk5\" (UniqueName: \"kubernetes.io/projected/81a3d0e5-026e-4c27-a489-8b4d83ace906-kube-api-access-h7bk5\") pod \"certified-operators-jnvwx\" (UID: \"81a3d0e5-026e-4c27-a489-8b4d83ace906\") " pod="openshift-marketplace/certified-operators-jnvwx" Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.526474 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jnvwx" Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.669203 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.669518 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.669864 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.670464 4755 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"26c70aeb78d7c43cc9f41f95cdbae738bd0d561c87ae8d0e246e95277b78f86b"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 00:58:28 crc kubenswrapper[4755]: I0317 00:58:28.670541 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://26c70aeb78d7c43cc9f41f95cdbae738bd0d561c87ae8d0e246e95277b78f86b" gracePeriod=600 Mar 17 00:58:29 crc kubenswrapper[4755]: I0317 00:58:29.056495 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jnvwx"] Mar 17 00:58:29 crc kubenswrapper[4755]: W0317 00:58:29.057631 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81a3d0e5_026e_4c27_a489_8b4d83ace906.slice/crio-6b363f7d8e02f7e102b571a91d80152b86686677da99ce97e999496c6d9fef7b WatchSource:0}: Error finding container 6b363f7d8e02f7e102b571a91d80152b86686677da99ce97e999496c6d9fef7b: Status 404 returned error can't find the container with id 6b363f7d8e02f7e102b571a91d80152b86686677da99ce97e999496c6d9fef7b Mar 17 00:58:29 crc kubenswrapper[4755]: I0317 00:58:29.586942 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="26c70aeb78d7c43cc9f41f95cdbae738bd0d561c87ae8d0e246e95277b78f86b" exitCode=0 Mar 17 00:58:29 crc kubenswrapper[4755]: I0317 00:58:29.587001 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" 
event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"26c70aeb78d7c43cc9f41f95cdbae738bd0d561c87ae8d0e246e95277b78f86b"} Mar 17 00:58:29 crc kubenswrapper[4755]: I0317 00:58:29.587354 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420"} Mar 17 00:58:29 crc kubenswrapper[4755]: I0317 00:58:29.587374 4755 scope.go:117] "RemoveContainer" containerID="df8c10161372a9bb5d2feffe63fd3b1232889c9c75c37670693a0154e9601834" Mar 17 00:58:29 crc kubenswrapper[4755]: I0317 00:58:29.590699 4755 generic.go:334] "Generic (PLEG): container finished" podID="81a3d0e5-026e-4c27-a489-8b4d83ace906" containerID="2b9caa1315de1cf363c965e30c786fc69a8ce1d101df4f9399180f200d732ab3" exitCode=0 Mar 17 00:58:29 crc kubenswrapper[4755]: I0317 00:58:29.590736 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnvwx" event={"ID":"81a3d0e5-026e-4c27-a489-8b4d83ace906","Type":"ContainerDied","Data":"2b9caa1315de1cf363c965e30c786fc69a8ce1d101df4f9399180f200d732ab3"} Mar 17 00:58:29 crc kubenswrapper[4755]: I0317 00:58:29.590753 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnvwx" event={"ID":"81a3d0e5-026e-4c27-a489-8b4d83ace906","Type":"ContainerStarted","Data":"6b363f7d8e02f7e102b571a91d80152b86686677da99ce97e999496c6d9fef7b"} Mar 17 00:58:30 crc kubenswrapper[4755]: I0317 00:58:30.614949 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnvwx" event={"ID":"81a3d0e5-026e-4c27-a489-8b4d83ace906","Type":"ContainerStarted","Data":"885a7f16fc2ca8d45007a90cc1c663083b1178d886dbec51c805b6d2ec640e5e"} Mar 17 00:58:32 crc kubenswrapper[4755]: I0317 00:58:32.640168 4755 generic.go:334] "Generic (PLEG): 
container finished" podID="81a3d0e5-026e-4c27-a489-8b4d83ace906" containerID="885a7f16fc2ca8d45007a90cc1c663083b1178d886dbec51c805b6d2ec640e5e" exitCode=0 Mar 17 00:58:32 crc kubenswrapper[4755]: I0317 00:58:32.640281 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnvwx" event={"ID":"81a3d0e5-026e-4c27-a489-8b4d83ace906","Type":"ContainerDied","Data":"885a7f16fc2ca8d45007a90cc1c663083b1178d886dbec51c805b6d2ec640e5e"} Mar 17 00:58:33 crc kubenswrapper[4755]: I0317 00:58:33.652794 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnvwx" event={"ID":"81a3d0e5-026e-4c27-a489-8b4d83ace906","Type":"ContainerStarted","Data":"2568d7f41d5ac4ca0d30cc9f50ca065d0ef43e4d9bfc0afe4316d401c1ab060f"} Mar 17 00:58:33 crc kubenswrapper[4755]: I0317 00:58:33.668519 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jnvwx" podStartSLOduration=2.061870912 podStartE2EDuration="5.668506604s" podCreationTimestamp="2026-03-17 00:58:28 +0000 UTC" firstStartedPulling="2026-03-17 00:58:29.592949126 +0000 UTC m=+2184.352401409" lastFinishedPulling="2026-03-17 00:58:33.199584778 +0000 UTC m=+2187.959037101" observedRunningTime="2026-03-17 00:58:33.667909897 +0000 UTC m=+2188.427362180" watchObservedRunningTime="2026-03-17 00:58:33.668506604 +0000 UTC m=+2188.427958887" Mar 17 00:58:38 crc kubenswrapper[4755]: I0317 00:58:38.527293 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jnvwx" Mar 17 00:58:38 crc kubenswrapper[4755]: I0317 00:58:38.527830 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jnvwx" Mar 17 00:58:38 crc kubenswrapper[4755]: I0317 00:58:38.602094 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-jnvwx" Mar 17 00:58:38 crc kubenswrapper[4755]: I0317 00:58:38.757762 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jnvwx" Mar 17 00:58:38 crc kubenswrapper[4755]: I0317 00:58:38.846200 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jnvwx"] Mar 17 00:58:40 crc kubenswrapper[4755]: I0317 00:58:40.724245 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jnvwx" podUID="81a3d0e5-026e-4c27-a489-8b4d83ace906" containerName="registry-server" containerID="cri-o://2568d7f41d5ac4ca0d30cc9f50ca065d0ef43e4d9bfc0afe4316d401c1ab060f" gracePeriod=2 Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.366464 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jnvwx" Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.494776 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81a3d0e5-026e-4c27-a489-8b4d83ace906-catalog-content\") pod \"81a3d0e5-026e-4c27-a489-8b4d83ace906\" (UID: \"81a3d0e5-026e-4c27-a489-8b4d83ace906\") " Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.494901 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7bk5\" (UniqueName: \"kubernetes.io/projected/81a3d0e5-026e-4c27-a489-8b4d83ace906-kube-api-access-h7bk5\") pod \"81a3d0e5-026e-4c27-a489-8b4d83ace906\" (UID: \"81a3d0e5-026e-4c27-a489-8b4d83ace906\") " Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.495278 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81a3d0e5-026e-4c27-a489-8b4d83ace906-utilities\") pod 
\"81a3d0e5-026e-4c27-a489-8b4d83ace906\" (UID: \"81a3d0e5-026e-4c27-a489-8b4d83ace906\") " Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.496195 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81a3d0e5-026e-4c27-a489-8b4d83ace906-utilities" (OuterVolumeSpecName: "utilities") pod "81a3d0e5-026e-4c27-a489-8b4d83ace906" (UID: "81a3d0e5-026e-4c27-a489-8b4d83ace906"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.501945 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81a3d0e5-026e-4c27-a489-8b4d83ace906-kube-api-access-h7bk5" (OuterVolumeSpecName: "kube-api-access-h7bk5") pod "81a3d0e5-026e-4c27-a489-8b4d83ace906" (UID: "81a3d0e5-026e-4c27-a489-8b4d83ace906"). InnerVolumeSpecName "kube-api-access-h7bk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.581104 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81a3d0e5-026e-4c27-a489-8b4d83ace906-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81a3d0e5-026e-4c27-a489-8b4d83ace906" (UID: "81a3d0e5-026e-4c27-a489-8b4d83ace906"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.598368 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7bk5\" (UniqueName: \"kubernetes.io/projected/81a3d0e5-026e-4c27-a489-8b4d83ace906-kube-api-access-h7bk5\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.598417 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81a3d0e5-026e-4c27-a489-8b4d83ace906-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.598429 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81a3d0e5-026e-4c27-a489-8b4d83ace906-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.738839 4755 generic.go:334] "Generic (PLEG): container finished" podID="81a3d0e5-026e-4c27-a489-8b4d83ace906" containerID="2568d7f41d5ac4ca0d30cc9f50ca065d0ef43e4d9bfc0afe4316d401c1ab060f" exitCode=0 Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.738901 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnvwx" event={"ID":"81a3d0e5-026e-4c27-a489-8b4d83ace906","Type":"ContainerDied","Data":"2568d7f41d5ac4ca0d30cc9f50ca065d0ef43e4d9bfc0afe4316d401c1ab060f"} Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.738946 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnvwx" event={"ID":"81a3d0e5-026e-4c27-a489-8b4d83ace906","Type":"ContainerDied","Data":"6b363f7d8e02f7e102b571a91d80152b86686677da99ce97e999496c6d9fef7b"} Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.738955 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jnvwx" Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.739000 4755 scope.go:117] "RemoveContainer" containerID="2568d7f41d5ac4ca0d30cc9f50ca065d0ef43e4d9bfc0afe4316d401c1ab060f" Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.783492 4755 scope.go:117] "RemoveContainer" containerID="885a7f16fc2ca8d45007a90cc1c663083b1178d886dbec51c805b6d2ec640e5e" Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.789945 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jnvwx"] Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.816960 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jnvwx"] Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.824920 4755 scope.go:117] "RemoveContainer" containerID="2b9caa1315de1cf363c965e30c786fc69a8ce1d101df4f9399180f200d732ab3" Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.874135 4755 scope.go:117] "RemoveContainer" containerID="2568d7f41d5ac4ca0d30cc9f50ca065d0ef43e4d9bfc0afe4316d401c1ab060f" Mar 17 00:58:41 crc kubenswrapper[4755]: E0317 00:58:41.874777 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2568d7f41d5ac4ca0d30cc9f50ca065d0ef43e4d9bfc0afe4316d401c1ab060f\": container with ID starting with 2568d7f41d5ac4ca0d30cc9f50ca065d0ef43e4d9bfc0afe4316d401c1ab060f not found: ID does not exist" containerID="2568d7f41d5ac4ca0d30cc9f50ca065d0ef43e4d9bfc0afe4316d401c1ab060f" Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.874848 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2568d7f41d5ac4ca0d30cc9f50ca065d0ef43e4d9bfc0afe4316d401c1ab060f"} err="failed to get container status \"2568d7f41d5ac4ca0d30cc9f50ca065d0ef43e4d9bfc0afe4316d401c1ab060f\": rpc error: code = NotFound desc = could not find 
container \"2568d7f41d5ac4ca0d30cc9f50ca065d0ef43e4d9bfc0afe4316d401c1ab060f\": container with ID starting with 2568d7f41d5ac4ca0d30cc9f50ca065d0ef43e4d9bfc0afe4316d401c1ab060f not found: ID does not exist" Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.874887 4755 scope.go:117] "RemoveContainer" containerID="885a7f16fc2ca8d45007a90cc1c663083b1178d886dbec51c805b6d2ec640e5e" Mar 17 00:58:41 crc kubenswrapper[4755]: E0317 00:58:41.875324 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"885a7f16fc2ca8d45007a90cc1c663083b1178d886dbec51c805b6d2ec640e5e\": container with ID starting with 885a7f16fc2ca8d45007a90cc1c663083b1178d886dbec51c805b6d2ec640e5e not found: ID does not exist" containerID="885a7f16fc2ca8d45007a90cc1c663083b1178d886dbec51c805b6d2ec640e5e" Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.875361 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"885a7f16fc2ca8d45007a90cc1c663083b1178d886dbec51c805b6d2ec640e5e"} err="failed to get container status \"885a7f16fc2ca8d45007a90cc1c663083b1178d886dbec51c805b6d2ec640e5e\": rpc error: code = NotFound desc = could not find container \"885a7f16fc2ca8d45007a90cc1c663083b1178d886dbec51c805b6d2ec640e5e\": container with ID starting with 885a7f16fc2ca8d45007a90cc1c663083b1178d886dbec51c805b6d2ec640e5e not found: ID does not exist" Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.875387 4755 scope.go:117] "RemoveContainer" containerID="2b9caa1315de1cf363c965e30c786fc69a8ce1d101df4f9399180f200d732ab3" Mar 17 00:58:41 crc kubenswrapper[4755]: E0317 00:58:41.875871 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b9caa1315de1cf363c965e30c786fc69a8ce1d101df4f9399180f200d732ab3\": container with ID starting with 2b9caa1315de1cf363c965e30c786fc69a8ce1d101df4f9399180f200d732ab3 not found: ID does 
not exist" containerID="2b9caa1315de1cf363c965e30c786fc69a8ce1d101df4f9399180f200d732ab3" Mar 17 00:58:41 crc kubenswrapper[4755]: I0317 00:58:41.875934 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b9caa1315de1cf363c965e30c786fc69a8ce1d101df4f9399180f200d732ab3"} err="failed to get container status \"2b9caa1315de1cf363c965e30c786fc69a8ce1d101df4f9399180f200d732ab3\": rpc error: code = NotFound desc = could not find container \"2b9caa1315de1cf363c965e30c786fc69a8ce1d101df4f9399180f200d732ab3\": container with ID starting with 2b9caa1315de1cf363c965e30c786fc69a8ce1d101df4f9399180f200d732ab3 not found: ID does not exist" Mar 17 00:58:42 crc kubenswrapper[4755]: I0317 00:58:42.266588 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81a3d0e5-026e-4c27-a489-8b4d83ace906" path="/var/lib/kubelet/pods/81a3d0e5-026e-4c27-a489-8b4d83ace906/volumes" Mar 17 00:58:59 crc kubenswrapper[4755]: I0317 00:58:59.607419 4755 scope.go:117] "RemoveContainer" containerID="8e88ed50bc220ab48095cae2a293f1c6ecf6d643fe25dc678eac66a657411f4e" Mar 17 00:59:16 crc kubenswrapper[4755]: I0317 00:59:16.134742 4755 generic.go:334] "Generic (PLEG): container finished" podID="5e8b58f4-3072-450a-afae-2d18d9f34848" containerID="4ffb41207329bc493c506d401a34a469e551a8f2e864aa30c6cfd892d7c7b349" exitCode=0 Mar 17 00:59:16 crc kubenswrapper[4755]: I0317 00:59:16.134822 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" event={"ID":"5e8b58f4-3072-450a-afae-2d18d9f34848","Type":"ContainerDied","Data":"4ffb41207329bc493c506d401a34a469e551a8f2e864aa30c6cfd892d7c7b349"} Mar 17 00:59:17 crc kubenswrapper[4755]: I0317 00:59:17.640017 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:59:17 crc kubenswrapper[4755]: I0317 00:59:17.741119 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5e8b58f4-3072-450a-afae-2d18d9f34848-ovncontroller-config-0\") pod \"5e8b58f4-3072-450a-afae-2d18d9f34848\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " Mar 17 00:59:17 crc kubenswrapper[4755]: I0317 00:59:17.741210 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-inventory\") pod \"5e8b58f4-3072-450a-afae-2d18d9f34848\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " Mar 17 00:59:17 crc kubenswrapper[4755]: I0317 00:59:17.741242 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-ssh-key-openstack-edpm-ipam\") pod \"5e8b58f4-3072-450a-afae-2d18d9f34848\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " Mar 17 00:59:17 crc kubenswrapper[4755]: I0317 00:59:17.741398 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-ovn-combined-ca-bundle\") pod \"5e8b58f4-3072-450a-afae-2d18d9f34848\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " Mar 17 00:59:17 crc kubenswrapper[4755]: I0317 00:59:17.741522 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz5w5\" (UniqueName: \"kubernetes.io/projected/5e8b58f4-3072-450a-afae-2d18d9f34848-kube-api-access-hz5w5\") pod \"5e8b58f4-3072-450a-afae-2d18d9f34848\" (UID: \"5e8b58f4-3072-450a-afae-2d18d9f34848\") " Mar 17 00:59:17 crc kubenswrapper[4755]: I0317 00:59:17.747273 4755 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "5e8b58f4-3072-450a-afae-2d18d9f34848" (UID: "5e8b58f4-3072-450a-afae-2d18d9f34848"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:59:17 crc kubenswrapper[4755]: I0317 00:59:17.747674 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e8b58f4-3072-450a-afae-2d18d9f34848-kube-api-access-hz5w5" (OuterVolumeSpecName: "kube-api-access-hz5w5") pod "5e8b58f4-3072-450a-afae-2d18d9f34848" (UID: "5e8b58f4-3072-450a-afae-2d18d9f34848"). InnerVolumeSpecName "kube-api-access-hz5w5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 00:59:17 crc kubenswrapper[4755]: I0317 00:59:17.773458 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-inventory" (OuterVolumeSpecName: "inventory") pod "5e8b58f4-3072-450a-afae-2d18d9f34848" (UID: "5e8b58f4-3072-450a-afae-2d18d9f34848"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:59:17 crc kubenswrapper[4755]: I0317 00:59:17.774539 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5e8b58f4-3072-450a-afae-2d18d9f34848" (UID: "5e8b58f4-3072-450a-afae-2d18d9f34848"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 00:59:17 crc kubenswrapper[4755]: I0317 00:59:17.776301 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e8b58f4-3072-450a-afae-2d18d9f34848-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "5e8b58f4-3072-450a-afae-2d18d9f34848" (UID: "5e8b58f4-3072-450a-afae-2d18d9f34848"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 00:59:17 crc kubenswrapper[4755]: I0317 00:59:17.843534 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz5w5\" (UniqueName: \"kubernetes.io/projected/5e8b58f4-3072-450a-afae-2d18d9f34848-kube-api-access-hz5w5\") on node \"crc\" DevicePath \"\"" Mar 17 00:59:17 crc kubenswrapper[4755]: I0317 00:59:17.843575 4755 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5e8b58f4-3072-450a-afae-2d18d9f34848-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 17 00:59:17 crc kubenswrapper[4755]: I0317 00:59:17.843604 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 00:59:17 crc kubenswrapper[4755]: I0317 00:59:17.843617 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 00:59:17 crc kubenswrapper[4755]: I0317 00:59:17.843628 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e8b58f4-3072-450a-afae-2d18d9f34848-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.164029 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" event={"ID":"5e8b58f4-3072-450a-afae-2d18d9f34848","Type":"ContainerDied","Data":"802b748880f518686cafa0be9f387b2498b781fe44c740594e10df71186aa0f0"} Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.164085 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="802b748880f518686cafa0be9f387b2498b781fe44c740594e10df71186aa0f0" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.164139 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.314818 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9"] Mar 17 00:59:18 crc kubenswrapper[4755]: E0317 00:59:18.315288 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81a3d0e5-026e-4c27-a489-8b4d83ace906" containerName="registry-server" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.315315 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="81a3d0e5-026e-4c27-a489-8b4d83ace906" containerName="registry-server" Mar 17 00:59:18 crc kubenswrapper[4755]: E0317 00:59:18.315341 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81a3d0e5-026e-4c27-a489-8b4d83ace906" containerName="extract-utilities" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.315350 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="81a3d0e5-026e-4c27-a489-8b4d83ace906" containerName="extract-utilities" Mar 17 00:59:18 crc kubenswrapper[4755]: E0317 00:59:18.315378 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e8b58f4-3072-450a-afae-2d18d9f34848" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.315386 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5e8b58f4-3072-450a-afae-2d18d9f34848" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 17 00:59:18 crc kubenswrapper[4755]: E0317 00:59:18.315410 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81a3d0e5-026e-4c27-a489-8b4d83ace906" containerName="extract-content" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.315419 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="81a3d0e5-026e-4c27-a489-8b4d83ace906" containerName="extract-content" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.315733 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="81a3d0e5-026e-4c27-a489-8b4d83ace906" containerName="registry-server" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.315758 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e8b58f4-3072-450a-afae-2d18d9f34848" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.316626 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.320744 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.320965 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.321130 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.321416 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.321423 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.336140 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9"] Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.355775 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.356103 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.356185 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.356223 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.356575 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnzz9\" (UniqueName: \"kubernetes.io/projected/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-kube-api-access-bnzz9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.459201 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnzz9\" (UniqueName: \"kubernetes.io/projected/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-kube-api-access-bnzz9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.459570 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.459612 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.459661 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.461010 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.464412 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9\" (UID: 
\"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.465107 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.466101 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.478528 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.485200 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnzz9\" (UniqueName: \"kubernetes.io/projected/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-kube-api-access-bnzz9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 00:59:18 crc kubenswrapper[4755]: I0317 00:59:18.641146 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 00:59:19 crc kubenswrapper[4755]: I0317 00:59:19.339949 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9"] Mar 17 00:59:20 crc kubenswrapper[4755]: I0317 00:59:20.188897 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" event={"ID":"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287","Type":"ContainerStarted","Data":"c7663c2979e2d3a0d39f2275d30faa4d4e8307d76f5b4685af29145d2f861e9d"} Mar 17 00:59:21 crc kubenswrapper[4755]: I0317 00:59:21.205769 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" event={"ID":"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287","Type":"ContainerStarted","Data":"d6bea1b8db6d043636a77e356d58b2f74f2aa3ece2c18eac6a510df385d2f3be"} Mar 17 00:59:21 crc kubenswrapper[4755]: I0317 00:59:21.230798 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" podStartSLOduration=2.637467818 podStartE2EDuration="3.230774118s" podCreationTimestamp="2026-03-17 00:59:18 +0000 UTC" firstStartedPulling="2026-03-17 00:59:19.352073772 +0000 UTC m=+2234.111526055" lastFinishedPulling="2026-03-17 00:59:19.945380042 +0000 UTC m=+2234.704832355" observedRunningTime="2026-03-17 00:59:21.229300288 +0000 UTC m=+2235.988752611" watchObservedRunningTime="2026-03-17 00:59:21.230774118 +0000 UTC m=+2235.990226431" Mar 17 00:59:38 crc kubenswrapper[4755]: I0317 00:59:38.063150 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-c6hnq"] Mar 17 00:59:38 crc kubenswrapper[4755]: I0317 00:59:38.072105 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-c6hnq"] Mar 17 00:59:38 crc kubenswrapper[4755]: I0317 00:59:38.264595 4755 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="6021f21e-d701-4372-aac5-e70591f71906" path="/var/lib/kubelet/pods/6021f21e-d701-4372-aac5-e70591f71906/volumes" Mar 17 00:59:59 crc kubenswrapper[4755]: I0317 00:59:59.799204 4755 scope.go:117] "RemoveContainer" containerID="63cff2652495ebfc7a80c379d9d370b7cfc8d9468d114d18fe4c328348131366" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.154466 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561820-j9wxp"] Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.156623 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561820-j9wxp" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.158507 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.159362 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.160141 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.174752 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561820-j9wxp"] Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.188470 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf"] Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.190177 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.215206 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.216551 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.235312 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf"] Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.282447 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f36aaeb7-f107-43d2-9e43-11378467f808-config-volume\") pod \"collect-profiles-29561820-4z5nf\" (UID: \"f36aaeb7-f107-43d2-9e43-11378467f808\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.282516 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvnxx\" (UniqueName: \"kubernetes.io/projected/f36aaeb7-f107-43d2-9e43-11378467f808-kube-api-access-xvnxx\") pod \"collect-profiles-29561820-4z5nf\" (UID: \"f36aaeb7-f107-43d2-9e43-11378467f808\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.282590 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvv8t\" (UniqueName: \"kubernetes.io/projected/b5238f7e-f046-4dbb-9433-78786791d6cd-kube-api-access-dvv8t\") pod \"auto-csr-approver-29561820-j9wxp\" (UID: \"b5238f7e-f046-4dbb-9433-78786791d6cd\") " 
pod="openshift-infra/auto-csr-approver-29561820-j9wxp" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.282649 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f36aaeb7-f107-43d2-9e43-11378467f808-secret-volume\") pod \"collect-profiles-29561820-4z5nf\" (UID: \"f36aaeb7-f107-43d2-9e43-11378467f808\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.384352 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvnxx\" (UniqueName: \"kubernetes.io/projected/f36aaeb7-f107-43d2-9e43-11378467f808-kube-api-access-xvnxx\") pod \"collect-profiles-29561820-4z5nf\" (UID: \"f36aaeb7-f107-43d2-9e43-11378467f808\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.384511 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvv8t\" (UniqueName: \"kubernetes.io/projected/b5238f7e-f046-4dbb-9433-78786791d6cd-kube-api-access-dvv8t\") pod \"auto-csr-approver-29561820-j9wxp\" (UID: \"b5238f7e-f046-4dbb-9433-78786791d6cd\") " pod="openshift-infra/auto-csr-approver-29561820-j9wxp" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.384642 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f36aaeb7-f107-43d2-9e43-11378467f808-secret-volume\") pod \"collect-profiles-29561820-4z5nf\" (UID: \"f36aaeb7-f107-43d2-9e43-11378467f808\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.384817 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/f36aaeb7-f107-43d2-9e43-11378467f808-config-volume\") pod \"collect-profiles-29561820-4z5nf\" (UID: \"f36aaeb7-f107-43d2-9e43-11378467f808\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.386399 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f36aaeb7-f107-43d2-9e43-11378467f808-config-volume\") pod \"collect-profiles-29561820-4z5nf\" (UID: \"f36aaeb7-f107-43d2-9e43-11378467f808\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.392379 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f36aaeb7-f107-43d2-9e43-11378467f808-secret-volume\") pod \"collect-profiles-29561820-4z5nf\" (UID: \"f36aaeb7-f107-43d2-9e43-11378467f808\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.402533 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvnxx\" (UniqueName: \"kubernetes.io/projected/f36aaeb7-f107-43d2-9e43-11378467f808-kube-api-access-xvnxx\") pod \"collect-profiles-29561820-4z5nf\" (UID: \"f36aaeb7-f107-43d2-9e43-11378467f808\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.416173 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvv8t\" (UniqueName: \"kubernetes.io/projected/b5238f7e-f046-4dbb-9433-78786791d6cd-kube-api-access-dvv8t\") pod \"auto-csr-approver-29561820-j9wxp\" (UID: \"b5238f7e-f046-4dbb-9433-78786791d6cd\") " pod="openshift-infra/auto-csr-approver-29561820-j9wxp" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.529302 4755 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561820-j9wxp" Mar 17 01:00:00 crc kubenswrapper[4755]: I0317 01:00:00.545367 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf" Mar 17 01:00:01 crc kubenswrapper[4755]: I0317 01:00:01.107888 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf"] Mar 17 01:00:01 crc kubenswrapper[4755]: I0317 01:00:01.126096 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561820-j9wxp"] Mar 17 01:00:01 crc kubenswrapper[4755]: W0317 01:00:01.146724 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5238f7e_f046_4dbb_9433_78786791d6cd.slice/crio-f611e7f1d1c9b54a0eb5864730954f567e155f84319e5a062d0d500bc3b696f2 WatchSource:0}: Error finding container f611e7f1d1c9b54a0eb5864730954f567e155f84319e5a062d0d500bc3b696f2: Status 404 returned error can't find the container with id f611e7f1d1c9b54a0eb5864730954f567e155f84319e5a062d0d500bc3b696f2 Mar 17 01:00:01 crc kubenswrapper[4755]: I0317 01:00:01.768873 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf" event={"ID":"f36aaeb7-f107-43d2-9e43-11378467f808","Type":"ContainerStarted","Data":"ad9810ca0a1786673aacf6309ef48c0e2ec3687c8e8272f23c3fe3081066f72f"} Mar 17 01:00:01 crc kubenswrapper[4755]: I0317 01:00:01.769239 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf" event={"ID":"f36aaeb7-f107-43d2-9e43-11378467f808","Type":"ContainerStarted","Data":"a1921fefaa0419a26aa110193a2d8597531eb81eb7b498241b55fcb258dd5d3f"} Mar 17 01:00:01 crc kubenswrapper[4755]: I0317 01:00:01.770604 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561820-j9wxp" event={"ID":"b5238f7e-f046-4dbb-9433-78786791d6cd","Type":"ContainerStarted","Data":"f611e7f1d1c9b54a0eb5864730954f567e155f84319e5a062d0d500bc3b696f2"} Mar 17 01:00:02 crc kubenswrapper[4755]: I0317 01:00:02.785394 4755 generic.go:334] "Generic (PLEG): container finished" podID="f36aaeb7-f107-43d2-9e43-11378467f808" containerID="ad9810ca0a1786673aacf6309ef48c0e2ec3687c8e8272f23c3fe3081066f72f" exitCode=0 Mar 17 01:00:02 crc kubenswrapper[4755]: I0317 01:00:02.785560 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf" event={"ID":"f36aaeb7-f107-43d2-9e43-11378467f808","Type":"ContainerDied","Data":"ad9810ca0a1786673aacf6309ef48c0e2ec3687c8e8272f23c3fe3081066f72f"} Mar 17 01:00:04 crc kubenswrapper[4755]: I0317 01:00:04.267896 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf" Mar 17 01:00:04 crc kubenswrapper[4755]: I0317 01:00:04.382677 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f36aaeb7-f107-43d2-9e43-11378467f808-secret-volume\") pod \"f36aaeb7-f107-43d2-9e43-11378467f808\" (UID: \"f36aaeb7-f107-43d2-9e43-11378467f808\") " Mar 17 01:00:04 crc kubenswrapper[4755]: I0317 01:00:04.383088 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f36aaeb7-f107-43d2-9e43-11378467f808-config-volume\") pod \"f36aaeb7-f107-43d2-9e43-11378467f808\" (UID: \"f36aaeb7-f107-43d2-9e43-11378467f808\") " Mar 17 01:00:04 crc kubenswrapper[4755]: I0317 01:00:04.383164 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvnxx\" (UniqueName: 
\"kubernetes.io/projected/f36aaeb7-f107-43d2-9e43-11378467f808-kube-api-access-xvnxx\") pod \"f36aaeb7-f107-43d2-9e43-11378467f808\" (UID: \"f36aaeb7-f107-43d2-9e43-11378467f808\") " Mar 17 01:00:04 crc kubenswrapper[4755]: I0317 01:00:04.384184 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f36aaeb7-f107-43d2-9e43-11378467f808-config-volume" (OuterVolumeSpecName: "config-volume") pod "f36aaeb7-f107-43d2-9e43-11378467f808" (UID: "f36aaeb7-f107-43d2-9e43-11378467f808"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:00:04 crc kubenswrapper[4755]: I0317 01:00:04.388989 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f36aaeb7-f107-43d2-9e43-11378467f808-kube-api-access-xvnxx" (OuterVolumeSpecName: "kube-api-access-xvnxx") pod "f36aaeb7-f107-43d2-9e43-11378467f808" (UID: "f36aaeb7-f107-43d2-9e43-11378467f808"). InnerVolumeSpecName "kube-api-access-xvnxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:00:04 crc kubenswrapper[4755]: I0317 01:00:04.389770 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f36aaeb7-f107-43d2-9e43-11378467f808-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f36aaeb7-f107-43d2-9e43-11378467f808" (UID: "f36aaeb7-f107-43d2-9e43-11378467f808"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:00:04 crc kubenswrapper[4755]: I0317 01:00:04.486211 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f36aaeb7-f107-43d2-9e43-11378467f808-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 01:00:04 crc kubenswrapper[4755]: I0317 01:00:04.486248 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f36aaeb7-f107-43d2-9e43-11378467f808-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 01:00:04 crc kubenswrapper[4755]: I0317 01:00:04.486262 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvnxx\" (UniqueName: \"kubernetes.io/projected/f36aaeb7-f107-43d2-9e43-11378467f808-kube-api-access-xvnxx\") on node \"crc\" DevicePath \"\"" Mar 17 01:00:04 crc kubenswrapper[4755]: I0317 01:00:04.812524 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf" event={"ID":"f36aaeb7-f107-43d2-9e43-11378467f808","Type":"ContainerDied","Data":"a1921fefaa0419a26aa110193a2d8597531eb81eb7b498241b55fcb258dd5d3f"} Mar 17 01:00:04 crc kubenswrapper[4755]: I0317 01:00:04.812868 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1921fefaa0419a26aa110193a2d8597531eb81eb7b498241b55fcb258dd5d3f" Mar 17 01:00:04 crc kubenswrapper[4755]: I0317 01:00:04.813013 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf" Mar 17 01:00:04 crc kubenswrapper[4755]: I0317 01:00:04.836002 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561820-j9wxp" podStartSLOduration=1.567706915 podStartE2EDuration="4.835977521s" podCreationTimestamp="2026-03-17 01:00:00 +0000 UTC" firstStartedPulling="2026-03-17 01:00:01.165479412 +0000 UTC m=+2275.924931695" lastFinishedPulling="2026-03-17 01:00:04.433750018 +0000 UTC m=+2279.193202301" observedRunningTime="2026-03-17 01:00:04.830970394 +0000 UTC m=+2279.590422687" watchObservedRunningTime="2026-03-17 01:00:04.835977521 +0000 UTC m=+2279.595429834" Mar 17 01:00:04 crc kubenswrapper[4755]: I0317 01:00:04.902423 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj"] Mar 17 01:00:04 crc kubenswrapper[4755]: I0317 01:00:04.910659 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561775-fpcjj"] Mar 17 01:00:05 crc kubenswrapper[4755]: I0317 01:00:05.826813 4755 generic.go:334] "Generic (PLEG): container finished" podID="b5238f7e-f046-4dbb-9433-78786791d6cd" containerID="3bb60e4b9a3eb3996650943fc22d8597d874a9c45e38bfbe9fae28c5b1430458" exitCode=0 Mar 17 01:00:05 crc kubenswrapper[4755]: I0317 01:00:05.826904 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561820-j9wxp" event={"ID":"b5238f7e-f046-4dbb-9433-78786791d6cd","Type":"ContainerDied","Data":"3bb60e4b9a3eb3996650943fc22d8597d874a9c45e38bfbe9fae28c5b1430458"} Mar 17 01:00:06 crc kubenswrapper[4755]: I0317 01:00:06.274325 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfcd13b3-4ede-42eb-8b04-b2d572f7f64c" path="/var/lib/kubelet/pods/cfcd13b3-4ede-42eb-8b04-b2d572f7f64c/volumes" Mar 17 01:00:07 crc kubenswrapper[4755]: I0317 
01:00:07.284552 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561820-j9wxp" Mar 17 01:00:07 crc kubenswrapper[4755]: I0317 01:00:07.373725 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvv8t\" (UniqueName: \"kubernetes.io/projected/b5238f7e-f046-4dbb-9433-78786791d6cd-kube-api-access-dvv8t\") pod \"b5238f7e-f046-4dbb-9433-78786791d6cd\" (UID: \"b5238f7e-f046-4dbb-9433-78786791d6cd\") " Mar 17 01:00:07 crc kubenswrapper[4755]: I0317 01:00:07.382880 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5238f7e-f046-4dbb-9433-78786791d6cd-kube-api-access-dvv8t" (OuterVolumeSpecName: "kube-api-access-dvv8t") pod "b5238f7e-f046-4dbb-9433-78786791d6cd" (UID: "b5238f7e-f046-4dbb-9433-78786791d6cd"). InnerVolumeSpecName "kube-api-access-dvv8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:00:07 crc kubenswrapper[4755]: I0317 01:00:07.477087 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvv8t\" (UniqueName: \"kubernetes.io/projected/b5238f7e-f046-4dbb-9433-78786791d6cd-kube-api-access-dvv8t\") on node \"crc\" DevicePath \"\"" Mar 17 01:00:07 crc kubenswrapper[4755]: I0317 01:00:07.847600 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561820-j9wxp" event={"ID":"b5238f7e-f046-4dbb-9433-78786791d6cd","Type":"ContainerDied","Data":"f611e7f1d1c9b54a0eb5864730954f567e155f84319e5a062d0d500bc3b696f2"} Mar 17 01:00:07 crc kubenswrapper[4755]: I0317 01:00:07.847872 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f611e7f1d1c9b54a0eb5864730954f567e155f84319e5a062d0d500bc3b696f2" Mar 17 01:00:07 crc kubenswrapper[4755]: I0317 01:00:07.847667 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561820-j9wxp" Mar 17 01:00:07 crc kubenswrapper[4755]: I0317 01:00:07.917331 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561814-v79pm"] Mar 17 01:00:07 crc kubenswrapper[4755]: I0317 01:00:07.926572 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561814-v79pm"] Mar 17 01:00:08 crc kubenswrapper[4755]: I0317 01:00:08.275131 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9477c448-33bb-4e17-8f19-098bc4e134e4" path="/var/lib/kubelet/pods/9477c448-33bb-4e17-8f19-098bc4e134e4/volumes" Mar 17 01:00:21 crc kubenswrapper[4755]: I0317 01:00:21.103874 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-q9ck4"] Mar 17 01:00:21 crc kubenswrapper[4755]: I0317 01:00:21.114774 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-q9ck4"] Mar 17 01:00:22 crc kubenswrapper[4755]: I0317 01:00:22.269981 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3329d5af-f897-4463-b17d-cbe601800d3c" path="/var/lib/kubelet/pods/3329d5af-f897-4463-b17d-cbe601800d3c/volumes" Mar 17 01:00:58 crc kubenswrapper[4755]: I0317 01:00:58.664702 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:00:58 crc kubenswrapper[4755]: I0317 01:00:58.665340 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:00:59 
crc kubenswrapper[4755]: I0317 01:00:59.919218 4755 scope.go:117] "RemoveContainer" containerID="9a654e3714e51944c58cac1513b653fbf3b4c1dad674e0d21acc4fc6a5054c7e" Mar 17 01:00:59 crc kubenswrapper[4755]: I0317 01:00:59.989775 4755 scope.go:117] "RemoveContainer" containerID="d71b4e9571c10e4c34ea754155f6f4cc99d55b9dfd5691b348f19d0356ad3102" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.060032 4755 scope.go:117] "RemoveContainer" containerID="70e4c1def5018c3b2f625393e2a282b6b3c37f06d9f5aa26002edce2d6f2d7e1" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.190644 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29561821-f7mn2"] Mar 17 01:01:00 crc kubenswrapper[4755]: E0317 01:01:00.191177 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f36aaeb7-f107-43d2-9e43-11378467f808" containerName="collect-profiles" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.191195 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f36aaeb7-f107-43d2-9e43-11378467f808" containerName="collect-profiles" Mar 17 01:01:00 crc kubenswrapper[4755]: E0317 01:01:00.191211 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5238f7e-f046-4dbb-9433-78786791d6cd" containerName="oc" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.191219 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5238f7e-f046-4dbb-9433-78786791d6cd" containerName="oc" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.191400 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5238f7e-f046-4dbb-9433-78786791d6cd" containerName="oc" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.191412 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f36aaeb7-f107-43d2-9e43-11378467f808" containerName="collect-profiles" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.192166 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29561821-f7mn2" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.202150 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29561821-f7mn2"] Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.334762 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-config-data\") pod \"keystone-cron-29561821-f7mn2\" (UID: \"fa97c87e-b133-48bb-af65-092be28ffca7\") " pod="openstack/keystone-cron-29561821-f7mn2" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.335059 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-combined-ca-bundle\") pod \"keystone-cron-29561821-f7mn2\" (UID: \"fa97c87e-b133-48bb-af65-092be28ffca7\") " pod="openstack/keystone-cron-29561821-f7mn2" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.335155 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-fernet-keys\") pod \"keystone-cron-29561821-f7mn2\" (UID: \"fa97c87e-b133-48bb-af65-092be28ffca7\") " pod="openstack/keystone-cron-29561821-f7mn2" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.335243 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xq4m\" (UniqueName: \"kubernetes.io/projected/fa97c87e-b133-48bb-af65-092be28ffca7-kube-api-access-8xq4m\") pod \"keystone-cron-29561821-f7mn2\" (UID: \"fa97c87e-b133-48bb-af65-092be28ffca7\") " pod="openstack/keystone-cron-29561821-f7mn2" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.436989 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-config-data\") pod \"keystone-cron-29561821-f7mn2\" (UID: \"fa97c87e-b133-48bb-af65-092be28ffca7\") " pod="openstack/keystone-cron-29561821-f7mn2" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.437690 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-combined-ca-bundle\") pod \"keystone-cron-29561821-f7mn2\" (UID: \"fa97c87e-b133-48bb-af65-092be28ffca7\") " pod="openstack/keystone-cron-29561821-f7mn2" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.437977 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-fernet-keys\") pod \"keystone-cron-29561821-f7mn2\" (UID: \"fa97c87e-b133-48bb-af65-092be28ffca7\") " pod="openstack/keystone-cron-29561821-f7mn2" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.438220 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xq4m\" (UniqueName: \"kubernetes.io/projected/fa97c87e-b133-48bb-af65-092be28ffca7-kube-api-access-8xq4m\") pod \"keystone-cron-29561821-f7mn2\" (UID: \"fa97c87e-b133-48bb-af65-092be28ffca7\") " pod="openstack/keystone-cron-29561821-f7mn2" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.447004 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-fernet-keys\") pod \"keystone-cron-29561821-f7mn2\" (UID: \"fa97c87e-b133-48bb-af65-092be28ffca7\") " pod="openstack/keystone-cron-29561821-f7mn2" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.447269 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-combined-ca-bundle\") pod \"keystone-cron-29561821-f7mn2\" (UID: \"fa97c87e-b133-48bb-af65-092be28ffca7\") " pod="openstack/keystone-cron-29561821-f7mn2" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.447517 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-config-data\") pod \"keystone-cron-29561821-f7mn2\" (UID: \"fa97c87e-b133-48bb-af65-092be28ffca7\") " pod="openstack/keystone-cron-29561821-f7mn2" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.463193 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xq4m\" (UniqueName: \"kubernetes.io/projected/fa97c87e-b133-48bb-af65-092be28ffca7-kube-api-access-8xq4m\") pod \"keystone-cron-29561821-f7mn2\" (UID: \"fa97c87e-b133-48bb-af65-092be28ffca7\") " pod="openstack/keystone-cron-29561821-f7mn2" Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.547563 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29561821-f7mn2" Mar 17 01:01:00 crc kubenswrapper[4755]: W0317 01:01:00.866799 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa97c87e_b133_48bb_af65_092be28ffca7.slice/crio-0f06130c4df5501ab13878572f02d393c60d8fdf710fce561c218560237fceee WatchSource:0}: Error finding container 0f06130c4df5501ab13878572f02d393c60d8fdf710fce561c218560237fceee: Status 404 returned error can't find the container with id 0f06130c4df5501ab13878572f02d393c60d8fdf710fce561c218560237fceee Mar 17 01:01:00 crc kubenswrapper[4755]: I0317 01:01:00.869090 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29561821-f7mn2"] Mar 17 01:01:01 crc kubenswrapper[4755]: I0317 01:01:01.544666 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29561821-f7mn2" event={"ID":"fa97c87e-b133-48bb-af65-092be28ffca7","Type":"ContainerStarted","Data":"8d7ed69e060ad9ed3d676d183c9c1671cd04383c22529fce0bb537678d95dbeb"} Mar 17 01:01:01 crc kubenswrapper[4755]: I0317 01:01:01.545635 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29561821-f7mn2" event={"ID":"fa97c87e-b133-48bb-af65-092be28ffca7","Type":"ContainerStarted","Data":"0f06130c4df5501ab13878572f02d393c60d8fdf710fce561c218560237fceee"} Mar 17 01:01:01 crc kubenswrapper[4755]: I0317 01:01:01.589404 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29561821-f7mn2" podStartSLOduration=1.589376734 podStartE2EDuration="1.589376734s" podCreationTimestamp="2026-03-17 01:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:01:01.576288247 +0000 UTC m=+2336.335740570" watchObservedRunningTime="2026-03-17 01:01:01.589376734 +0000 UTC m=+2336.348829057" Mar 17 01:01:03 crc 
kubenswrapper[4755]: I0317 01:01:03.578286 4755 generic.go:334] "Generic (PLEG): container finished" podID="fa97c87e-b133-48bb-af65-092be28ffca7" containerID="8d7ed69e060ad9ed3d676d183c9c1671cd04383c22529fce0bb537678d95dbeb" exitCode=0 Mar 17 01:01:03 crc kubenswrapper[4755]: I0317 01:01:03.578392 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29561821-f7mn2" event={"ID":"fa97c87e-b133-48bb-af65-092be28ffca7","Type":"ContainerDied","Data":"8d7ed69e060ad9ed3d676d183c9c1671cd04383c22529fce0bb537678d95dbeb"} Mar 17 01:01:05 crc kubenswrapper[4755]: I0317 01:01:05.096980 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29561821-f7mn2" Mar 17 01:01:05 crc kubenswrapper[4755]: I0317 01:01:05.263412 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-combined-ca-bundle\") pod \"fa97c87e-b133-48bb-af65-092be28ffca7\" (UID: \"fa97c87e-b133-48bb-af65-092be28ffca7\") " Mar 17 01:01:05 crc kubenswrapper[4755]: I0317 01:01:05.263691 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-config-data\") pod \"fa97c87e-b133-48bb-af65-092be28ffca7\" (UID: \"fa97c87e-b133-48bb-af65-092be28ffca7\") " Mar 17 01:01:05 crc kubenswrapper[4755]: I0317 01:01:05.263732 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xq4m\" (UniqueName: \"kubernetes.io/projected/fa97c87e-b133-48bb-af65-092be28ffca7-kube-api-access-8xq4m\") pod \"fa97c87e-b133-48bb-af65-092be28ffca7\" (UID: \"fa97c87e-b133-48bb-af65-092be28ffca7\") " Mar 17 01:01:05 crc kubenswrapper[4755]: I0317 01:01:05.263930 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-fernet-keys\") pod \"fa97c87e-b133-48bb-af65-092be28ffca7\" (UID: \"fa97c87e-b133-48bb-af65-092be28ffca7\") " Mar 17 01:01:05 crc kubenswrapper[4755]: I0317 01:01:05.278793 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fa97c87e-b133-48bb-af65-092be28ffca7" (UID: "fa97c87e-b133-48bb-af65-092be28ffca7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:01:05 crc kubenswrapper[4755]: I0317 01:01:05.278822 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa97c87e-b133-48bb-af65-092be28ffca7-kube-api-access-8xq4m" (OuterVolumeSpecName: "kube-api-access-8xq4m") pod "fa97c87e-b133-48bb-af65-092be28ffca7" (UID: "fa97c87e-b133-48bb-af65-092be28ffca7"). InnerVolumeSpecName "kube-api-access-8xq4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:01:05 crc kubenswrapper[4755]: I0317 01:01:05.291781 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa97c87e-b133-48bb-af65-092be28ffca7" (UID: "fa97c87e-b133-48bb-af65-092be28ffca7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:01:05 crc kubenswrapper[4755]: I0317 01:01:05.315305 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-config-data" (OuterVolumeSpecName: "config-data") pod "fa97c87e-b133-48bb-af65-092be28ffca7" (UID: "fa97c87e-b133-48bb-af65-092be28ffca7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:01:05 crc kubenswrapper[4755]: I0317 01:01:05.366877 4755 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 17 01:01:05 crc kubenswrapper[4755]: I0317 01:01:05.366918 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:01:05 crc kubenswrapper[4755]: I0317 01:01:05.366934 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa97c87e-b133-48bb-af65-092be28ffca7-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:01:05 crc kubenswrapper[4755]: I0317 01:01:05.366945 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xq4m\" (UniqueName: \"kubernetes.io/projected/fa97c87e-b133-48bb-af65-092be28ffca7-kube-api-access-8xq4m\") on node \"crc\" DevicePath \"\"" Mar 17 01:01:05 crc kubenswrapper[4755]: I0317 01:01:05.604779 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29561821-f7mn2" event={"ID":"fa97c87e-b133-48bb-af65-092be28ffca7","Type":"ContainerDied","Data":"0f06130c4df5501ab13878572f02d393c60d8fdf710fce561c218560237fceee"} Mar 17 01:01:05 crc kubenswrapper[4755]: I0317 01:01:05.605013 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f06130c4df5501ab13878572f02d393c60d8fdf710fce561c218560237fceee" Mar 17 01:01:05 crc kubenswrapper[4755]: I0317 01:01:05.604826 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29561821-f7mn2" Mar 17 01:01:28 crc kubenswrapper[4755]: I0317 01:01:28.665088 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:01:28 crc kubenswrapper[4755]: I0317 01:01:28.665707 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:01:58 crc kubenswrapper[4755]: I0317 01:01:58.665571 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:01:58 crc kubenswrapper[4755]: I0317 01:01:58.666368 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:01:58 crc kubenswrapper[4755]: I0317 01:01:58.666488 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 01:01:58 crc kubenswrapper[4755]: I0317 01:01:58.667307 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:01:58 crc kubenswrapper[4755]: I0317 01:01:58.667398 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" gracePeriod=600 Mar 17 01:01:58 crc kubenswrapper[4755]: E0317 01:01:58.817489 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:01:59 crc kubenswrapper[4755]: I0317 01:01:59.305364 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" exitCode=0 Mar 17 01:01:59 crc kubenswrapper[4755]: I0317 01:01:59.305465 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420"} Mar 17 01:01:59 crc kubenswrapper[4755]: I0317 01:01:59.305666 4755 scope.go:117] "RemoveContainer" containerID="26c70aeb78d7c43cc9f41f95cdbae738bd0d561c87ae8d0e246e95277b78f86b" Mar 17 01:01:59 crc kubenswrapper[4755]: I0317 01:01:59.306403 4755 
scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:01:59 crc kubenswrapper[4755]: E0317 01:01:59.306703 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:02:00 crc kubenswrapper[4755]: I0317 01:02:00.160878 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561822-7nn47"] Mar 17 01:02:00 crc kubenswrapper[4755]: E0317 01:02:00.161451 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa97c87e-b133-48bb-af65-092be28ffca7" containerName="keystone-cron" Mar 17 01:02:00 crc kubenswrapper[4755]: I0317 01:02:00.161466 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa97c87e-b133-48bb-af65-092be28ffca7" containerName="keystone-cron" Mar 17 01:02:00 crc kubenswrapper[4755]: I0317 01:02:00.161728 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa97c87e-b133-48bb-af65-092be28ffca7" containerName="keystone-cron" Mar 17 01:02:00 crc kubenswrapper[4755]: I0317 01:02:00.162701 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561822-7nn47" Mar 17 01:02:00 crc kubenswrapper[4755]: I0317 01:02:00.165550 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:02:00 crc kubenswrapper[4755]: I0317 01:02:00.165765 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:02:00 crc kubenswrapper[4755]: I0317 01:02:00.169575 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:02:00 crc kubenswrapper[4755]: I0317 01:02:00.178027 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561822-7nn47"] Mar 17 01:02:00 crc kubenswrapper[4755]: I0317 01:02:00.197954 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5nmw\" (UniqueName: \"kubernetes.io/projected/8485cade-b73d-435d-8038-5ca6280c2e12-kube-api-access-w5nmw\") pod \"auto-csr-approver-29561822-7nn47\" (UID: \"8485cade-b73d-435d-8038-5ca6280c2e12\") " pod="openshift-infra/auto-csr-approver-29561822-7nn47" Mar 17 01:02:00 crc kubenswrapper[4755]: I0317 01:02:00.299998 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5nmw\" (UniqueName: \"kubernetes.io/projected/8485cade-b73d-435d-8038-5ca6280c2e12-kube-api-access-w5nmw\") pod \"auto-csr-approver-29561822-7nn47\" (UID: \"8485cade-b73d-435d-8038-5ca6280c2e12\") " pod="openshift-infra/auto-csr-approver-29561822-7nn47" Mar 17 01:02:00 crc kubenswrapper[4755]: I0317 01:02:00.344499 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5nmw\" (UniqueName: \"kubernetes.io/projected/8485cade-b73d-435d-8038-5ca6280c2e12-kube-api-access-w5nmw\") pod \"auto-csr-approver-29561822-7nn47\" (UID: \"8485cade-b73d-435d-8038-5ca6280c2e12\") " 
pod="openshift-infra/auto-csr-approver-29561822-7nn47" Mar 17 01:02:00 crc kubenswrapper[4755]: I0317 01:02:00.495672 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561822-7nn47" Mar 17 01:02:01 crc kubenswrapper[4755]: I0317 01:02:01.062048 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561822-7nn47"] Mar 17 01:02:01 crc kubenswrapper[4755]: W0317 01:02:01.068151 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8485cade_b73d_435d_8038_5ca6280c2e12.slice/crio-00f5b2e99c7c956aaa26c8996fba629ac1392a926b9a5205a7f517f048fdc534 WatchSource:0}: Error finding container 00f5b2e99c7c956aaa26c8996fba629ac1392a926b9a5205a7f517f048fdc534: Status 404 returned error can't find the container with id 00f5b2e99c7c956aaa26c8996fba629ac1392a926b9a5205a7f517f048fdc534 Mar 17 01:02:01 crc kubenswrapper[4755]: I0317 01:02:01.071125 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:02:01 crc kubenswrapper[4755]: I0317 01:02:01.329625 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561822-7nn47" event={"ID":"8485cade-b73d-435d-8038-5ca6280c2e12","Type":"ContainerStarted","Data":"00f5b2e99c7c956aaa26c8996fba629ac1392a926b9a5205a7f517f048fdc534"} Mar 17 01:02:03 crc kubenswrapper[4755]: I0317 01:02:03.355412 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561822-7nn47" event={"ID":"8485cade-b73d-435d-8038-5ca6280c2e12","Type":"ContainerStarted","Data":"337db9d6809a64af207530abcb6bae766436065e76958823fe2fee6fd77cda4b"} Mar 17 01:02:03 crc kubenswrapper[4755]: I0317 01:02:03.388214 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561822-7nn47" 
podStartSLOduration=1.557981904 podStartE2EDuration="3.388193386s" podCreationTimestamp="2026-03-17 01:02:00 +0000 UTC" firstStartedPulling="2026-03-17 01:02:01.070573681 +0000 UTC m=+2395.830026004" lastFinishedPulling="2026-03-17 01:02:02.900785173 +0000 UTC m=+2397.660237486" observedRunningTime="2026-03-17 01:02:03.379590656 +0000 UTC m=+2398.139042939" watchObservedRunningTime="2026-03-17 01:02:03.388193386 +0000 UTC m=+2398.147645669" Mar 17 01:02:04 crc kubenswrapper[4755]: I0317 01:02:04.371338 4755 generic.go:334] "Generic (PLEG): container finished" podID="8485cade-b73d-435d-8038-5ca6280c2e12" containerID="337db9d6809a64af207530abcb6bae766436065e76958823fe2fee6fd77cda4b" exitCode=0 Mar 17 01:02:04 crc kubenswrapper[4755]: I0317 01:02:04.371431 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561822-7nn47" event={"ID":"8485cade-b73d-435d-8038-5ca6280c2e12","Type":"ContainerDied","Data":"337db9d6809a64af207530abcb6bae766436065e76958823fe2fee6fd77cda4b"} Mar 17 01:02:05 crc kubenswrapper[4755]: I0317 01:02:05.866794 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561822-7nn47" Mar 17 01:02:06 crc kubenswrapper[4755]: I0317 01:02:06.039343 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5nmw\" (UniqueName: \"kubernetes.io/projected/8485cade-b73d-435d-8038-5ca6280c2e12-kube-api-access-w5nmw\") pod \"8485cade-b73d-435d-8038-5ca6280c2e12\" (UID: \"8485cade-b73d-435d-8038-5ca6280c2e12\") " Mar 17 01:02:06 crc kubenswrapper[4755]: I0317 01:02:06.047726 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8485cade-b73d-435d-8038-5ca6280c2e12-kube-api-access-w5nmw" (OuterVolumeSpecName: "kube-api-access-w5nmw") pod "8485cade-b73d-435d-8038-5ca6280c2e12" (UID: "8485cade-b73d-435d-8038-5ca6280c2e12"). InnerVolumeSpecName "kube-api-access-w5nmw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:02:06 crc kubenswrapper[4755]: I0317 01:02:06.142585 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5nmw\" (UniqueName: \"kubernetes.io/projected/8485cade-b73d-435d-8038-5ca6280c2e12-kube-api-access-w5nmw\") on node \"crc\" DevicePath \"\"" Mar 17 01:02:06 crc kubenswrapper[4755]: I0317 01:02:06.410355 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561822-7nn47" event={"ID":"8485cade-b73d-435d-8038-5ca6280c2e12","Type":"ContainerDied","Data":"00f5b2e99c7c956aaa26c8996fba629ac1392a926b9a5205a7f517f048fdc534"} Mar 17 01:02:06 crc kubenswrapper[4755]: I0317 01:02:06.410583 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00f5b2e99c7c956aaa26c8996fba629ac1392a926b9a5205a7f517f048fdc534" Mar 17 01:02:06 crc kubenswrapper[4755]: I0317 01:02:06.410462 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561822-7nn47" Mar 17 01:02:06 crc kubenswrapper[4755]: I0317 01:02:06.497696 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561816-vqpcs"] Mar 17 01:02:06 crc kubenswrapper[4755]: I0317 01:02:06.514424 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561816-vqpcs"] Mar 17 01:02:08 crc kubenswrapper[4755]: I0317 01:02:08.272563 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af91a237-9ed9-4565-95e7-f8238b07acb6" path="/var/lib/kubelet/pods/af91a237-9ed9-4565-95e7-f8238b07acb6/volumes" Mar 17 01:02:12 crc kubenswrapper[4755]: I0317 01:02:12.249183 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:02:12 crc kubenswrapper[4755]: E0317 01:02:12.250104 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:02:15 crc kubenswrapper[4755]: I0317 01:02:15.868903 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xjp5g"] Mar 17 01:02:15 crc kubenswrapper[4755]: E0317 01:02:15.874479 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8485cade-b73d-435d-8038-5ca6280c2e12" containerName="oc" Mar 17 01:02:15 crc kubenswrapper[4755]: I0317 01:02:15.874502 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8485cade-b73d-435d-8038-5ca6280c2e12" containerName="oc" Mar 17 01:02:15 crc kubenswrapper[4755]: I0317 01:02:15.874820 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8485cade-b73d-435d-8038-5ca6280c2e12" containerName="oc" Mar 17 01:02:15 crc kubenswrapper[4755]: I0317 01:02:15.914310 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xjp5g" Mar 17 01:02:15 crc kubenswrapper[4755]: I0317 01:02:15.929839 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xjp5g"] Mar 17 01:02:16 crc kubenswrapper[4755]: I0317 01:02:16.018267 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-catalog-content\") pod \"redhat-operators-xjp5g\" (UID: \"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6\") " pod="openshift-marketplace/redhat-operators-xjp5g" Mar 17 01:02:16 crc kubenswrapper[4755]: I0317 01:02:16.018504 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-utilities\") pod \"redhat-operators-xjp5g\" (UID: \"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6\") " pod="openshift-marketplace/redhat-operators-xjp5g" Mar 17 01:02:16 crc kubenswrapper[4755]: I0317 01:02:16.018576 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9q49\" (UniqueName: \"kubernetes.io/projected/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-kube-api-access-t9q49\") pod \"redhat-operators-xjp5g\" (UID: \"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6\") " pod="openshift-marketplace/redhat-operators-xjp5g" Mar 17 01:02:16 crc kubenswrapper[4755]: I0317 01:02:16.120515 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-utilities\") pod \"redhat-operators-xjp5g\" (UID: \"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6\") " pod="openshift-marketplace/redhat-operators-xjp5g" Mar 17 01:02:16 crc kubenswrapper[4755]: I0317 01:02:16.120809 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-t9q49\" (UniqueName: \"kubernetes.io/projected/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-kube-api-access-t9q49\") pod \"redhat-operators-xjp5g\" (UID: \"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6\") " pod="openshift-marketplace/redhat-operators-xjp5g" Mar 17 01:02:16 crc kubenswrapper[4755]: I0317 01:02:16.120969 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-catalog-content\") pod \"redhat-operators-xjp5g\" (UID: \"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6\") " pod="openshift-marketplace/redhat-operators-xjp5g" Mar 17 01:02:16 crc kubenswrapper[4755]: I0317 01:02:16.121034 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-utilities\") pod \"redhat-operators-xjp5g\" (UID: \"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6\") " pod="openshift-marketplace/redhat-operators-xjp5g" Mar 17 01:02:16 crc kubenswrapper[4755]: I0317 01:02:16.121403 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-catalog-content\") pod \"redhat-operators-xjp5g\" (UID: \"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6\") " pod="openshift-marketplace/redhat-operators-xjp5g" Mar 17 01:02:16 crc kubenswrapper[4755]: I0317 01:02:16.141148 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9q49\" (UniqueName: \"kubernetes.io/projected/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-kube-api-access-t9q49\") pod \"redhat-operators-xjp5g\" (UID: \"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6\") " pod="openshift-marketplace/redhat-operators-xjp5g" Mar 17 01:02:16 crc kubenswrapper[4755]: I0317 01:02:16.244708 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xjp5g" Mar 17 01:02:16 crc kubenswrapper[4755]: I0317 01:02:16.691198 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xjp5g"] Mar 17 01:02:17 crc kubenswrapper[4755]: I0317 01:02:17.543633 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjp5g" event={"ID":"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6","Type":"ContainerDied","Data":"99e5a0748366722d9fe99195cd71952db25d57291a1084456ad37f73f70d1fa0"} Mar 17 01:02:17 crc kubenswrapper[4755]: I0317 01:02:17.543591 4755 generic.go:334] "Generic (PLEG): container finished" podID="fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6" containerID="99e5a0748366722d9fe99195cd71952db25d57291a1084456ad37f73f70d1fa0" exitCode=0 Mar 17 01:02:17 crc kubenswrapper[4755]: I0317 01:02:17.544100 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjp5g" event={"ID":"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6","Type":"ContainerStarted","Data":"a06b2c44aa0a2f986eb6ba49b559807b6f47b1a144112165f3734d57942e7f6c"} Mar 17 01:02:19 crc kubenswrapper[4755]: I0317 01:02:19.567354 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjp5g" event={"ID":"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6","Type":"ContainerStarted","Data":"418f25eb5bbcf442164580c5df2a22acda988736123cf4da5993bc21424cc1b2"} Mar 17 01:02:24 crc kubenswrapper[4755]: I0317 01:02:24.625909 4755 generic.go:334] "Generic (PLEG): container finished" podID="fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6" containerID="418f25eb5bbcf442164580c5df2a22acda988736123cf4da5993bc21424cc1b2" exitCode=0 Mar 17 01:02:24 crc kubenswrapper[4755]: I0317 01:02:24.625975 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjp5g" 
event={"ID":"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6","Type":"ContainerDied","Data":"418f25eb5bbcf442164580c5df2a22acda988736123cf4da5993bc21424cc1b2"} Mar 17 01:02:25 crc kubenswrapper[4755]: I0317 01:02:25.635771 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjp5g" event={"ID":"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6","Type":"ContainerStarted","Data":"2e998aa00ac180dc9f87f9e0f6487ad033c12ffaa1ceab02ccd920972ae76f71"} Mar 17 01:02:25 crc kubenswrapper[4755]: I0317 01:02:25.667735 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xjp5g" podStartSLOduration=3.136953993 podStartE2EDuration="10.66771746s" podCreationTimestamp="2026-03-17 01:02:15 +0000 UTC" firstStartedPulling="2026-03-17 01:02:17.546632584 +0000 UTC m=+2412.306084867" lastFinishedPulling="2026-03-17 01:02:25.077396041 +0000 UTC m=+2419.836848334" observedRunningTime="2026-03-17 01:02:25.664684369 +0000 UTC m=+2420.424136652" watchObservedRunningTime="2026-03-17 01:02:25.66771746 +0000 UTC m=+2420.427169733" Mar 17 01:02:26 crc kubenswrapper[4755]: I0317 01:02:26.245489 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xjp5g" Mar 17 01:02:26 crc kubenswrapper[4755]: I0317 01:02:26.245555 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xjp5g" Mar 17 01:02:26 crc kubenswrapper[4755]: I0317 01:02:26.257968 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:02:26 crc kubenswrapper[4755]: E0317 01:02:26.258312 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:02:27 crc kubenswrapper[4755]: I0317 01:02:27.308211 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xjp5g" podUID="fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6" containerName="registry-server" probeResult="failure" output=< Mar 17 01:02:27 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 01:02:27 crc kubenswrapper[4755]: > Mar 17 01:02:36 crc kubenswrapper[4755]: I0317 01:02:36.324644 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xjp5g" Mar 17 01:02:36 crc kubenswrapper[4755]: I0317 01:02:36.413194 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xjp5g" Mar 17 01:02:36 crc kubenswrapper[4755]: I0317 01:02:36.589757 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xjp5g"] Mar 17 01:02:37 crc kubenswrapper[4755]: I0317 01:02:37.248944 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:02:37 crc kubenswrapper[4755]: E0317 01:02:37.249810 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:02:37 crc kubenswrapper[4755]: I0317 01:02:37.798921 4755 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-xjp5g" podUID="fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6" containerName="registry-server" containerID="cri-o://2e998aa00ac180dc9f87f9e0f6487ad033c12ffaa1ceab02ccd920972ae76f71" gracePeriod=2 Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.323780 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xjp5g" Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.420072 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9q49\" (UniqueName: \"kubernetes.io/projected/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-kube-api-access-t9q49\") pod \"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6\" (UID: \"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6\") " Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.420354 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-utilities\") pod \"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6\" (UID: \"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6\") " Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.420520 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-catalog-content\") pod \"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6\" (UID: \"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6\") " Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.421063 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-utilities" (OuterVolumeSpecName: "utilities") pod "fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6" (UID: "fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.421292 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.428367 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-kube-api-access-t9q49" (OuterVolumeSpecName: "kube-api-access-t9q49") pod "fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6" (UID: "fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6"). InnerVolumeSpecName "kube-api-access-t9q49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.524745 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9q49\" (UniqueName: \"kubernetes.io/projected/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-kube-api-access-t9q49\") on node \"crc\" DevicePath \"\"" Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.576537 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6" (UID: "fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.626826 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.812173 4755 generic.go:334] "Generic (PLEG): container finished" podID="fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6" containerID="2e998aa00ac180dc9f87f9e0f6487ad033c12ffaa1ceab02ccd920972ae76f71" exitCode=0 Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.812221 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjp5g" event={"ID":"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6","Type":"ContainerDied","Data":"2e998aa00ac180dc9f87f9e0f6487ad033c12ffaa1ceab02ccd920972ae76f71"} Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.812259 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xjp5g" event={"ID":"fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6","Type":"ContainerDied","Data":"a06b2c44aa0a2f986eb6ba49b559807b6f47b1a144112165f3734d57942e7f6c"} Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.812270 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xjp5g" Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.812283 4755 scope.go:117] "RemoveContainer" containerID="2e998aa00ac180dc9f87f9e0f6487ad033c12ffaa1ceab02ccd920972ae76f71" Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.834920 4755 scope.go:117] "RemoveContainer" containerID="418f25eb5bbcf442164580c5df2a22acda988736123cf4da5993bc21424cc1b2" Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.873833 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xjp5g"] Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.877257 4755 scope.go:117] "RemoveContainer" containerID="99e5a0748366722d9fe99195cd71952db25d57291a1084456ad37f73f70d1fa0" Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.884862 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xjp5g"] Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.936345 4755 scope.go:117] "RemoveContainer" containerID="2e998aa00ac180dc9f87f9e0f6487ad033c12ffaa1ceab02ccd920972ae76f71" Mar 17 01:02:38 crc kubenswrapper[4755]: E0317 01:02:38.937175 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e998aa00ac180dc9f87f9e0f6487ad033c12ffaa1ceab02ccd920972ae76f71\": container with ID starting with 2e998aa00ac180dc9f87f9e0f6487ad033c12ffaa1ceab02ccd920972ae76f71 not found: ID does not exist" containerID="2e998aa00ac180dc9f87f9e0f6487ad033c12ffaa1ceab02ccd920972ae76f71" Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.937230 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e998aa00ac180dc9f87f9e0f6487ad033c12ffaa1ceab02ccd920972ae76f71"} err="failed to get container status \"2e998aa00ac180dc9f87f9e0f6487ad033c12ffaa1ceab02ccd920972ae76f71\": rpc error: code = NotFound desc = could not find container 
\"2e998aa00ac180dc9f87f9e0f6487ad033c12ffaa1ceab02ccd920972ae76f71\": container with ID starting with 2e998aa00ac180dc9f87f9e0f6487ad033c12ffaa1ceab02ccd920972ae76f71 not found: ID does not exist" Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.937263 4755 scope.go:117] "RemoveContainer" containerID="418f25eb5bbcf442164580c5df2a22acda988736123cf4da5993bc21424cc1b2" Mar 17 01:02:38 crc kubenswrapper[4755]: E0317 01:02:38.937941 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"418f25eb5bbcf442164580c5df2a22acda988736123cf4da5993bc21424cc1b2\": container with ID starting with 418f25eb5bbcf442164580c5df2a22acda988736123cf4da5993bc21424cc1b2 not found: ID does not exist" containerID="418f25eb5bbcf442164580c5df2a22acda988736123cf4da5993bc21424cc1b2" Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.937999 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"418f25eb5bbcf442164580c5df2a22acda988736123cf4da5993bc21424cc1b2"} err="failed to get container status \"418f25eb5bbcf442164580c5df2a22acda988736123cf4da5993bc21424cc1b2\": rpc error: code = NotFound desc = could not find container \"418f25eb5bbcf442164580c5df2a22acda988736123cf4da5993bc21424cc1b2\": container with ID starting with 418f25eb5bbcf442164580c5df2a22acda988736123cf4da5993bc21424cc1b2 not found: ID does not exist" Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.938037 4755 scope.go:117] "RemoveContainer" containerID="99e5a0748366722d9fe99195cd71952db25d57291a1084456ad37f73f70d1fa0" Mar 17 01:02:38 crc kubenswrapper[4755]: E0317 01:02:38.938423 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99e5a0748366722d9fe99195cd71952db25d57291a1084456ad37f73f70d1fa0\": container with ID starting with 99e5a0748366722d9fe99195cd71952db25d57291a1084456ad37f73f70d1fa0 not found: ID does not exist" 
containerID="99e5a0748366722d9fe99195cd71952db25d57291a1084456ad37f73f70d1fa0" Mar 17 01:02:38 crc kubenswrapper[4755]: I0317 01:02:38.938511 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99e5a0748366722d9fe99195cd71952db25d57291a1084456ad37f73f70d1fa0"} err="failed to get container status \"99e5a0748366722d9fe99195cd71952db25d57291a1084456ad37f73f70d1fa0\": rpc error: code = NotFound desc = could not find container \"99e5a0748366722d9fe99195cd71952db25d57291a1084456ad37f73f70d1fa0\": container with ID starting with 99e5a0748366722d9fe99195cd71952db25d57291a1084456ad37f73f70d1fa0 not found: ID does not exist" Mar 17 01:02:40 crc kubenswrapper[4755]: I0317 01:02:40.272508 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6" path="/var/lib/kubelet/pods/fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6/volumes" Mar 17 01:02:45 crc kubenswrapper[4755]: I0317 01:02:45.317728 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kgch6"] Mar 17 01:02:45 crc kubenswrapper[4755]: E0317 01:02:45.318663 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6" containerName="registry-server" Mar 17 01:02:45 crc kubenswrapper[4755]: I0317 01:02:45.318679 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6" containerName="registry-server" Mar 17 01:02:45 crc kubenswrapper[4755]: E0317 01:02:45.318694 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6" containerName="extract-utilities" Mar 17 01:02:45 crc kubenswrapper[4755]: I0317 01:02:45.318702 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6" containerName="extract-utilities" Mar 17 01:02:45 crc kubenswrapper[4755]: E0317 01:02:45.318720 4755 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6" containerName="extract-content" Mar 17 01:02:45 crc kubenswrapper[4755]: I0317 01:02:45.318728 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6" containerName="extract-content" Mar 17 01:02:45 crc kubenswrapper[4755]: I0317 01:02:45.318955 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc7eb35e-c2bc-4ff1-ad92-4f321a964ff6" containerName="registry-server" Mar 17 01:02:45 crc kubenswrapper[4755]: I0317 01:02:45.321051 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgch6" Mar 17 01:02:45 crc kubenswrapper[4755]: I0317 01:02:45.338566 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kgch6"] Mar 17 01:02:45 crc kubenswrapper[4755]: I0317 01:02:45.386711 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d43c7925-198f-4099-b1e7-72a8674c0693-utilities\") pod \"community-operators-kgch6\" (UID: \"d43c7925-198f-4099-b1e7-72a8674c0693\") " pod="openshift-marketplace/community-operators-kgch6" Mar 17 01:02:45 crc kubenswrapper[4755]: I0317 01:02:45.386802 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vswms\" (UniqueName: \"kubernetes.io/projected/d43c7925-198f-4099-b1e7-72a8674c0693-kube-api-access-vswms\") pod \"community-operators-kgch6\" (UID: \"d43c7925-198f-4099-b1e7-72a8674c0693\") " pod="openshift-marketplace/community-operators-kgch6" Mar 17 01:02:45 crc kubenswrapper[4755]: I0317 01:02:45.386930 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d43c7925-198f-4099-b1e7-72a8674c0693-catalog-content\") pod 
\"community-operators-kgch6\" (UID: \"d43c7925-198f-4099-b1e7-72a8674c0693\") " pod="openshift-marketplace/community-operators-kgch6" Mar 17 01:02:45 crc kubenswrapper[4755]: I0317 01:02:45.488833 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d43c7925-198f-4099-b1e7-72a8674c0693-utilities\") pod \"community-operators-kgch6\" (UID: \"d43c7925-198f-4099-b1e7-72a8674c0693\") " pod="openshift-marketplace/community-operators-kgch6" Mar 17 01:02:45 crc kubenswrapper[4755]: I0317 01:02:45.488914 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vswms\" (UniqueName: \"kubernetes.io/projected/d43c7925-198f-4099-b1e7-72a8674c0693-kube-api-access-vswms\") pod \"community-operators-kgch6\" (UID: \"d43c7925-198f-4099-b1e7-72a8674c0693\") " pod="openshift-marketplace/community-operators-kgch6" Mar 17 01:02:45 crc kubenswrapper[4755]: I0317 01:02:45.488972 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d43c7925-198f-4099-b1e7-72a8674c0693-catalog-content\") pod \"community-operators-kgch6\" (UID: \"d43c7925-198f-4099-b1e7-72a8674c0693\") " pod="openshift-marketplace/community-operators-kgch6" Mar 17 01:02:45 crc kubenswrapper[4755]: I0317 01:02:45.489417 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d43c7925-198f-4099-b1e7-72a8674c0693-utilities\") pod \"community-operators-kgch6\" (UID: \"d43c7925-198f-4099-b1e7-72a8674c0693\") " pod="openshift-marketplace/community-operators-kgch6" Mar 17 01:02:45 crc kubenswrapper[4755]: I0317 01:02:45.489540 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d43c7925-198f-4099-b1e7-72a8674c0693-catalog-content\") pod \"community-operators-kgch6\" (UID: 
\"d43c7925-198f-4099-b1e7-72a8674c0693\") " pod="openshift-marketplace/community-operators-kgch6" Mar 17 01:02:45 crc kubenswrapper[4755]: I0317 01:02:45.527094 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vswms\" (UniqueName: \"kubernetes.io/projected/d43c7925-198f-4099-b1e7-72a8674c0693-kube-api-access-vswms\") pod \"community-operators-kgch6\" (UID: \"d43c7925-198f-4099-b1e7-72a8674c0693\") " pod="openshift-marketplace/community-operators-kgch6" Mar 17 01:02:45 crc kubenswrapper[4755]: I0317 01:02:45.692332 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgch6" Mar 17 01:02:46 crc kubenswrapper[4755]: I0317 01:02:46.280059 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kgch6"] Mar 17 01:02:46 crc kubenswrapper[4755]: I0317 01:02:46.951673 4755 generic.go:334] "Generic (PLEG): container finished" podID="d43c7925-198f-4099-b1e7-72a8674c0693" containerID="a38b20a6d508d36645752ed8cfb3807a629f742ce41dc6771682e1dc943aed18" exitCode=0 Mar 17 01:02:46 crc kubenswrapper[4755]: I0317 01:02:46.951761 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgch6" event={"ID":"d43c7925-198f-4099-b1e7-72a8674c0693","Type":"ContainerDied","Data":"a38b20a6d508d36645752ed8cfb3807a629f742ce41dc6771682e1dc943aed18"} Mar 17 01:02:46 crc kubenswrapper[4755]: I0317 01:02:46.951956 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgch6" event={"ID":"d43c7925-198f-4099-b1e7-72a8674c0693","Type":"ContainerStarted","Data":"2776e1a465642b657faad6f817f232990e083845e04fb5aeac199b5e6902edd6"} Mar 17 01:02:51 crc kubenswrapper[4755]: I0317 01:02:51.248625 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:02:51 crc kubenswrapper[4755]: E0317 
01:02:51.250207 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:02:55 crc kubenswrapper[4755]: I0317 01:02:55.044774 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgch6" event={"ID":"d43c7925-198f-4099-b1e7-72a8674c0693","Type":"ContainerStarted","Data":"a275e053abb3d0733b5ba2385f06849a80018a4979aa131b4f54b962d54d8da1"} Mar 17 01:02:57 crc kubenswrapper[4755]: I0317 01:02:57.070194 4755 generic.go:334] "Generic (PLEG): container finished" podID="d43c7925-198f-4099-b1e7-72a8674c0693" containerID="a275e053abb3d0733b5ba2385f06849a80018a4979aa131b4f54b962d54d8da1" exitCode=0 Mar 17 01:02:57 crc kubenswrapper[4755]: I0317 01:02:57.070294 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgch6" event={"ID":"d43c7925-198f-4099-b1e7-72a8674c0693","Type":"ContainerDied","Data":"a275e053abb3d0733b5ba2385f06849a80018a4979aa131b4f54b962d54d8da1"} Mar 17 01:02:58 crc kubenswrapper[4755]: I0317 01:02:58.082153 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgch6" event={"ID":"d43c7925-198f-4099-b1e7-72a8674c0693","Type":"ContainerStarted","Data":"82ab4dc0e2cb741da1c66a20f557f393a049d08cb7299ac9addaa4b90fb1b51f"} Mar 17 01:02:58 crc kubenswrapper[4755]: I0317 01:02:58.110834 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kgch6" podStartSLOduration=2.552340617 podStartE2EDuration="13.110818477s" podCreationTimestamp="2026-03-17 01:02:45 +0000 UTC" 
firstStartedPulling="2026-03-17 01:02:46.954048914 +0000 UTC m=+2441.713501227" lastFinishedPulling="2026-03-17 01:02:57.512526764 +0000 UTC m=+2452.271979087" observedRunningTime="2026-03-17 01:02:58.102499544 +0000 UTC m=+2452.861951827" watchObservedRunningTime="2026-03-17 01:02:58.110818477 +0000 UTC m=+2452.870270760" Mar 17 01:03:00 crc kubenswrapper[4755]: I0317 01:03:00.245309 4755 scope.go:117] "RemoveContainer" containerID="7a4522abdf56efa94c8540dfec50df83e7fdcdd6ae4c8818ff3e2a581c0ad9bb" Mar 17 01:03:03 crc kubenswrapper[4755]: I0317 01:03:03.248891 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:03:03 crc kubenswrapper[4755]: E0317 01:03:03.249892 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:03:05 crc kubenswrapper[4755]: I0317 01:03:05.693492 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kgch6" Mar 17 01:03:05 crc kubenswrapper[4755]: I0317 01:03:05.693816 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kgch6" Mar 17 01:03:05 crc kubenswrapper[4755]: I0317 01:03:05.784901 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kgch6" Mar 17 01:03:06 crc kubenswrapper[4755]: I0317 01:03:06.355638 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kgch6" Mar 17 01:03:06 crc kubenswrapper[4755]: I0317 01:03:06.420390 
4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kgch6"] Mar 17 01:03:08 crc kubenswrapper[4755]: I0317 01:03:08.236611 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kgch6" podUID="d43c7925-198f-4099-b1e7-72a8674c0693" containerName="registry-server" containerID="cri-o://82ab4dc0e2cb741da1c66a20f557f393a049d08cb7299ac9addaa4b90fb1b51f" gracePeriod=2 Mar 17 01:03:08 crc kubenswrapper[4755]: I0317 01:03:08.749484 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgch6" Mar 17 01:03:08 crc kubenswrapper[4755]: I0317 01:03:08.845663 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d43c7925-198f-4099-b1e7-72a8674c0693-utilities\") pod \"d43c7925-198f-4099-b1e7-72a8674c0693\" (UID: \"d43c7925-198f-4099-b1e7-72a8674c0693\") " Mar 17 01:03:08 crc kubenswrapper[4755]: I0317 01:03:08.845746 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vswms\" (UniqueName: \"kubernetes.io/projected/d43c7925-198f-4099-b1e7-72a8674c0693-kube-api-access-vswms\") pod \"d43c7925-198f-4099-b1e7-72a8674c0693\" (UID: \"d43c7925-198f-4099-b1e7-72a8674c0693\") " Mar 17 01:03:08 crc kubenswrapper[4755]: I0317 01:03:08.845864 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d43c7925-198f-4099-b1e7-72a8674c0693-catalog-content\") pod \"d43c7925-198f-4099-b1e7-72a8674c0693\" (UID: \"d43c7925-198f-4099-b1e7-72a8674c0693\") " Mar 17 01:03:08 crc kubenswrapper[4755]: I0317 01:03:08.846779 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d43c7925-198f-4099-b1e7-72a8674c0693-utilities" (OuterVolumeSpecName: "utilities") pod 
"d43c7925-198f-4099-b1e7-72a8674c0693" (UID: "d43c7925-198f-4099-b1e7-72a8674c0693"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:03:08 crc kubenswrapper[4755]: I0317 01:03:08.857661 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d43c7925-198f-4099-b1e7-72a8674c0693-kube-api-access-vswms" (OuterVolumeSpecName: "kube-api-access-vswms") pod "d43c7925-198f-4099-b1e7-72a8674c0693" (UID: "d43c7925-198f-4099-b1e7-72a8674c0693"). InnerVolumeSpecName "kube-api-access-vswms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:03:08 crc kubenswrapper[4755]: I0317 01:03:08.892107 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d43c7925-198f-4099-b1e7-72a8674c0693-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d43c7925-198f-4099-b1e7-72a8674c0693" (UID: "d43c7925-198f-4099-b1e7-72a8674c0693"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:03:08 crc kubenswrapper[4755]: I0317 01:03:08.948429 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d43c7925-198f-4099-b1e7-72a8674c0693-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:03:08 crc kubenswrapper[4755]: I0317 01:03:08.948471 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d43c7925-198f-4099-b1e7-72a8674c0693-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:03:08 crc kubenswrapper[4755]: I0317 01:03:08.948502 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vswms\" (UniqueName: \"kubernetes.io/projected/d43c7925-198f-4099-b1e7-72a8674c0693-kube-api-access-vswms\") on node \"crc\" DevicePath \"\"" Mar 17 01:03:09 crc kubenswrapper[4755]: I0317 01:03:09.248264 4755 generic.go:334] "Generic (PLEG): container finished" podID="d43c7925-198f-4099-b1e7-72a8674c0693" containerID="82ab4dc0e2cb741da1c66a20f557f393a049d08cb7299ac9addaa4b90fb1b51f" exitCode=0 Mar 17 01:03:09 crc kubenswrapper[4755]: I0317 01:03:09.248304 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgch6" event={"ID":"d43c7925-198f-4099-b1e7-72a8674c0693","Type":"ContainerDied","Data":"82ab4dc0e2cb741da1c66a20f557f393a049d08cb7299ac9addaa4b90fb1b51f"} Mar 17 01:03:09 crc kubenswrapper[4755]: I0317 01:03:09.248350 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgch6" event={"ID":"d43c7925-198f-4099-b1e7-72a8674c0693","Type":"ContainerDied","Data":"2776e1a465642b657faad6f817f232990e083845e04fb5aeac199b5e6902edd6"} Mar 17 01:03:09 crc kubenswrapper[4755]: I0317 01:03:09.248377 4755 scope.go:117] "RemoveContainer" containerID="82ab4dc0e2cb741da1c66a20f557f393a049d08cb7299ac9addaa4b90fb1b51f" Mar 17 01:03:09 crc kubenswrapper[4755]: I0317 
01:03:09.248470 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgch6" Mar 17 01:03:09 crc kubenswrapper[4755]: I0317 01:03:09.285762 4755 scope.go:117] "RemoveContainer" containerID="a275e053abb3d0733b5ba2385f06849a80018a4979aa131b4f54b962d54d8da1" Mar 17 01:03:09 crc kubenswrapper[4755]: I0317 01:03:09.305564 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kgch6"] Mar 17 01:03:09 crc kubenswrapper[4755]: I0317 01:03:09.316034 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kgch6"] Mar 17 01:03:09 crc kubenswrapper[4755]: I0317 01:03:09.318310 4755 scope.go:117] "RemoveContainer" containerID="a38b20a6d508d36645752ed8cfb3807a629f742ce41dc6771682e1dc943aed18" Mar 17 01:03:09 crc kubenswrapper[4755]: I0317 01:03:09.365088 4755 scope.go:117] "RemoveContainer" containerID="82ab4dc0e2cb741da1c66a20f557f393a049d08cb7299ac9addaa4b90fb1b51f" Mar 17 01:03:09 crc kubenswrapper[4755]: E0317 01:03:09.365602 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82ab4dc0e2cb741da1c66a20f557f393a049d08cb7299ac9addaa4b90fb1b51f\": container with ID starting with 82ab4dc0e2cb741da1c66a20f557f393a049d08cb7299ac9addaa4b90fb1b51f not found: ID does not exist" containerID="82ab4dc0e2cb741da1c66a20f557f393a049d08cb7299ac9addaa4b90fb1b51f" Mar 17 01:03:09 crc kubenswrapper[4755]: I0317 01:03:09.365640 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ab4dc0e2cb741da1c66a20f557f393a049d08cb7299ac9addaa4b90fb1b51f"} err="failed to get container status \"82ab4dc0e2cb741da1c66a20f557f393a049d08cb7299ac9addaa4b90fb1b51f\": rpc error: code = NotFound desc = could not find container \"82ab4dc0e2cb741da1c66a20f557f393a049d08cb7299ac9addaa4b90fb1b51f\": container with ID starting with 
82ab4dc0e2cb741da1c66a20f557f393a049d08cb7299ac9addaa4b90fb1b51f not found: ID does not exist" Mar 17 01:03:09 crc kubenswrapper[4755]: I0317 01:03:09.365666 4755 scope.go:117] "RemoveContainer" containerID="a275e053abb3d0733b5ba2385f06849a80018a4979aa131b4f54b962d54d8da1" Mar 17 01:03:09 crc kubenswrapper[4755]: E0317 01:03:09.366046 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a275e053abb3d0733b5ba2385f06849a80018a4979aa131b4f54b962d54d8da1\": container with ID starting with a275e053abb3d0733b5ba2385f06849a80018a4979aa131b4f54b962d54d8da1 not found: ID does not exist" containerID="a275e053abb3d0733b5ba2385f06849a80018a4979aa131b4f54b962d54d8da1" Mar 17 01:03:09 crc kubenswrapper[4755]: I0317 01:03:09.366068 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a275e053abb3d0733b5ba2385f06849a80018a4979aa131b4f54b962d54d8da1"} err="failed to get container status \"a275e053abb3d0733b5ba2385f06849a80018a4979aa131b4f54b962d54d8da1\": rpc error: code = NotFound desc = could not find container \"a275e053abb3d0733b5ba2385f06849a80018a4979aa131b4f54b962d54d8da1\": container with ID starting with a275e053abb3d0733b5ba2385f06849a80018a4979aa131b4f54b962d54d8da1 not found: ID does not exist" Mar 17 01:03:09 crc kubenswrapper[4755]: I0317 01:03:09.366081 4755 scope.go:117] "RemoveContainer" containerID="a38b20a6d508d36645752ed8cfb3807a629f742ce41dc6771682e1dc943aed18" Mar 17 01:03:09 crc kubenswrapper[4755]: E0317 01:03:09.366373 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a38b20a6d508d36645752ed8cfb3807a629f742ce41dc6771682e1dc943aed18\": container with ID starting with a38b20a6d508d36645752ed8cfb3807a629f742ce41dc6771682e1dc943aed18 not found: ID does not exist" containerID="a38b20a6d508d36645752ed8cfb3807a629f742ce41dc6771682e1dc943aed18" Mar 17 01:03:09 crc 
kubenswrapper[4755]: I0317 01:03:09.366395 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38b20a6d508d36645752ed8cfb3807a629f742ce41dc6771682e1dc943aed18"} err="failed to get container status \"a38b20a6d508d36645752ed8cfb3807a629f742ce41dc6771682e1dc943aed18\": rpc error: code = NotFound desc = could not find container \"a38b20a6d508d36645752ed8cfb3807a629f742ce41dc6771682e1dc943aed18\": container with ID starting with a38b20a6d508d36645752ed8cfb3807a629f742ce41dc6771682e1dc943aed18 not found: ID does not exist" Mar 17 01:03:10 crc kubenswrapper[4755]: I0317 01:03:10.267089 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d43c7925-198f-4099-b1e7-72a8674c0693" path="/var/lib/kubelet/pods/d43c7925-198f-4099-b1e7-72a8674c0693/volumes" Mar 17 01:03:17 crc kubenswrapper[4755]: I0317 01:03:17.249805 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:03:17 crc kubenswrapper[4755]: E0317 01:03:17.250899 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:03:22 crc kubenswrapper[4755]: I0317 01:03:22.394362 4755 generic.go:334] "Generic (PLEG): container finished" podID="863b8e83-25d4-4ca9-b2c8-f4cea5ec3287" containerID="d6bea1b8db6d043636a77e356d58b2f74f2aa3ece2c18eac6a510df385d2f3be" exitCode=0 Mar 17 01:03:22 crc kubenswrapper[4755]: I0317 01:03:22.394548 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" 
event={"ID":"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287","Type":"ContainerDied","Data":"d6bea1b8db6d043636a77e356d58b2f74f2aa3ece2c18eac6a510df385d2f3be"} Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.015336 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.137963 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-inventory\") pod \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.138105 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-libvirt-secret-0\") pod \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.138134 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnzz9\" (UniqueName: \"kubernetes.io/projected/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-kube-api-access-bnzz9\") pod \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.138640 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-libvirt-combined-ca-bundle\") pod \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.138705 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-ssh-key-openstack-edpm-ipam\") pod \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\" (UID: \"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287\") " Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.143865 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "863b8e83-25d4-4ca9-b2c8-f4cea5ec3287" (UID: "863b8e83-25d4-4ca9-b2c8-f4cea5ec3287"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.144548 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-kube-api-access-bnzz9" (OuterVolumeSpecName: "kube-api-access-bnzz9") pod "863b8e83-25d4-4ca9-b2c8-f4cea5ec3287" (UID: "863b8e83-25d4-4ca9-b2c8-f4cea5ec3287"). InnerVolumeSpecName "kube-api-access-bnzz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.166772 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "863b8e83-25d4-4ca9-b2c8-f4cea5ec3287" (UID: "863b8e83-25d4-4ca9-b2c8-f4cea5ec3287"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.184025 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-inventory" (OuterVolumeSpecName: "inventory") pod "863b8e83-25d4-4ca9-b2c8-f4cea5ec3287" (UID: "863b8e83-25d4-4ca9-b2c8-f4cea5ec3287"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.194800 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "863b8e83-25d4-4ca9-b2c8-f4cea5ec3287" (UID: "863b8e83-25d4-4ca9-b2c8-f4cea5ec3287"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.241565 4755 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.241592 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnzz9\" (UniqueName: \"kubernetes.io/projected/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-kube-api-access-bnzz9\") on node \"crc\" DevicePath \"\"" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.241604 4755 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.241613 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.241633 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.420775 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" event={"ID":"863b8e83-25d4-4ca9-b2c8-f4cea5ec3287","Type":"ContainerDied","Data":"c7663c2979e2d3a0d39f2275d30faa4d4e8307d76f5b4685af29145d2f861e9d"} Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.421228 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7663c2979e2d3a0d39f2275d30faa4d4e8307d76f5b4685af29145d2f861e9d" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.420841 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.544483 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w"] Mar 17 01:03:24 crc kubenswrapper[4755]: E0317 01:03:24.544926 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863b8e83-25d4-4ca9-b2c8-f4cea5ec3287" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.544950 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="863b8e83-25d4-4ca9-b2c8-f4cea5ec3287" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 17 01:03:24 crc kubenswrapper[4755]: E0317 01:03:24.544968 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d43c7925-198f-4099-b1e7-72a8674c0693" containerName="extract-utilities" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.544978 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d43c7925-198f-4099-b1e7-72a8674c0693" containerName="extract-utilities" Mar 17 01:03:24 crc kubenswrapper[4755]: E0317 01:03:24.545004 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d43c7925-198f-4099-b1e7-72a8674c0693" containerName="registry-server" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.545013 4755 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d43c7925-198f-4099-b1e7-72a8674c0693" containerName="registry-server" Mar 17 01:03:24 crc kubenswrapper[4755]: E0317 01:03:24.545031 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d43c7925-198f-4099-b1e7-72a8674c0693" containerName="extract-content" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.545039 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d43c7925-198f-4099-b1e7-72a8674c0693" containerName="extract-content" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.545481 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d43c7925-198f-4099-b1e7-72a8674c0693" containerName="registry-server" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.545503 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="863b8e83-25d4-4ca9-b2c8-f4cea5ec3287" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.549524 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.553754 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.554091 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.554499 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.554790 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.554941 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.564420 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w"] Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.650073 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.650156 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: 
\"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.650200 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl56k\" (UniqueName: \"kubernetes.io/projected/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-kube-api-access-pl56k\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.650253 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.650277 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.650303 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.650376 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.752506 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.752567 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.752604 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 
01:03:24.752704 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.752791 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.752869 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.752961 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl56k\" (UniqueName: \"kubernetes.io/projected/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-kube-api-access-pl56k\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.758274 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-inventory\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.759248 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.761466 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.761896 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.765124 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 
17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.767705 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.770824 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl56k\" (UniqueName: \"kubernetes.io/projected/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-kube-api-access-pl56k\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xz89w\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:24 crc kubenswrapper[4755]: I0317 01:03:24.902504 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:03:25 crc kubenswrapper[4755]: W0317 01:03:25.574852 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a218d46_6cdf_4d9e_831a_9d2af4051dcf.slice/crio-04fd8050cc044d0baccd252814b1f36e10a59ea60f69b0e1189b6b1ecb751b3f WatchSource:0}: Error finding container 04fd8050cc044d0baccd252814b1f36e10a59ea60f69b0e1189b6b1ecb751b3f: Status 404 returned error can't find the container with id 04fd8050cc044d0baccd252814b1f36e10a59ea60f69b0e1189b6b1ecb751b3f Mar 17 01:03:25 crc kubenswrapper[4755]: I0317 01:03:25.576260 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w"] Mar 17 01:03:26 crc kubenswrapper[4755]: I0317 01:03:26.450119 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" event={"ID":"8a218d46-6cdf-4d9e-831a-9d2af4051dcf","Type":"ContainerStarted","Data":"ae0ff5b23c838feb48eb87c5989c59c4b3c4f15e4c0828ba54fe1b9eb157963a"} Mar 17 01:03:26 crc kubenswrapper[4755]: I0317 01:03:26.450529 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" event={"ID":"8a218d46-6cdf-4d9e-831a-9d2af4051dcf","Type":"ContainerStarted","Data":"04fd8050cc044d0baccd252814b1f36e10a59ea60f69b0e1189b6b1ecb751b3f"} Mar 17 01:03:26 crc kubenswrapper[4755]: I0317 01:03:26.473137 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" podStartSLOduration=2.035272383 podStartE2EDuration="2.473120989s" podCreationTimestamp="2026-03-17 01:03:24 +0000 UTC" firstStartedPulling="2026-03-17 01:03:25.578711176 +0000 UTC m=+2480.338163459" lastFinishedPulling="2026-03-17 01:03:26.016559772 +0000 UTC m=+2480.776012065" 
observedRunningTime="2026-03-17 01:03:26.472689128 +0000 UTC m=+2481.232141401" watchObservedRunningTime="2026-03-17 01:03:26.473120989 +0000 UTC m=+2481.232573272" Mar 17 01:03:28 crc kubenswrapper[4755]: I0317 01:03:28.249761 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:03:28 crc kubenswrapper[4755]: E0317 01:03:28.250498 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:03:41 crc kubenswrapper[4755]: I0317 01:03:41.248772 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:03:41 crc kubenswrapper[4755]: E0317 01:03:41.251527 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:03:56 crc kubenswrapper[4755]: I0317 01:03:56.258197 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:03:56 crc kubenswrapper[4755]: E0317 01:03:56.259355 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:04:00 crc kubenswrapper[4755]: I0317 01:04:00.172476 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561824-9wpxw"] Mar 17 01:04:00 crc kubenswrapper[4755]: I0317 01:04:00.174216 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561824-9wpxw" Mar 17 01:04:00 crc kubenswrapper[4755]: I0317 01:04:00.177629 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:04:00 crc kubenswrapper[4755]: I0317 01:04:00.178662 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:04:00 crc kubenswrapper[4755]: I0317 01:04:00.178849 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:04:00 crc kubenswrapper[4755]: I0317 01:04:00.197900 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561824-9wpxw"] Mar 17 01:04:00 crc kubenswrapper[4755]: I0317 01:04:00.207948 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq4wk\" (UniqueName: \"kubernetes.io/projected/3de24587-477f-43bd-a36f-d5cf535b3c87-kube-api-access-vq4wk\") pod \"auto-csr-approver-29561824-9wpxw\" (UID: \"3de24587-477f-43bd-a36f-d5cf535b3c87\") " pod="openshift-infra/auto-csr-approver-29561824-9wpxw" Mar 17 01:04:00 crc kubenswrapper[4755]: I0317 01:04:00.311271 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq4wk\" (UniqueName: \"kubernetes.io/projected/3de24587-477f-43bd-a36f-d5cf535b3c87-kube-api-access-vq4wk\") 
pod \"auto-csr-approver-29561824-9wpxw\" (UID: \"3de24587-477f-43bd-a36f-d5cf535b3c87\") " pod="openshift-infra/auto-csr-approver-29561824-9wpxw" Mar 17 01:04:00 crc kubenswrapper[4755]: I0317 01:04:00.340283 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq4wk\" (UniqueName: \"kubernetes.io/projected/3de24587-477f-43bd-a36f-d5cf535b3c87-kube-api-access-vq4wk\") pod \"auto-csr-approver-29561824-9wpxw\" (UID: \"3de24587-477f-43bd-a36f-d5cf535b3c87\") " pod="openshift-infra/auto-csr-approver-29561824-9wpxw" Mar 17 01:04:00 crc kubenswrapper[4755]: I0317 01:04:00.503165 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561824-9wpxw" Mar 17 01:04:01 crc kubenswrapper[4755]: I0317 01:04:01.101088 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561824-9wpxw"] Mar 17 01:04:01 crc kubenswrapper[4755]: I0317 01:04:01.878316 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561824-9wpxw" event={"ID":"3de24587-477f-43bd-a36f-d5cf535b3c87","Type":"ContainerStarted","Data":"35e8c03c02d30a5d51ad8da3bbcc97e403622e8821e7e554504357717e2a6ad8"} Mar 17 01:04:02 crc kubenswrapper[4755]: I0317 01:04:02.892018 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561824-9wpxw" event={"ID":"3de24587-477f-43bd-a36f-d5cf535b3c87","Type":"ContainerStarted","Data":"676cf755b51dd3f3afc292c313f0fb9116b903e84962f7b2fd85d590a59d9834"} Mar 17 01:04:02 crc kubenswrapper[4755]: I0317 01:04:02.917384 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561824-9wpxw" podStartSLOduration=1.569428558 podStartE2EDuration="2.917361516s" podCreationTimestamp="2026-03-17 01:04:00 +0000 UTC" firstStartedPulling="2026-03-17 01:04:01.090348378 +0000 UTC m=+2515.849800701" lastFinishedPulling="2026-03-17 
01:04:02.438281346 +0000 UTC m=+2517.197733659" observedRunningTime="2026-03-17 01:04:02.9052097 +0000 UTC m=+2517.664662003" watchObservedRunningTime="2026-03-17 01:04:02.917361516 +0000 UTC m=+2517.676813809" Mar 17 01:04:03 crc kubenswrapper[4755]: I0317 01:04:03.918798 4755 generic.go:334] "Generic (PLEG): container finished" podID="3de24587-477f-43bd-a36f-d5cf535b3c87" containerID="676cf755b51dd3f3afc292c313f0fb9116b903e84962f7b2fd85d590a59d9834" exitCode=0 Mar 17 01:04:03 crc kubenswrapper[4755]: I0317 01:04:03.918864 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561824-9wpxw" event={"ID":"3de24587-477f-43bd-a36f-d5cf535b3c87","Type":"ContainerDied","Data":"676cf755b51dd3f3afc292c313f0fb9116b903e84962f7b2fd85d590a59d9834"} Mar 17 01:04:05 crc kubenswrapper[4755]: I0317 01:04:05.420275 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561824-9wpxw" Mar 17 01:04:05 crc kubenswrapper[4755]: I0317 01:04:05.462896 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq4wk\" (UniqueName: \"kubernetes.io/projected/3de24587-477f-43bd-a36f-d5cf535b3c87-kube-api-access-vq4wk\") pod \"3de24587-477f-43bd-a36f-d5cf535b3c87\" (UID: \"3de24587-477f-43bd-a36f-d5cf535b3c87\") " Mar 17 01:04:05 crc kubenswrapper[4755]: I0317 01:04:05.469936 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3de24587-477f-43bd-a36f-d5cf535b3c87-kube-api-access-vq4wk" (OuterVolumeSpecName: "kube-api-access-vq4wk") pod "3de24587-477f-43bd-a36f-d5cf535b3c87" (UID: "3de24587-477f-43bd-a36f-d5cf535b3c87"). InnerVolumeSpecName "kube-api-access-vq4wk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:04:05 crc kubenswrapper[4755]: I0317 01:04:05.564791 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq4wk\" (UniqueName: \"kubernetes.io/projected/3de24587-477f-43bd-a36f-d5cf535b3c87-kube-api-access-vq4wk\") on node \"crc\" DevicePath \"\"" Mar 17 01:04:05 crc kubenswrapper[4755]: I0317 01:04:05.947833 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561824-9wpxw" event={"ID":"3de24587-477f-43bd-a36f-d5cf535b3c87","Type":"ContainerDied","Data":"35e8c03c02d30a5d51ad8da3bbcc97e403622e8821e7e554504357717e2a6ad8"} Mar 17 01:04:05 crc kubenswrapper[4755]: I0317 01:04:05.948499 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35e8c03c02d30a5d51ad8da3bbcc97e403622e8821e7e554504357717e2a6ad8" Mar 17 01:04:05 crc kubenswrapper[4755]: I0317 01:04:05.948617 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561824-9wpxw" Mar 17 01:04:06 crc kubenswrapper[4755]: I0317 01:04:06.001475 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561818-l6l4d"] Mar 17 01:04:06 crc kubenswrapper[4755]: I0317 01:04:06.010517 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561818-l6l4d"] Mar 17 01:04:06 crc kubenswrapper[4755]: I0317 01:04:06.283010 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bec04782-7ac5-4e8c-92a6-43843b88db96" path="/var/lib/kubelet/pods/bec04782-7ac5-4e8c-92a6-43843b88db96/volumes" Mar 17 01:04:09 crc kubenswrapper[4755]: I0317 01:04:09.249207 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:04:09 crc kubenswrapper[4755]: E0317 01:04:09.249795 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:04:24 crc kubenswrapper[4755]: I0317 01:04:24.256699 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:04:24 crc kubenswrapper[4755]: E0317 01:04:24.257898 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:04:38 crc kubenswrapper[4755]: I0317 01:04:38.248326 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:04:38 crc kubenswrapper[4755]: E0317 01:04:38.250118 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:04:50 crc kubenswrapper[4755]: I0317 01:04:50.247939 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:04:50 crc kubenswrapper[4755]: E0317 01:04:50.248818 4755 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:05:00 crc kubenswrapper[4755]: I0317 01:05:00.409604 4755 scope.go:117] "RemoveContainer" containerID="5adb7843fbe74ca66aaf66bc112ba9cb867f0db097eeefa0d053a43f6fb883c9" Mar 17 01:05:03 crc kubenswrapper[4755]: I0317 01:05:03.248633 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:05:03 crc kubenswrapper[4755]: E0317 01:05:03.249517 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:05:18 crc kubenswrapper[4755]: I0317 01:05:18.249609 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:05:18 crc kubenswrapper[4755]: E0317 01:05:18.250670 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:05:30 crc kubenswrapper[4755]: I0317 01:05:30.248717 4755 scope.go:117] 
"RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:05:30 crc kubenswrapper[4755]: E0317 01:05:30.249866 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:05:41 crc kubenswrapper[4755]: I0317 01:05:41.248830 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:05:41 crc kubenswrapper[4755]: E0317 01:05:41.249590 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:05:56 crc kubenswrapper[4755]: I0317 01:05:56.257620 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:05:56 crc kubenswrapper[4755]: E0317 01:05:56.258847 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:06:00 crc kubenswrapper[4755]: I0317 01:06:00.171931 
4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561826-95sml"] Mar 17 01:06:00 crc kubenswrapper[4755]: E0317 01:06:00.173231 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de24587-477f-43bd-a36f-d5cf535b3c87" containerName="oc" Mar 17 01:06:00 crc kubenswrapper[4755]: I0317 01:06:00.173253 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de24587-477f-43bd-a36f-d5cf535b3c87" containerName="oc" Mar 17 01:06:00 crc kubenswrapper[4755]: I0317 01:06:00.173657 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3de24587-477f-43bd-a36f-d5cf535b3c87" containerName="oc" Mar 17 01:06:00 crc kubenswrapper[4755]: I0317 01:06:00.174957 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561826-95sml" Mar 17 01:06:00 crc kubenswrapper[4755]: I0317 01:06:00.180286 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:06:00 crc kubenswrapper[4755]: I0317 01:06:00.180667 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:06:00 crc kubenswrapper[4755]: I0317 01:06:00.180481 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:06:00 crc kubenswrapper[4755]: I0317 01:06:00.183225 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561826-95sml"] Mar 17 01:06:00 crc kubenswrapper[4755]: I0317 01:06:00.242386 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9bvg\" (UniqueName: \"kubernetes.io/projected/75040ebd-e84b-49ff-8616-3ee8c3d34332-kube-api-access-g9bvg\") pod \"auto-csr-approver-29561826-95sml\" (UID: \"75040ebd-e84b-49ff-8616-3ee8c3d34332\") " 
pod="openshift-infra/auto-csr-approver-29561826-95sml" Mar 17 01:06:00 crc kubenswrapper[4755]: I0317 01:06:00.346506 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9bvg\" (UniqueName: \"kubernetes.io/projected/75040ebd-e84b-49ff-8616-3ee8c3d34332-kube-api-access-g9bvg\") pod \"auto-csr-approver-29561826-95sml\" (UID: \"75040ebd-e84b-49ff-8616-3ee8c3d34332\") " pod="openshift-infra/auto-csr-approver-29561826-95sml" Mar 17 01:06:00 crc kubenswrapper[4755]: I0317 01:06:00.372187 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9bvg\" (UniqueName: \"kubernetes.io/projected/75040ebd-e84b-49ff-8616-3ee8c3d34332-kube-api-access-g9bvg\") pod \"auto-csr-approver-29561826-95sml\" (UID: \"75040ebd-e84b-49ff-8616-3ee8c3d34332\") " pod="openshift-infra/auto-csr-approver-29561826-95sml" Mar 17 01:06:00 crc kubenswrapper[4755]: I0317 01:06:00.521345 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561826-95sml" Mar 17 01:06:01 crc kubenswrapper[4755]: I0317 01:06:01.169343 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561826-95sml"] Mar 17 01:06:01 crc kubenswrapper[4755]: I0317 01:06:01.863638 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561826-95sml" event={"ID":"75040ebd-e84b-49ff-8616-3ee8c3d34332","Type":"ContainerStarted","Data":"1f3c73afd47facea5acc158e75060ba9edec8b57a604db8702c05b5a19527130"} Mar 17 01:06:02 crc kubenswrapper[4755]: I0317 01:06:02.878285 4755 generic.go:334] "Generic (PLEG): container finished" podID="75040ebd-e84b-49ff-8616-3ee8c3d34332" containerID="934313b54431c69157e2d7067ed74f2af94715498e892f11f6ffbc0f0493e01a" exitCode=0 Mar 17 01:06:02 crc kubenswrapper[4755]: I0317 01:06:02.878497 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29561826-95sml" event={"ID":"75040ebd-e84b-49ff-8616-3ee8c3d34332","Type":"ContainerDied","Data":"934313b54431c69157e2d7067ed74f2af94715498e892f11f6ffbc0f0493e01a"} Mar 17 01:06:04 crc kubenswrapper[4755]: I0317 01:06:04.313845 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561826-95sml" Mar 17 01:06:04 crc kubenswrapper[4755]: I0317 01:06:04.439595 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9bvg\" (UniqueName: \"kubernetes.io/projected/75040ebd-e84b-49ff-8616-3ee8c3d34332-kube-api-access-g9bvg\") pod \"75040ebd-e84b-49ff-8616-3ee8c3d34332\" (UID: \"75040ebd-e84b-49ff-8616-3ee8c3d34332\") " Mar 17 01:06:04 crc kubenswrapper[4755]: I0317 01:06:04.449919 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75040ebd-e84b-49ff-8616-3ee8c3d34332-kube-api-access-g9bvg" (OuterVolumeSpecName: "kube-api-access-g9bvg") pod "75040ebd-e84b-49ff-8616-3ee8c3d34332" (UID: "75040ebd-e84b-49ff-8616-3ee8c3d34332"). InnerVolumeSpecName "kube-api-access-g9bvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:06:04 crc kubenswrapper[4755]: I0317 01:06:04.542675 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9bvg\" (UniqueName: \"kubernetes.io/projected/75040ebd-e84b-49ff-8616-3ee8c3d34332-kube-api-access-g9bvg\") on node \"crc\" DevicePath \"\"" Mar 17 01:06:04 crc kubenswrapper[4755]: I0317 01:06:04.908014 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561826-95sml" event={"ID":"75040ebd-e84b-49ff-8616-3ee8c3d34332","Type":"ContainerDied","Data":"1f3c73afd47facea5acc158e75060ba9edec8b57a604db8702c05b5a19527130"} Mar 17 01:06:04 crc kubenswrapper[4755]: I0317 01:06:04.908562 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f3c73afd47facea5acc158e75060ba9edec8b57a604db8702c05b5a19527130" Mar 17 01:06:04 crc kubenswrapper[4755]: I0317 01:06:04.908078 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561826-95sml" Mar 17 01:06:05 crc kubenswrapper[4755]: I0317 01:06:05.403228 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561820-j9wxp"] Mar 17 01:06:05 crc kubenswrapper[4755]: I0317 01:06:05.414360 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561820-j9wxp"] Mar 17 01:06:06 crc kubenswrapper[4755]: I0317 01:06:06.269958 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5238f7e-f046-4dbb-9433-78786791d6cd" path="/var/lib/kubelet/pods/b5238f7e-f046-4dbb-9433-78786791d6cd/volumes" Mar 17 01:06:10 crc kubenswrapper[4755]: I0317 01:06:10.249034 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:06:10 crc kubenswrapper[4755]: E0317 01:06:10.250213 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:06:25 crc kubenswrapper[4755]: I0317 01:06:25.248139 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:06:25 crc kubenswrapper[4755]: E0317 01:06:25.249082 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:06:26 crc kubenswrapper[4755]: I0317 01:06:26.173180 4755 generic.go:334] "Generic (PLEG): container finished" podID="8a218d46-6cdf-4d9e-831a-9d2af4051dcf" containerID="ae0ff5b23c838feb48eb87c5989c59c4b3c4f15e4c0828ba54fe1b9eb157963a" exitCode=0 Mar 17 01:06:26 crc kubenswrapper[4755]: I0317 01:06:26.173217 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" event={"ID":"8a218d46-6cdf-4d9e-831a-9d2af4051dcf","Type":"ContainerDied","Data":"ae0ff5b23c838feb48eb87c5989c59c4b3c4f15e4c0828ba54fe1b9eb157963a"} Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.645317 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.672383 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ssh-key-openstack-edpm-ipam\") pod \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.672723 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-2\") pod \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.673623 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-0\") pod \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.673895 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-inventory\") pod \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.674015 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl56k\" (UniqueName: \"kubernetes.io/projected/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-kube-api-access-pl56k\") pod \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 
01:06:27.674126 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-1\") pod \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.674231 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-telemetry-combined-ca-bundle\") pod \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\" (UID: \"8a218d46-6cdf-4d9e-831a-9d2af4051dcf\") " Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.681754 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-kube-api-access-pl56k" (OuterVolumeSpecName: "kube-api-access-pl56k") pod "8a218d46-6cdf-4d9e-831a-9d2af4051dcf" (UID: "8a218d46-6cdf-4d9e-831a-9d2af4051dcf"). InnerVolumeSpecName "kube-api-access-pl56k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.683596 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8a218d46-6cdf-4d9e-831a-9d2af4051dcf" (UID: "8a218d46-6cdf-4d9e-831a-9d2af4051dcf"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.717265 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "8a218d46-6cdf-4d9e-831a-9d2af4051dcf" (UID: "8a218d46-6cdf-4d9e-831a-9d2af4051dcf"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.727533 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "8a218d46-6cdf-4d9e-831a-9d2af4051dcf" (UID: "8a218d46-6cdf-4d9e-831a-9d2af4051dcf"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.732919 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8a218d46-6cdf-4d9e-831a-9d2af4051dcf" (UID: "8a218d46-6cdf-4d9e-831a-9d2af4051dcf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.743038 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "8a218d46-6cdf-4d9e-831a-9d2af4051dcf" (UID: "8a218d46-6cdf-4d9e-831a-9d2af4051dcf"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.748146 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-inventory" (OuterVolumeSpecName: "inventory") pod "8a218d46-6cdf-4d9e-831a-9d2af4051dcf" (UID: "8a218d46-6cdf-4d9e-831a-9d2af4051dcf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.776061 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.776097 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.776107 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.776118 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl56k\" (UniqueName: \"kubernetes.io/projected/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-kube-api-access-pl56k\") on node \"crc\" DevicePath \"\"" Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.776129 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.776137 4755 
reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:06:27 crc kubenswrapper[4755]: I0317 01:06:27.776146 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a218d46-6cdf-4d9e-831a-9d2af4051dcf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.198637 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" event={"ID":"8a218d46-6cdf-4d9e-831a-9d2af4051dcf","Type":"ContainerDied","Data":"04fd8050cc044d0baccd252814b1f36e10a59ea60f69b0e1189b6b1ecb751b3f"} Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.198975 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04fd8050cc044d0baccd252814b1f36e10a59ea60f69b0e1189b6b1ecb751b3f" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.198873 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.319859 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs"] Mar 17 01:06:28 crc kubenswrapper[4755]: E0317 01:06:28.320257 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a218d46-6cdf-4d9e-831a-9d2af4051dcf" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.320272 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a218d46-6cdf-4d9e-831a-9d2af4051dcf" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 17 01:06:28 crc kubenswrapper[4755]: E0317 01:06:28.320287 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75040ebd-e84b-49ff-8616-3ee8c3d34332" containerName="oc" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.320292 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="75040ebd-e84b-49ff-8616-3ee8c3d34332" containerName="oc" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.320504 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="75040ebd-e84b-49ff-8616-3ee8c3d34332" containerName="oc" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.320524 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a218d46-6cdf-4d9e-831a-9d2af4051dcf" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.321168 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.323896 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.331034 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.331345 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.331384 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.331499 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.347367 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs"] Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.385750 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.385832 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.385924 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.385994 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.386096 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnpk5\" (UniqueName: \"kubernetes.io/projected/77388131-a3d3-451a-935d-2181b1fe1216-kube-api-access-gnpk5\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.386124 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.386171 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.488591 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.488813 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnpk5\" (UniqueName: \"kubernetes.io/projected/77388131-a3d3-451a-935d-2181b1fe1216-kube-api-access-gnpk5\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.488901 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.489041 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.489142 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.489212 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.489392 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.494140 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.494455 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.494757 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.495395 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.496686 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.497329 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.508966 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnpk5\" (UniqueName: \"kubernetes.io/projected/77388131-a3d3-451a-935d-2181b1fe1216-kube-api-access-gnpk5\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:28 crc kubenswrapper[4755]: I0317 01:06:28.655275 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:06:29 crc kubenswrapper[4755]: I0317 01:06:29.254042 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs"] Mar 17 01:06:29 crc kubenswrapper[4755]: W0317 01:06:29.259965 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77388131_a3d3_451a_935d_2181b1fe1216.slice/crio-bb11f52300c1ff77ce941f18f002bd397b0a3c79c376bbb641c86740acba4486 WatchSource:0}: Error finding container bb11f52300c1ff77ce941f18f002bd397b0a3c79c376bbb641c86740acba4486: Status 404 returned error can't find the container with id bb11f52300c1ff77ce941f18f002bd397b0a3c79c376bbb641c86740acba4486 Mar 17 01:06:30 crc kubenswrapper[4755]: I0317 01:06:30.220740 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" event={"ID":"77388131-a3d3-451a-935d-2181b1fe1216","Type":"ContainerStarted","Data":"347c2934d590ee41f747b23171637ace7c126262dfb4b8af2c54f0792fac2295"} Mar 17 01:06:30 crc kubenswrapper[4755]: I0317 01:06:30.221028 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" event={"ID":"77388131-a3d3-451a-935d-2181b1fe1216","Type":"ContainerStarted","Data":"bb11f52300c1ff77ce941f18f002bd397b0a3c79c376bbb641c86740acba4486"} Mar 17 01:06:30 crc kubenswrapper[4755]: I0317 01:06:30.250011 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" podStartSLOduration=1.676716101 podStartE2EDuration="2.249982633s" podCreationTimestamp="2026-03-17 01:06:28 +0000 UTC" firstStartedPulling="2026-03-17 01:06:29.26293812 +0000 UTC m=+2664.022390443" lastFinishedPulling="2026-03-17 
01:06:29.836204662 +0000 UTC m=+2664.595656975" observedRunningTime="2026-03-17 01:06:30.24204364 +0000 UTC m=+2665.001495953" watchObservedRunningTime="2026-03-17 01:06:30.249982633 +0000 UTC m=+2665.009434966" Mar 17 01:06:38 crc kubenswrapper[4755]: I0317 01:06:38.248187 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:06:38 crc kubenswrapper[4755]: E0317 01:06:38.249282 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:06:52 crc kubenswrapper[4755]: I0317 01:06:52.248886 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:06:52 crc kubenswrapper[4755]: E0317 01:06:52.250095 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:07:00 crc kubenswrapper[4755]: I0317 01:07:00.539394 4755 scope.go:117] "RemoveContainer" containerID="3bb60e4b9a3eb3996650943fc22d8597d874a9c45e38bfbe9fae28c5b1430458" Mar 17 01:07:06 crc kubenswrapper[4755]: I0317 01:07:06.266505 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:07:06 crc kubenswrapper[4755]: I0317 01:07:06.727798 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"724d2ea8ccfc68fca06c3de8383346b840d2506d8f185a3e871d51ca50c11915"} Mar 17 01:08:00 crc kubenswrapper[4755]: I0317 01:08:00.149830 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561828-2s4zh"] Mar 17 01:08:00 crc kubenswrapper[4755]: I0317 01:08:00.152960 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561828-2s4zh" Mar 17 01:08:00 crc kubenswrapper[4755]: I0317 01:08:00.157200 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:08:00 crc kubenswrapper[4755]: I0317 01:08:00.159053 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:08:00 crc kubenswrapper[4755]: I0317 01:08:00.159518 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561828-2s4zh"] Mar 17 01:08:00 crc kubenswrapper[4755]: I0317 01:08:00.161106 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:08:00 crc kubenswrapper[4755]: I0317 01:08:00.341132 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8njbf\" (UniqueName: \"kubernetes.io/projected/483d1eaa-decc-40bc-844a-862fddf0d986-kube-api-access-8njbf\") pod \"auto-csr-approver-29561828-2s4zh\" (UID: \"483d1eaa-decc-40bc-844a-862fddf0d986\") " pod="openshift-infra/auto-csr-approver-29561828-2s4zh" Mar 17 01:08:00 crc kubenswrapper[4755]: I0317 01:08:00.443503 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8njbf\" (UniqueName: 
\"kubernetes.io/projected/483d1eaa-decc-40bc-844a-862fddf0d986-kube-api-access-8njbf\") pod \"auto-csr-approver-29561828-2s4zh\" (UID: \"483d1eaa-decc-40bc-844a-862fddf0d986\") " pod="openshift-infra/auto-csr-approver-29561828-2s4zh" Mar 17 01:08:00 crc kubenswrapper[4755]: I0317 01:08:00.474040 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8njbf\" (UniqueName: \"kubernetes.io/projected/483d1eaa-decc-40bc-844a-862fddf0d986-kube-api-access-8njbf\") pod \"auto-csr-approver-29561828-2s4zh\" (UID: \"483d1eaa-decc-40bc-844a-862fddf0d986\") " pod="openshift-infra/auto-csr-approver-29561828-2s4zh" Mar 17 01:08:00 crc kubenswrapper[4755]: I0317 01:08:00.491453 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561828-2s4zh" Mar 17 01:08:00 crc kubenswrapper[4755]: I0317 01:08:00.989552 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561828-2s4zh"] Mar 17 01:08:00 crc kubenswrapper[4755]: I0317 01:08:00.994709 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:08:01 crc kubenswrapper[4755]: I0317 01:08:01.386794 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561828-2s4zh" event={"ID":"483d1eaa-decc-40bc-844a-862fddf0d986","Type":"ContainerStarted","Data":"4d3d93263fed2a8e26b532315ea261ce1b99977c31f7c08241d1d3d807d34072"} Mar 17 01:08:03 crc kubenswrapper[4755]: I0317 01:08:03.412668 4755 generic.go:334] "Generic (PLEG): container finished" podID="483d1eaa-decc-40bc-844a-862fddf0d986" containerID="624e7b645e0d05c894a78c96d0f4900e97dbfbd9d97288367172da04c7f9ae75" exitCode=0 Mar 17 01:08:03 crc kubenswrapper[4755]: I0317 01:08:03.412755 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561828-2s4zh" 
event={"ID":"483d1eaa-decc-40bc-844a-862fddf0d986","Type":"ContainerDied","Data":"624e7b645e0d05c894a78c96d0f4900e97dbfbd9d97288367172da04c7f9ae75"} Mar 17 01:08:04 crc kubenswrapper[4755]: I0317 01:08:04.931393 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561828-2s4zh" Mar 17 01:08:05 crc kubenswrapper[4755]: I0317 01:08:05.089078 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8njbf\" (UniqueName: \"kubernetes.io/projected/483d1eaa-decc-40bc-844a-862fddf0d986-kube-api-access-8njbf\") pod \"483d1eaa-decc-40bc-844a-862fddf0d986\" (UID: \"483d1eaa-decc-40bc-844a-862fddf0d986\") " Mar 17 01:08:05 crc kubenswrapper[4755]: I0317 01:08:05.104095 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/483d1eaa-decc-40bc-844a-862fddf0d986-kube-api-access-8njbf" (OuterVolumeSpecName: "kube-api-access-8njbf") pod "483d1eaa-decc-40bc-844a-862fddf0d986" (UID: "483d1eaa-decc-40bc-844a-862fddf0d986"). InnerVolumeSpecName "kube-api-access-8njbf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:08:05 crc kubenswrapper[4755]: I0317 01:08:05.191432 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8njbf\" (UniqueName: \"kubernetes.io/projected/483d1eaa-decc-40bc-844a-862fddf0d986-kube-api-access-8njbf\") on node \"crc\" DevicePath \"\"" Mar 17 01:08:05 crc kubenswrapper[4755]: I0317 01:08:05.441412 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561828-2s4zh" event={"ID":"483d1eaa-decc-40bc-844a-862fddf0d986","Type":"ContainerDied","Data":"4d3d93263fed2a8e26b532315ea261ce1b99977c31f7c08241d1d3d807d34072"} Mar 17 01:08:05 crc kubenswrapper[4755]: I0317 01:08:05.441495 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d3d93263fed2a8e26b532315ea261ce1b99977c31f7c08241d1d3d807d34072" Mar 17 01:08:05 crc kubenswrapper[4755]: I0317 01:08:05.441572 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561828-2s4zh" Mar 17 01:08:06 crc kubenswrapper[4755]: I0317 01:08:06.009271 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561822-7nn47"] Mar 17 01:08:06 crc kubenswrapper[4755]: I0317 01:08:06.020589 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561822-7nn47"] Mar 17 01:08:06 crc kubenswrapper[4755]: I0317 01:08:06.277164 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8485cade-b73d-435d-8038-5ca6280c2e12" path="/var/lib/kubelet/pods/8485cade-b73d-435d-8038-5ca6280c2e12/volumes" Mar 17 01:08:36 crc kubenswrapper[4755]: I0317 01:08:36.331579 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xk6rh"] Mar 17 01:08:36 crc kubenswrapper[4755]: E0317 01:08:36.332640 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="483d1eaa-decc-40bc-844a-862fddf0d986" containerName="oc" Mar 17 01:08:36 crc kubenswrapper[4755]: I0317 01:08:36.332654 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="483d1eaa-decc-40bc-844a-862fddf0d986" containerName="oc" Mar 17 01:08:36 crc kubenswrapper[4755]: I0317 01:08:36.332939 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="483d1eaa-decc-40bc-844a-862fddf0d986" containerName="oc" Mar 17 01:08:36 crc kubenswrapper[4755]: I0317 01:08:36.334737 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xk6rh" Mar 17 01:08:36 crc kubenswrapper[4755]: I0317 01:08:36.348985 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xk6rh"] Mar 17 01:08:36 crc kubenswrapper[4755]: I0317 01:08:36.528389 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/522de51f-8590-4957-9f28-c2bdbc29f858-utilities\") pod \"redhat-marketplace-xk6rh\" (UID: \"522de51f-8590-4957-9f28-c2bdbc29f858\") " pod="openshift-marketplace/redhat-marketplace-xk6rh" Mar 17 01:08:36 crc kubenswrapper[4755]: I0317 01:08:36.528665 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8gql\" (UniqueName: \"kubernetes.io/projected/522de51f-8590-4957-9f28-c2bdbc29f858-kube-api-access-p8gql\") pod \"redhat-marketplace-xk6rh\" (UID: \"522de51f-8590-4957-9f28-c2bdbc29f858\") " pod="openshift-marketplace/redhat-marketplace-xk6rh" Mar 17 01:08:36 crc kubenswrapper[4755]: I0317 01:08:36.528728 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/522de51f-8590-4957-9f28-c2bdbc29f858-catalog-content\") pod \"redhat-marketplace-xk6rh\" (UID: \"522de51f-8590-4957-9f28-c2bdbc29f858\") " 
pod="openshift-marketplace/redhat-marketplace-xk6rh" Mar 17 01:08:36 crc kubenswrapper[4755]: I0317 01:08:36.630974 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/522de51f-8590-4957-9f28-c2bdbc29f858-catalog-content\") pod \"redhat-marketplace-xk6rh\" (UID: \"522de51f-8590-4957-9f28-c2bdbc29f858\") " pod="openshift-marketplace/redhat-marketplace-xk6rh" Mar 17 01:08:36 crc kubenswrapper[4755]: I0317 01:08:36.631224 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/522de51f-8590-4957-9f28-c2bdbc29f858-utilities\") pod \"redhat-marketplace-xk6rh\" (UID: \"522de51f-8590-4957-9f28-c2bdbc29f858\") " pod="openshift-marketplace/redhat-marketplace-xk6rh" Mar 17 01:08:36 crc kubenswrapper[4755]: I0317 01:08:36.631599 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/522de51f-8590-4957-9f28-c2bdbc29f858-catalog-content\") pod \"redhat-marketplace-xk6rh\" (UID: \"522de51f-8590-4957-9f28-c2bdbc29f858\") " pod="openshift-marketplace/redhat-marketplace-xk6rh" Mar 17 01:08:36 crc kubenswrapper[4755]: I0317 01:08:36.631651 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/522de51f-8590-4957-9f28-c2bdbc29f858-utilities\") pod \"redhat-marketplace-xk6rh\" (UID: \"522de51f-8590-4957-9f28-c2bdbc29f858\") " pod="openshift-marketplace/redhat-marketplace-xk6rh" Mar 17 01:08:36 crc kubenswrapper[4755]: I0317 01:08:36.631815 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8gql\" (UniqueName: \"kubernetes.io/projected/522de51f-8590-4957-9f28-c2bdbc29f858-kube-api-access-p8gql\") pod \"redhat-marketplace-xk6rh\" (UID: \"522de51f-8590-4957-9f28-c2bdbc29f858\") " pod="openshift-marketplace/redhat-marketplace-xk6rh" 
Mar 17 01:08:36 crc kubenswrapper[4755]: I0317 01:08:36.650048 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8gql\" (UniqueName: \"kubernetes.io/projected/522de51f-8590-4957-9f28-c2bdbc29f858-kube-api-access-p8gql\") pod \"redhat-marketplace-xk6rh\" (UID: \"522de51f-8590-4957-9f28-c2bdbc29f858\") " pod="openshift-marketplace/redhat-marketplace-xk6rh" Mar 17 01:08:36 crc kubenswrapper[4755]: I0317 01:08:36.668271 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xk6rh" Mar 17 01:08:37 crc kubenswrapper[4755]: I0317 01:08:37.124989 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xk6rh"] Mar 17 01:08:37 crc kubenswrapper[4755]: I0317 01:08:37.871461 4755 generic.go:334] "Generic (PLEG): container finished" podID="522de51f-8590-4957-9f28-c2bdbc29f858" containerID="8234b402087fd9ad425a80cb73ed6d2d3a6b4f8b9215af1dad75c3a9b65cf616" exitCode=0 Mar 17 01:08:37 crc kubenswrapper[4755]: I0317 01:08:37.871730 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk6rh" event={"ID":"522de51f-8590-4957-9f28-c2bdbc29f858","Type":"ContainerDied","Data":"8234b402087fd9ad425a80cb73ed6d2d3a6b4f8b9215af1dad75c3a9b65cf616"} Mar 17 01:08:37 crc kubenswrapper[4755]: I0317 01:08:37.871809 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk6rh" event={"ID":"522de51f-8590-4957-9f28-c2bdbc29f858","Type":"ContainerStarted","Data":"65ec7b5523cefc6fc71455a64b1f10e3268a24bfcc6a11902bc482f67c1251d8"} Mar 17 01:08:38 crc kubenswrapper[4755]: I0317 01:08:38.558203 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hdb6z"] Mar 17 01:08:38 crc kubenswrapper[4755]: I0317 01:08:38.564234 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdb6z" Mar 17 01:08:38 crc kubenswrapper[4755]: I0317 01:08:38.572567 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdb6z"] Mar 17 01:08:38 crc kubenswrapper[4755]: I0317 01:08:38.677123 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b088a0e9-37cf-444f-a222-75dc7cf01cf4-utilities\") pod \"certified-operators-hdb6z\" (UID: \"b088a0e9-37cf-444f-a222-75dc7cf01cf4\") " pod="openshift-marketplace/certified-operators-hdb6z" Mar 17 01:08:38 crc kubenswrapper[4755]: I0317 01:08:38.677163 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b088a0e9-37cf-444f-a222-75dc7cf01cf4-catalog-content\") pod \"certified-operators-hdb6z\" (UID: \"b088a0e9-37cf-444f-a222-75dc7cf01cf4\") " pod="openshift-marketplace/certified-operators-hdb6z" Mar 17 01:08:38 crc kubenswrapper[4755]: I0317 01:08:38.677202 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tlsw\" (UniqueName: \"kubernetes.io/projected/b088a0e9-37cf-444f-a222-75dc7cf01cf4-kube-api-access-5tlsw\") pod \"certified-operators-hdb6z\" (UID: \"b088a0e9-37cf-444f-a222-75dc7cf01cf4\") " pod="openshift-marketplace/certified-operators-hdb6z" Mar 17 01:08:38 crc kubenswrapper[4755]: I0317 01:08:38.779993 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b088a0e9-37cf-444f-a222-75dc7cf01cf4-utilities\") pod \"certified-operators-hdb6z\" (UID: \"b088a0e9-37cf-444f-a222-75dc7cf01cf4\") " pod="openshift-marketplace/certified-operators-hdb6z" Mar 17 01:08:38 crc kubenswrapper[4755]: I0317 01:08:38.780033 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b088a0e9-37cf-444f-a222-75dc7cf01cf4-catalog-content\") pod \"certified-operators-hdb6z\" (UID: \"b088a0e9-37cf-444f-a222-75dc7cf01cf4\") " pod="openshift-marketplace/certified-operators-hdb6z" Mar 17 01:08:38 crc kubenswrapper[4755]: I0317 01:08:38.780081 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tlsw\" (UniqueName: \"kubernetes.io/projected/b088a0e9-37cf-444f-a222-75dc7cf01cf4-kube-api-access-5tlsw\") pod \"certified-operators-hdb6z\" (UID: \"b088a0e9-37cf-444f-a222-75dc7cf01cf4\") " pod="openshift-marketplace/certified-operators-hdb6z" Mar 17 01:08:38 crc kubenswrapper[4755]: I0317 01:08:38.780790 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b088a0e9-37cf-444f-a222-75dc7cf01cf4-utilities\") pod \"certified-operators-hdb6z\" (UID: \"b088a0e9-37cf-444f-a222-75dc7cf01cf4\") " pod="openshift-marketplace/certified-operators-hdb6z" Mar 17 01:08:38 crc kubenswrapper[4755]: I0317 01:08:38.780794 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b088a0e9-37cf-444f-a222-75dc7cf01cf4-catalog-content\") pod \"certified-operators-hdb6z\" (UID: \"b088a0e9-37cf-444f-a222-75dc7cf01cf4\") " pod="openshift-marketplace/certified-operators-hdb6z" Mar 17 01:08:38 crc kubenswrapper[4755]: I0317 01:08:38.806810 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tlsw\" (UniqueName: \"kubernetes.io/projected/b088a0e9-37cf-444f-a222-75dc7cf01cf4-kube-api-access-5tlsw\") pod \"certified-operators-hdb6z\" (UID: \"b088a0e9-37cf-444f-a222-75dc7cf01cf4\") " pod="openshift-marketplace/certified-operators-hdb6z" Mar 17 01:08:38 crc kubenswrapper[4755]: I0317 01:08:38.886531 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-xk6rh" event={"ID":"522de51f-8590-4957-9f28-c2bdbc29f858","Type":"ContainerStarted","Data":"c2201c691b54c5193a197fafa891829ca416e489958170c1a64d2401de2a806b"} Mar 17 01:08:38 crc kubenswrapper[4755]: I0317 01:08:38.977554 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hdb6z" Mar 17 01:08:39 crc kubenswrapper[4755]: I0317 01:08:39.465876 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdb6z"] Mar 17 01:08:39 crc kubenswrapper[4755]: W0317 01:08:39.475631 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb088a0e9_37cf_444f_a222_75dc7cf01cf4.slice/crio-20a79d3ec0ecfd4af4779d5e0f0d1df25431f59aeeb6e184166210f0d8fda62d WatchSource:0}: Error finding container 20a79d3ec0ecfd4af4779d5e0f0d1df25431f59aeeb6e184166210f0d8fda62d: Status 404 returned error can't find the container with id 20a79d3ec0ecfd4af4779d5e0f0d1df25431f59aeeb6e184166210f0d8fda62d Mar 17 01:08:39 crc kubenswrapper[4755]: I0317 01:08:39.903211 4755 generic.go:334] "Generic (PLEG): container finished" podID="b088a0e9-37cf-444f-a222-75dc7cf01cf4" containerID="8a0a17513a4081cbe7d4b184e2550f59d749e66d108ccc3470155bfd6cf29d58" exitCode=0 Mar 17 01:08:39 crc kubenswrapper[4755]: I0317 01:08:39.903345 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdb6z" event={"ID":"b088a0e9-37cf-444f-a222-75dc7cf01cf4","Type":"ContainerDied","Data":"8a0a17513a4081cbe7d4b184e2550f59d749e66d108ccc3470155bfd6cf29d58"} Mar 17 01:08:39 crc kubenswrapper[4755]: I0317 01:08:39.903428 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdb6z" 
event={"ID":"b088a0e9-37cf-444f-a222-75dc7cf01cf4","Type":"ContainerStarted","Data":"20a79d3ec0ecfd4af4779d5e0f0d1df25431f59aeeb6e184166210f0d8fda62d"} Mar 17 01:08:39 crc kubenswrapper[4755]: I0317 01:08:39.911885 4755 generic.go:334] "Generic (PLEG): container finished" podID="522de51f-8590-4957-9f28-c2bdbc29f858" containerID="c2201c691b54c5193a197fafa891829ca416e489958170c1a64d2401de2a806b" exitCode=0 Mar 17 01:08:39 crc kubenswrapper[4755]: I0317 01:08:39.911969 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk6rh" event={"ID":"522de51f-8590-4957-9f28-c2bdbc29f858","Type":"ContainerDied","Data":"c2201c691b54c5193a197fafa891829ca416e489958170c1a64d2401de2a806b"} Mar 17 01:08:40 crc kubenswrapper[4755]: I0317 01:08:40.929338 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdb6z" event={"ID":"b088a0e9-37cf-444f-a222-75dc7cf01cf4","Type":"ContainerStarted","Data":"e81239c36287d1c04f6e985272df891e143ee2207df1c398a49ef759bdc718ed"} Mar 17 01:08:40 crc kubenswrapper[4755]: I0317 01:08:40.934378 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk6rh" event={"ID":"522de51f-8590-4957-9f28-c2bdbc29f858","Type":"ContainerStarted","Data":"1595723b4ae39cf4233a9a55e2ad12b485ef514bedb9b0343239f4c5f6116a52"} Mar 17 01:08:41 crc kubenswrapper[4755]: I0317 01:08:41.010151 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xk6rh" podStartSLOduration=2.460163084 podStartE2EDuration="5.010123663s" podCreationTimestamp="2026-03-17 01:08:36 +0000 UTC" firstStartedPulling="2026-03-17 01:08:37.874457929 +0000 UTC m=+2792.633910222" lastFinishedPulling="2026-03-17 01:08:40.424418508 +0000 UTC m=+2795.183870801" observedRunningTime="2026-03-17 01:08:40.992677836 +0000 UTC m=+2795.752130149" watchObservedRunningTime="2026-03-17 01:08:41.010123663 +0000 UTC 
m=+2795.769575976" Mar 17 01:08:42 crc kubenswrapper[4755]: I0317 01:08:42.960000 4755 generic.go:334] "Generic (PLEG): container finished" podID="b088a0e9-37cf-444f-a222-75dc7cf01cf4" containerID="e81239c36287d1c04f6e985272df891e143ee2207df1c398a49ef759bdc718ed" exitCode=0 Mar 17 01:08:42 crc kubenswrapper[4755]: I0317 01:08:42.960692 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdb6z" event={"ID":"b088a0e9-37cf-444f-a222-75dc7cf01cf4","Type":"ContainerDied","Data":"e81239c36287d1c04f6e985272df891e143ee2207df1c398a49ef759bdc718ed"} Mar 17 01:08:43 crc kubenswrapper[4755]: I0317 01:08:43.999182 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdb6z" event={"ID":"b088a0e9-37cf-444f-a222-75dc7cf01cf4","Type":"ContainerStarted","Data":"aa24cf12e64230401f3bec636f78f974e55c2e3d187b62d13881edeac080a979"} Mar 17 01:08:44 crc kubenswrapper[4755]: I0317 01:08:44.021518 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hdb6z" podStartSLOduration=2.539420218 podStartE2EDuration="6.021501179s" podCreationTimestamp="2026-03-17 01:08:38 +0000 UTC" firstStartedPulling="2026-03-17 01:08:39.906323863 +0000 UTC m=+2794.665776186" lastFinishedPulling="2026-03-17 01:08:43.388404824 +0000 UTC m=+2798.147857147" observedRunningTime="2026-03-17 01:08:44.017371208 +0000 UTC m=+2798.776823491" watchObservedRunningTime="2026-03-17 01:08:44.021501179 +0000 UTC m=+2798.780953462" Mar 17 01:08:46 crc kubenswrapper[4755]: I0317 01:08:46.670209 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xk6rh" Mar 17 01:08:46 crc kubenswrapper[4755]: I0317 01:08:46.671588 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xk6rh" Mar 17 01:08:47 crc kubenswrapper[4755]: I0317 01:08:47.033674 4755 
generic.go:334] "Generic (PLEG): container finished" podID="77388131-a3d3-451a-935d-2181b1fe1216" containerID="347c2934d590ee41f747b23171637ace7c126262dfb4b8af2c54f0792fac2295" exitCode=0 Mar 17 01:08:47 crc kubenswrapper[4755]: I0317 01:08:47.033731 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" event={"ID":"77388131-a3d3-451a-935d-2181b1fe1216","Type":"ContainerDied","Data":"347c2934d590ee41f747b23171637ace7c126262dfb4b8af2c54f0792fac2295"} Mar 17 01:08:47 crc kubenswrapper[4755]: I0317 01:08:47.728170 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-xk6rh" podUID="522de51f-8590-4957-9f28-c2bdbc29f858" containerName="registry-server" probeResult="failure" output=< Mar 17 01:08:47 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 01:08:47 crc kubenswrapper[4755]: > Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.522969 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.629295 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-2\") pod \"77388131-a3d3-451a-935d-2181b1fe1216\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.629547 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-inventory\") pod \"77388131-a3d3-451a-935d-2181b1fe1216\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.629577 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-0\") pod \"77388131-a3d3-451a-935d-2181b1fe1216\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.629753 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-1\") pod \"77388131-a3d3-451a-935d-2181b1fe1216\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.629796 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnpk5\" (UniqueName: \"kubernetes.io/projected/77388131-a3d3-451a-935d-2181b1fe1216-kube-api-access-gnpk5\") pod \"77388131-a3d3-451a-935d-2181b1fe1216\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " Mar 17 01:08:48 crc kubenswrapper[4755]: 
I0317 01:08:48.629820 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ssh-key-openstack-edpm-ipam\") pod \"77388131-a3d3-451a-935d-2181b1fe1216\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.629847 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-telemetry-power-monitoring-combined-ca-bundle\") pod \"77388131-a3d3-451a-935d-2181b1fe1216\" (UID: \"77388131-a3d3-451a-935d-2181b1fe1216\") " Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.635984 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "77388131-a3d3-451a-935d-2181b1fe1216" (UID: "77388131-a3d3-451a-935d-2181b1fe1216"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.637688 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77388131-a3d3-451a-935d-2181b1fe1216-kube-api-access-gnpk5" (OuterVolumeSpecName: "kube-api-access-gnpk5") pod "77388131-a3d3-451a-935d-2181b1fe1216" (UID: "77388131-a3d3-451a-935d-2181b1fe1216"). InnerVolumeSpecName "kube-api-access-gnpk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.670152 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-inventory" (OuterVolumeSpecName: "inventory") pod "77388131-a3d3-451a-935d-2181b1fe1216" (UID: "77388131-a3d3-451a-935d-2181b1fe1216"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.677043 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "77388131-a3d3-451a-935d-2181b1fe1216" (UID: "77388131-a3d3-451a-935d-2181b1fe1216"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.683568 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "77388131-a3d3-451a-935d-2181b1fe1216" (UID: "77388131-a3d3-451a-935d-2181b1fe1216"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.684851 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "77388131-a3d3-451a-935d-2181b1fe1216" (UID: "77388131-a3d3-451a-935d-2181b1fe1216"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.689773 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "77388131-a3d3-451a-935d-2181b1fe1216" (UID: "77388131-a3d3-451a-935d-2181b1fe1216"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.732552 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.732616 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.732641 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.732664 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnpk5\" (UniqueName: \"kubernetes.io/projected/77388131-a3d3-451a-935d-2181b1fe1216-kube-api-access-gnpk5\") on node \"crc\" DevicePath \"\"" Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.732685 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 
01:08:48.732705 4755 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.732726 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/77388131-a3d3-451a-935d-2181b1fe1216-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.978198 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hdb6z" Mar 17 01:08:48 crc kubenswrapper[4755]: I0317 01:08:48.978582 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hdb6z" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.051784 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hdb6z" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.057404 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" event={"ID":"77388131-a3d3-451a-935d-2181b1fe1216","Type":"ContainerDied","Data":"bb11f52300c1ff77ce941f18f002bd397b0a3c79c376bbb641c86740acba4486"} Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.057483 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb11f52300c1ff77ce941f18f002bd397b0a3c79c376bbb641c86740acba4486" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.057502 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.117868 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hdb6z" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.173396 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2"] Mar 17 01:08:49 crc kubenswrapper[4755]: E0317 01:08:49.173793 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77388131-a3d3-451a-935d-2181b1fe1216" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.173812 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="77388131-a3d3-451a-935d-2181b1fe1216" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.173994 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="77388131-a3d3-451a-935d-2181b1fe1216" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.175620 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.177652 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.177945 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.178461 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.178844 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.179161 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.195747 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2"] Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.305882 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hdb6z"] Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.345379 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc5g9\" (UniqueName: \"kubernetes.io/projected/e404edba-8f9a-4d92-a62c-afbc8fe269b5-kube-api-access-fc5g9\") pod \"logging-edpm-deployment-openstack-edpm-ipam-nngr2\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.345529 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-nngr2\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.345613 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-nngr2\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.345692 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-nngr2\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.345736 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-nngr2\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.447668 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-nngr2\" (UID: 
\"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.448141 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-nngr2\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.448467 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-nngr2\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.448726 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-nngr2\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.449015 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc5g9\" (UniqueName: \"kubernetes.io/projected/e404edba-8f9a-4d92-a62c-afbc8fe269b5-kube-api-access-fc5g9\") pod \"logging-edpm-deployment-openstack-edpm-ipam-nngr2\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.454017 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-nngr2\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.455058 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-nngr2\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.455661 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-nngr2\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.456415 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-nngr2\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.475647 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc5g9\" (UniqueName: \"kubernetes.io/projected/e404edba-8f9a-4d92-a62c-afbc8fe269b5-kube-api-access-fc5g9\") pod 
\"logging-edpm-deployment-openstack-edpm-ipam-nngr2\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:08:49 crc kubenswrapper[4755]: I0317 01:08:49.494269 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:08:50 crc kubenswrapper[4755]: I0317 01:08:50.075917 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2"] Mar 17 01:08:50 crc kubenswrapper[4755]: W0317 01:08:50.078254 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode404edba_8f9a_4d92_a62c_afbc8fe269b5.slice/crio-ff7a76379cf7fcd43cda37342c6ba9f7c4c7bf95f525e7ee2667604e3922c3fd WatchSource:0}: Error finding container ff7a76379cf7fcd43cda37342c6ba9f7c4c7bf95f525e7ee2667604e3922c3fd: Status 404 returned error can't find the container with id ff7a76379cf7fcd43cda37342c6ba9f7c4c7bf95f525e7ee2667604e3922c3fd Mar 17 01:08:51 crc kubenswrapper[4755]: I0317 01:08:51.133247 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hdb6z" podUID="b088a0e9-37cf-444f-a222-75dc7cf01cf4" containerName="registry-server" containerID="cri-o://aa24cf12e64230401f3bec636f78f974e55c2e3d187b62d13881edeac080a979" gracePeriod=2 Mar 17 01:08:51 crc kubenswrapper[4755]: I0317 01:08:51.134203 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" event={"ID":"e404edba-8f9a-4d92-a62c-afbc8fe269b5","Type":"ContainerStarted","Data":"5c6f062ac79dea1043f91101b877d379afa54bdc9fc0e469eaf96787ff96ec2b"} Mar 17 01:08:51 crc kubenswrapper[4755]: I0317 01:08:51.135327 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" 
event={"ID":"e404edba-8f9a-4d92-a62c-afbc8fe269b5","Type":"ContainerStarted","Data":"ff7a76379cf7fcd43cda37342c6ba9f7c4c7bf95f525e7ee2667604e3922c3fd"} Mar 17 01:08:51 crc kubenswrapper[4755]: I0317 01:08:51.170091 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" podStartSLOduration=1.6444084129999998 podStartE2EDuration="2.170066781s" podCreationTimestamp="2026-03-17 01:08:49 +0000 UTC" firstStartedPulling="2026-03-17 01:08:50.084421306 +0000 UTC m=+2804.843873599" lastFinishedPulling="2026-03-17 01:08:50.610079644 +0000 UTC m=+2805.369531967" observedRunningTime="2026-03-17 01:08:51.169942037 +0000 UTC m=+2805.929394390" watchObservedRunningTime="2026-03-17 01:08:51.170066781 +0000 UTC m=+2805.929519074" Mar 17 01:08:51 crc kubenswrapper[4755]: I0317 01:08:51.849249 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hdb6z" Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.011148 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b088a0e9-37cf-444f-a222-75dc7cf01cf4-utilities\") pod \"b088a0e9-37cf-444f-a222-75dc7cf01cf4\" (UID: \"b088a0e9-37cf-444f-a222-75dc7cf01cf4\") " Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.011255 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tlsw\" (UniqueName: \"kubernetes.io/projected/b088a0e9-37cf-444f-a222-75dc7cf01cf4-kube-api-access-5tlsw\") pod \"b088a0e9-37cf-444f-a222-75dc7cf01cf4\" (UID: \"b088a0e9-37cf-444f-a222-75dc7cf01cf4\") " Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.011437 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b088a0e9-37cf-444f-a222-75dc7cf01cf4-catalog-content\") pod 
\"b088a0e9-37cf-444f-a222-75dc7cf01cf4\" (UID: \"b088a0e9-37cf-444f-a222-75dc7cf01cf4\") " Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.012138 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b088a0e9-37cf-444f-a222-75dc7cf01cf4-utilities" (OuterVolumeSpecName: "utilities") pod "b088a0e9-37cf-444f-a222-75dc7cf01cf4" (UID: "b088a0e9-37cf-444f-a222-75dc7cf01cf4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.017806 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b088a0e9-37cf-444f-a222-75dc7cf01cf4-kube-api-access-5tlsw" (OuterVolumeSpecName: "kube-api-access-5tlsw") pod "b088a0e9-37cf-444f-a222-75dc7cf01cf4" (UID: "b088a0e9-37cf-444f-a222-75dc7cf01cf4"). InnerVolumeSpecName "kube-api-access-5tlsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.088169 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b088a0e9-37cf-444f-a222-75dc7cf01cf4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b088a0e9-37cf-444f-a222-75dc7cf01cf4" (UID: "b088a0e9-37cf-444f-a222-75dc7cf01cf4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.113186 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b088a0e9-37cf-444f-a222-75dc7cf01cf4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.113386 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b088a0e9-37cf-444f-a222-75dc7cf01cf4-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.113399 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tlsw\" (UniqueName: \"kubernetes.io/projected/b088a0e9-37cf-444f-a222-75dc7cf01cf4-kube-api-access-5tlsw\") on node \"crc\" DevicePath \"\"" Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.143762 4755 generic.go:334] "Generic (PLEG): container finished" podID="b088a0e9-37cf-444f-a222-75dc7cf01cf4" containerID="aa24cf12e64230401f3bec636f78f974e55c2e3d187b62d13881edeac080a979" exitCode=0 Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.143801 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdb6z" event={"ID":"b088a0e9-37cf-444f-a222-75dc7cf01cf4","Type":"ContainerDied","Data":"aa24cf12e64230401f3bec636f78f974e55c2e3d187b62d13881edeac080a979"} Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.143836 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdb6z" Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.143846 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdb6z" event={"ID":"b088a0e9-37cf-444f-a222-75dc7cf01cf4","Type":"ContainerDied","Data":"20a79d3ec0ecfd4af4779d5e0f0d1df25431f59aeeb6e184166210f0d8fda62d"} Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.143860 4755 scope.go:117] "RemoveContainer" containerID="aa24cf12e64230401f3bec636f78f974e55c2e3d187b62d13881edeac080a979" Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.163547 4755 scope.go:117] "RemoveContainer" containerID="e81239c36287d1c04f6e985272df891e143ee2207df1c398a49ef759bdc718ed" Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.189166 4755 scope.go:117] "RemoveContainer" containerID="8a0a17513a4081cbe7d4b184e2550f59d749e66d108ccc3470155bfd6cf29d58" Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.208068 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hdb6z"] Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.217322 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hdb6z"] Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.230839 4755 scope.go:117] "RemoveContainer" containerID="aa24cf12e64230401f3bec636f78f974e55c2e3d187b62d13881edeac080a979" Mar 17 01:08:52 crc kubenswrapper[4755]: E0317 01:08:52.231300 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa24cf12e64230401f3bec636f78f974e55c2e3d187b62d13881edeac080a979\": container with ID starting with aa24cf12e64230401f3bec636f78f974e55c2e3d187b62d13881edeac080a979 not found: ID does not exist" containerID="aa24cf12e64230401f3bec636f78f974e55c2e3d187b62d13881edeac080a979" Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.231372 4755 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa24cf12e64230401f3bec636f78f974e55c2e3d187b62d13881edeac080a979"} err="failed to get container status \"aa24cf12e64230401f3bec636f78f974e55c2e3d187b62d13881edeac080a979\": rpc error: code = NotFound desc = could not find container \"aa24cf12e64230401f3bec636f78f974e55c2e3d187b62d13881edeac080a979\": container with ID starting with aa24cf12e64230401f3bec636f78f974e55c2e3d187b62d13881edeac080a979 not found: ID does not exist" Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.231631 4755 scope.go:117] "RemoveContainer" containerID="e81239c36287d1c04f6e985272df891e143ee2207df1c398a49ef759bdc718ed" Mar 17 01:08:52 crc kubenswrapper[4755]: E0317 01:08:52.232044 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e81239c36287d1c04f6e985272df891e143ee2207df1c398a49ef759bdc718ed\": container with ID starting with e81239c36287d1c04f6e985272df891e143ee2207df1c398a49ef759bdc718ed not found: ID does not exist" containerID="e81239c36287d1c04f6e985272df891e143ee2207df1c398a49ef759bdc718ed" Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.232104 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e81239c36287d1c04f6e985272df891e143ee2207df1c398a49ef759bdc718ed"} err="failed to get container status \"e81239c36287d1c04f6e985272df891e143ee2207df1c398a49ef759bdc718ed\": rpc error: code = NotFound desc = could not find container \"e81239c36287d1c04f6e985272df891e143ee2207df1c398a49ef759bdc718ed\": container with ID starting with e81239c36287d1c04f6e985272df891e143ee2207df1c398a49ef759bdc718ed not found: ID does not exist" Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.232146 4755 scope.go:117] "RemoveContainer" containerID="8a0a17513a4081cbe7d4b184e2550f59d749e66d108ccc3470155bfd6cf29d58" Mar 17 01:08:52 crc kubenswrapper[4755]: E0317 
01:08:52.232682 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a0a17513a4081cbe7d4b184e2550f59d749e66d108ccc3470155bfd6cf29d58\": container with ID starting with 8a0a17513a4081cbe7d4b184e2550f59d749e66d108ccc3470155bfd6cf29d58 not found: ID does not exist" containerID="8a0a17513a4081cbe7d4b184e2550f59d749e66d108ccc3470155bfd6cf29d58" Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.232738 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a0a17513a4081cbe7d4b184e2550f59d749e66d108ccc3470155bfd6cf29d58"} err="failed to get container status \"8a0a17513a4081cbe7d4b184e2550f59d749e66d108ccc3470155bfd6cf29d58\": rpc error: code = NotFound desc = could not find container \"8a0a17513a4081cbe7d4b184e2550f59d749e66d108ccc3470155bfd6cf29d58\": container with ID starting with 8a0a17513a4081cbe7d4b184e2550f59d749e66d108ccc3470155bfd6cf29d58 not found: ID does not exist" Mar 17 01:08:52 crc kubenswrapper[4755]: I0317 01:08:52.264391 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b088a0e9-37cf-444f-a222-75dc7cf01cf4" path="/var/lib/kubelet/pods/b088a0e9-37cf-444f-a222-75dc7cf01cf4/volumes" Mar 17 01:08:56 crc kubenswrapper[4755]: I0317 01:08:56.754935 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xk6rh" Mar 17 01:08:56 crc kubenswrapper[4755]: I0317 01:08:56.830262 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xk6rh" Mar 17 01:08:57 crc kubenswrapper[4755]: I0317 01:08:57.009992 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xk6rh"] Mar 17 01:08:58 crc kubenswrapper[4755]: I0317 01:08:58.218474 4755 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-xk6rh" podUID="522de51f-8590-4957-9f28-c2bdbc29f858" containerName="registry-server" containerID="cri-o://1595723b4ae39cf4233a9a55e2ad12b485ef514bedb9b0343239f4c5f6116a52" gracePeriod=2 Mar 17 01:08:58 crc kubenswrapper[4755]: I0317 01:08:58.785292 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xk6rh" Mar 17 01:08:58 crc kubenswrapper[4755]: I0317 01:08:58.868309 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/522de51f-8590-4957-9f28-c2bdbc29f858-catalog-content\") pod \"522de51f-8590-4957-9f28-c2bdbc29f858\" (UID: \"522de51f-8590-4957-9f28-c2bdbc29f858\") " Mar 17 01:08:58 crc kubenswrapper[4755]: I0317 01:08:58.868845 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8gql\" (UniqueName: \"kubernetes.io/projected/522de51f-8590-4957-9f28-c2bdbc29f858-kube-api-access-p8gql\") pod \"522de51f-8590-4957-9f28-c2bdbc29f858\" (UID: \"522de51f-8590-4957-9f28-c2bdbc29f858\") " Mar 17 01:08:58 crc kubenswrapper[4755]: I0317 01:08:58.869019 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/522de51f-8590-4957-9f28-c2bdbc29f858-utilities\") pod \"522de51f-8590-4957-9f28-c2bdbc29f858\" (UID: \"522de51f-8590-4957-9f28-c2bdbc29f858\") " Mar 17 01:08:58 crc kubenswrapper[4755]: I0317 01:08:58.870090 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/522de51f-8590-4957-9f28-c2bdbc29f858-utilities" (OuterVolumeSpecName: "utilities") pod "522de51f-8590-4957-9f28-c2bdbc29f858" (UID: "522de51f-8590-4957-9f28-c2bdbc29f858"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:08:58 crc kubenswrapper[4755]: I0317 01:08:58.876746 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/522de51f-8590-4957-9f28-c2bdbc29f858-kube-api-access-p8gql" (OuterVolumeSpecName: "kube-api-access-p8gql") pod "522de51f-8590-4957-9f28-c2bdbc29f858" (UID: "522de51f-8590-4957-9f28-c2bdbc29f858"). InnerVolumeSpecName "kube-api-access-p8gql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:08:58 crc kubenswrapper[4755]: I0317 01:08:58.914700 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/522de51f-8590-4957-9f28-c2bdbc29f858-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "522de51f-8590-4957-9f28-c2bdbc29f858" (UID: "522de51f-8590-4957-9f28-c2bdbc29f858"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:08:58 crc kubenswrapper[4755]: I0317 01:08:58.972112 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/522de51f-8590-4957-9f28-c2bdbc29f858-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:08:58 crc kubenswrapper[4755]: I0317 01:08:58.972150 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/522de51f-8590-4957-9f28-c2bdbc29f858-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:08:58 crc kubenswrapper[4755]: I0317 01:08:58.972163 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8gql\" (UniqueName: \"kubernetes.io/projected/522de51f-8590-4957-9f28-c2bdbc29f858-kube-api-access-p8gql\") on node \"crc\" DevicePath \"\"" Mar 17 01:08:59 crc kubenswrapper[4755]: I0317 01:08:59.234064 4755 generic.go:334] "Generic (PLEG): container finished" podID="522de51f-8590-4957-9f28-c2bdbc29f858" 
containerID="1595723b4ae39cf4233a9a55e2ad12b485ef514bedb9b0343239f4c5f6116a52" exitCode=0 Mar 17 01:08:59 crc kubenswrapper[4755]: I0317 01:08:59.234125 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk6rh" event={"ID":"522de51f-8590-4957-9f28-c2bdbc29f858","Type":"ContainerDied","Data":"1595723b4ae39cf4233a9a55e2ad12b485ef514bedb9b0343239f4c5f6116a52"} Mar 17 01:08:59 crc kubenswrapper[4755]: I0317 01:08:59.234143 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xk6rh" Mar 17 01:08:59 crc kubenswrapper[4755]: I0317 01:08:59.234174 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xk6rh" event={"ID":"522de51f-8590-4957-9f28-c2bdbc29f858","Type":"ContainerDied","Data":"65ec7b5523cefc6fc71455a64b1f10e3268a24bfcc6a11902bc482f67c1251d8"} Mar 17 01:08:59 crc kubenswrapper[4755]: I0317 01:08:59.234203 4755 scope.go:117] "RemoveContainer" containerID="1595723b4ae39cf4233a9a55e2ad12b485ef514bedb9b0343239f4c5f6116a52" Mar 17 01:08:59 crc kubenswrapper[4755]: I0317 01:08:59.269749 4755 scope.go:117] "RemoveContainer" containerID="c2201c691b54c5193a197fafa891829ca416e489958170c1a64d2401de2a806b" Mar 17 01:08:59 crc kubenswrapper[4755]: I0317 01:08:59.309008 4755 scope.go:117] "RemoveContainer" containerID="8234b402087fd9ad425a80cb73ed6d2d3a6b4f8b9215af1dad75c3a9b65cf616" Mar 17 01:08:59 crc kubenswrapper[4755]: I0317 01:08:59.310223 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xk6rh"] Mar 17 01:08:59 crc kubenswrapper[4755]: I0317 01:08:59.320205 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xk6rh"] Mar 17 01:08:59 crc kubenswrapper[4755]: I0317 01:08:59.377788 4755 scope.go:117] "RemoveContainer" containerID="1595723b4ae39cf4233a9a55e2ad12b485ef514bedb9b0343239f4c5f6116a52" Mar 17 
01:08:59 crc kubenswrapper[4755]: E0317 01:08:59.378419 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1595723b4ae39cf4233a9a55e2ad12b485ef514bedb9b0343239f4c5f6116a52\": container with ID starting with 1595723b4ae39cf4233a9a55e2ad12b485ef514bedb9b0343239f4c5f6116a52 not found: ID does not exist" containerID="1595723b4ae39cf4233a9a55e2ad12b485ef514bedb9b0343239f4c5f6116a52" Mar 17 01:08:59 crc kubenswrapper[4755]: I0317 01:08:59.378509 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1595723b4ae39cf4233a9a55e2ad12b485ef514bedb9b0343239f4c5f6116a52"} err="failed to get container status \"1595723b4ae39cf4233a9a55e2ad12b485ef514bedb9b0343239f4c5f6116a52\": rpc error: code = NotFound desc = could not find container \"1595723b4ae39cf4233a9a55e2ad12b485ef514bedb9b0343239f4c5f6116a52\": container with ID starting with 1595723b4ae39cf4233a9a55e2ad12b485ef514bedb9b0343239f4c5f6116a52 not found: ID does not exist" Mar 17 01:08:59 crc kubenswrapper[4755]: I0317 01:08:59.378566 4755 scope.go:117] "RemoveContainer" containerID="c2201c691b54c5193a197fafa891829ca416e489958170c1a64d2401de2a806b" Mar 17 01:08:59 crc kubenswrapper[4755]: E0317 01:08:59.379106 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2201c691b54c5193a197fafa891829ca416e489958170c1a64d2401de2a806b\": container with ID starting with c2201c691b54c5193a197fafa891829ca416e489958170c1a64d2401de2a806b not found: ID does not exist" containerID="c2201c691b54c5193a197fafa891829ca416e489958170c1a64d2401de2a806b" Mar 17 01:08:59 crc kubenswrapper[4755]: I0317 01:08:59.379145 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2201c691b54c5193a197fafa891829ca416e489958170c1a64d2401de2a806b"} err="failed to get container status 
\"c2201c691b54c5193a197fafa891829ca416e489958170c1a64d2401de2a806b\": rpc error: code = NotFound desc = could not find container \"c2201c691b54c5193a197fafa891829ca416e489958170c1a64d2401de2a806b\": container with ID starting with c2201c691b54c5193a197fafa891829ca416e489958170c1a64d2401de2a806b not found: ID does not exist" Mar 17 01:08:59 crc kubenswrapper[4755]: I0317 01:08:59.379173 4755 scope.go:117] "RemoveContainer" containerID="8234b402087fd9ad425a80cb73ed6d2d3a6b4f8b9215af1dad75c3a9b65cf616" Mar 17 01:08:59 crc kubenswrapper[4755]: E0317 01:08:59.379992 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8234b402087fd9ad425a80cb73ed6d2d3a6b4f8b9215af1dad75c3a9b65cf616\": container with ID starting with 8234b402087fd9ad425a80cb73ed6d2d3a6b4f8b9215af1dad75c3a9b65cf616 not found: ID does not exist" containerID="8234b402087fd9ad425a80cb73ed6d2d3a6b4f8b9215af1dad75c3a9b65cf616" Mar 17 01:08:59 crc kubenswrapper[4755]: I0317 01:08:59.380037 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8234b402087fd9ad425a80cb73ed6d2d3a6b4f8b9215af1dad75c3a9b65cf616"} err="failed to get container status \"8234b402087fd9ad425a80cb73ed6d2d3a6b4f8b9215af1dad75c3a9b65cf616\": rpc error: code = NotFound desc = could not find container \"8234b402087fd9ad425a80cb73ed6d2d3a6b4f8b9215af1dad75c3a9b65cf616\": container with ID starting with 8234b402087fd9ad425a80cb73ed6d2d3a6b4f8b9215af1dad75c3a9b65cf616 not found: ID does not exist" Mar 17 01:09:00 crc kubenswrapper[4755]: I0317 01:09:00.273518 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="522de51f-8590-4957-9f28-c2bdbc29f858" path="/var/lib/kubelet/pods/522de51f-8590-4957-9f28-c2bdbc29f858/volumes" Mar 17 01:09:00 crc kubenswrapper[4755]: I0317 01:09:00.688928 4755 scope.go:117] "RemoveContainer" containerID="337db9d6809a64af207530abcb6bae766436065e76958823fe2fee6fd77cda4b" Mar 17 
01:09:10 crc kubenswrapper[4755]: I0317 01:09:10.384669 4755 generic.go:334] "Generic (PLEG): container finished" podID="e404edba-8f9a-4d92-a62c-afbc8fe269b5" containerID="5c6f062ac79dea1043f91101b877d379afa54bdc9fc0e469eaf96787ff96ec2b" exitCode=0 Mar 17 01:09:10 crc kubenswrapper[4755]: I0317 01:09:10.384764 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" event={"ID":"e404edba-8f9a-4d92-a62c-afbc8fe269b5","Type":"ContainerDied","Data":"5c6f062ac79dea1043f91101b877d379afa54bdc9fc0e469eaf96787ff96ec2b"} Mar 17 01:09:11 crc kubenswrapper[4755]: I0317 01:09:11.937319 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.002808 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-logging-compute-config-data-0\") pod \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.002914 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-ssh-key-openstack-edpm-ipam\") pod \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.002994 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-logging-compute-config-data-1\") pod \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.003188 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-inventory\") pod \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.003230 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc5g9\" (UniqueName: \"kubernetes.io/projected/e404edba-8f9a-4d92-a62c-afbc8fe269b5-kube-api-access-fc5g9\") pod \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\" (UID: \"e404edba-8f9a-4d92-a62c-afbc8fe269b5\") " Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.015615 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e404edba-8f9a-4d92-a62c-afbc8fe269b5-kube-api-access-fc5g9" (OuterVolumeSpecName: "kube-api-access-fc5g9") pod "e404edba-8f9a-4d92-a62c-afbc8fe269b5" (UID: "e404edba-8f9a-4d92-a62c-afbc8fe269b5"). InnerVolumeSpecName "kube-api-access-fc5g9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.035810 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "e404edba-8f9a-4d92-a62c-afbc8fe269b5" (UID: "e404edba-8f9a-4d92-a62c-afbc8fe269b5"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.041100 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-inventory" (OuterVolumeSpecName: "inventory") pod "e404edba-8f9a-4d92-a62c-afbc8fe269b5" (UID: "e404edba-8f9a-4d92-a62c-afbc8fe269b5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.042112 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "e404edba-8f9a-4d92-a62c-afbc8fe269b5" (UID: "e404edba-8f9a-4d92-a62c-afbc8fe269b5"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.071763 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e404edba-8f9a-4d92-a62c-afbc8fe269b5" (UID: "e404edba-8f9a-4d92-a62c-afbc8fe269b5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.105266 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.105310 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc5g9\" (UniqueName: \"kubernetes.io/projected/e404edba-8f9a-4d92-a62c-afbc8fe269b5-kube-api-access-fc5g9\") on node \"crc\" DevicePath \"\"" Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.105325 4755 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.105337 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.105350 4755 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e404edba-8f9a-4d92-a62c-afbc8fe269b5-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.407185 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" event={"ID":"e404edba-8f9a-4d92-a62c-afbc8fe269b5","Type":"ContainerDied","Data":"ff7a76379cf7fcd43cda37342c6ba9f7c4c7bf95f525e7ee2667604e3922c3fd"} Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.407225 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff7a76379cf7fcd43cda37342c6ba9f7c4c7bf95f525e7ee2667604e3922c3fd" Mar 17 01:09:12 crc kubenswrapper[4755]: I0317 01:09:12.407220 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2" Mar 17 01:09:28 crc kubenswrapper[4755]: I0317 01:09:28.665609 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:09:28 crc kubenswrapper[4755]: I0317 01:09:28.666577 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:09:58 crc kubenswrapper[4755]: I0317 01:09:58.665098 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:09:58 crc kubenswrapper[4755]: I0317 01:09:58.666079 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.160261 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561830-xpx96"] Mar 17 01:10:00 crc kubenswrapper[4755]: E0317 01:10:00.160748 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e404edba-8f9a-4d92-a62c-afbc8fe269b5" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 17 
01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.160763 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e404edba-8f9a-4d92-a62c-afbc8fe269b5" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 17 01:10:00 crc kubenswrapper[4755]: E0317 01:10:00.160790 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="522de51f-8590-4957-9f28-c2bdbc29f858" containerName="extract-content" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.160798 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="522de51f-8590-4957-9f28-c2bdbc29f858" containerName="extract-content" Mar 17 01:10:00 crc kubenswrapper[4755]: E0317 01:10:00.160811 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="522de51f-8590-4957-9f28-c2bdbc29f858" containerName="extract-utilities" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.160819 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="522de51f-8590-4957-9f28-c2bdbc29f858" containerName="extract-utilities" Mar 17 01:10:00 crc kubenswrapper[4755]: E0317 01:10:00.160835 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b088a0e9-37cf-444f-a222-75dc7cf01cf4" containerName="registry-server" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.160843 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b088a0e9-37cf-444f-a222-75dc7cf01cf4" containerName="registry-server" Mar 17 01:10:00 crc kubenswrapper[4755]: E0317 01:10:00.160856 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b088a0e9-37cf-444f-a222-75dc7cf01cf4" containerName="extract-content" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.160864 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b088a0e9-37cf-444f-a222-75dc7cf01cf4" containerName="extract-content" Mar 17 01:10:00 crc kubenswrapper[4755]: E0317 01:10:00.160881 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b088a0e9-37cf-444f-a222-75dc7cf01cf4" 
containerName="extract-utilities" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.160888 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b088a0e9-37cf-444f-a222-75dc7cf01cf4" containerName="extract-utilities" Mar 17 01:10:00 crc kubenswrapper[4755]: E0317 01:10:00.160907 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="522de51f-8590-4957-9f28-c2bdbc29f858" containerName="registry-server" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.160914 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="522de51f-8590-4957-9f28-c2bdbc29f858" containerName="registry-server" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.161155 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b088a0e9-37cf-444f-a222-75dc7cf01cf4" containerName="registry-server" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.161183 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="522de51f-8590-4957-9f28-c2bdbc29f858" containerName="registry-server" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.161197 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e404edba-8f9a-4d92-a62c-afbc8fe269b5" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.161973 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561830-xpx96" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.164146 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.165521 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.165758 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.197932 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561830-xpx96"] Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.325156 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbjck\" (UniqueName: \"kubernetes.io/projected/0df91bec-d970-450f-8a65-d1d1d929eff9-kube-api-access-bbjck\") pod \"auto-csr-approver-29561830-xpx96\" (UID: \"0df91bec-d970-450f-8a65-d1d1d929eff9\") " pod="openshift-infra/auto-csr-approver-29561830-xpx96" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.427789 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbjck\" (UniqueName: \"kubernetes.io/projected/0df91bec-d970-450f-8a65-d1d1d929eff9-kube-api-access-bbjck\") pod \"auto-csr-approver-29561830-xpx96\" (UID: \"0df91bec-d970-450f-8a65-d1d1d929eff9\") " pod="openshift-infra/auto-csr-approver-29561830-xpx96" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.453713 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbjck\" (UniqueName: \"kubernetes.io/projected/0df91bec-d970-450f-8a65-d1d1d929eff9-kube-api-access-bbjck\") pod \"auto-csr-approver-29561830-xpx96\" (UID: \"0df91bec-d970-450f-8a65-d1d1d929eff9\") " 
pod="openshift-infra/auto-csr-approver-29561830-xpx96" Mar 17 01:10:00 crc kubenswrapper[4755]: I0317 01:10:00.527651 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561830-xpx96" Mar 17 01:10:01 crc kubenswrapper[4755]: I0317 01:10:01.015225 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561830-xpx96"] Mar 17 01:10:01 crc kubenswrapper[4755]: I0317 01:10:01.035632 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561830-xpx96" event={"ID":"0df91bec-d970-450f-8a65-d1d1d929eff9","Type":"ContainerStarted","Data":"9937c16f6bdb98f41e6b1c0c4b14754b3dc56f5d703d02b0229004ec957b47fe"} Mar 17 01:10:03 crc kubenswrapper[4755]: I0317 01:10:03.062484 4755 generic.go:334] "Generic (PLEG): container finished" podID="0df91bec-d970-450f-8a65-d1d1d929eff9" containerID="a6fd1ddf7ca55120cf3035be92c6afc4c791aa3e057a137211317642bedcbead" exitCode=0 Mar 17 01:10:03 crc kubenswrapper[4755]: I0317 01:10:03.062637 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561830-xpx96" event={"ID":"0df91bec-d970-450f-8a65-d1d1d929eff9","Type":"ContainerDied","Data":"a6fd1ddf7ca55120cf3035be92c6afc4c791aa3e057a137211317642bedcbead"} Mar 17 01:10:04 crc kubenswrapper[4755]: I0317 01:10:04.512839 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561830-xpx96" Mar 17 01:10:04 crc kubenswrapper[4755]: I0317 01:10:04.617693 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbjck\" (UniqueName: \"kubernetes.io/projected/0df91bec-d970-450f-8a65-d1d1d929eff9-kube-api-access-bbjck\") pod \"0df91bec-d970-450f-8a65-d1d1d929eff9\" (UID: \"0df91bec-d970-450f-8a65-d1d1d929eff9\") " Mar 17 01:10:04 crc kubenswrapper[4755]: I0317 01:10:04.626584 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0df91bec-d970-450f-8a65-d1d1d929eff9-kube-api-access-bbjck" (OuterVolumeSpecName: "kube-api-access-bbjck") pod "0df91bec-d970-450f-8a65-d1d1d929eff9" (UID: "0df91bec-d970-450f-8a65-d1d1d929eff9"). InnerVolumeSpecName "kube-api-access-bbjck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:10:04 crc kubenswrapper[4755]: I0317 01:10:04.720847 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbjck\" (UniqueName: \"kubernetes.io/projected/0df91bec-d970-450f-8a65-d1d1d929eff9-kube-api-access-bbjck\") on node \"crc\" DevicePath \"\"" Mar 17 01:10:05 crc kubenswrapper[4755]: I0317 01:10:05.090187 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561830-xpx96" event={"ID":"0df91bec-d970-450f-8a65-d1d1d929eff9","Type":"ContainerDied","Data":"9937c16f6bdb98f41e6b1c0c4b14754b3dc56f5d703d02b0229004ec957b47fe"} Mar 17 01:10:05 crc kubenswrapper[4755]: I0317 01:10:05.090475 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9937c16f6bdb98f41e6b1c0c4b14754b3dc56f5d703d02b0229004ec957b47fe" Mar 17 01:10:05 crc kubenswrapper[4755]: I0317 01:10:05.090426 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561830-xpx96" Mar 17 01:10:05 crc kubenswrapper[4755]: I0317 01:10:05.611520 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561824-9wpxw"] Mar 17 01:10:05 crc kubenswrapper[4755]: I0317 01:10:05.619777 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561824-9wpxw"] Mar 17 01:10:06 crc kubenswrapper[4755]: I0317 01:10:06.266808 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3de24587-477f-43bd-a36f-d5cf535b3c87" path="/var/lib/kubelet/pods/3de24587-477f-43bd-a36f-d5cf535b3c87/volumes" Mar 17 01:10:28 crc kubenswrapper[4755]: I0317 01:10:28.665592 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:10:28 crc kubenswrapper[4755]: I0317 01:10:28.666087 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:10:28 crc kubenswrapper[4755]: I0317 01:10:28.666129 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 01:10:28 crc kubenswrapper[4755]: I0317 01:10:28.666839 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"724d2ea8ccfc68fca06c3de8383346b840d2506d8f185a3e871d51ca50c11915"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:10:28 crc kubenswrapper[4755]: I0317 01:10:28.666890 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://724d2ea8ccfc68fca06c3de8383346b840d2506d8f185a3e871d51ca50c11915" gracePeriod=600 Mar 17 01:10:29 crc kubenswrapper[4755]: I0317 01:10:29.410221 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="724d2ea8ccfc68fca06c3de8383346b840d2506d8f185a3e871d51ca50c11915" exitCode=0 Mar 17 01:10:29 crc kubenswrapper[4755]: I0317 01:10:29.411000 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"724d2ea8ccfc68fca06c3de8383346b840d2506d8f185a3e871d51ca50c11915"} Mar 17 01:10:29 crc kubenswrapper[4755]: I0317 01:10:29.411054 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a"} Mar 17 01:10:29 crc kubenswrapper[4755]: I0317 01:10:29.411091 4755 scope.go:117] "RemoveContainer" containerID="65aebc99a758645a913f7ad5b880b27114ca12fee180f846f670af0740013420" Mar 17 01:11:00 crc kubenswrapper[4755]: I0317 01:11:00.860189 4755 scope.go:117] "RemoveContainer" containerID="676cf755b51dd3f3afc292c313f0fb9116b903e84962f7b2fd85d590a59d9834" Mar 17 01:12:00 crc kubenswrapper[4755]: I0317 01:12:00.176181 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561832-vd84x"] Mar 17 01:12:00 crc kubenswrapper[4755]: E0317 
01:12:00.177860 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0df91bec-d970-450f-8a65-d1d1d929eff9" containerName="oc" Mar 17 01:12:00 crc kubenswrapper[4755]: I0317 01:12:00.177895 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0df91bec-d970-450f-8a65-d1d1d929eff9" containerName="oc" Mar 17 01:12:00 crc kubenswrapper[4755]: I0317 01:12:00.178358 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="0df91bec-d970-450f-8a65-d1d1d929eff9" containerName="oc" Mar 17 01:12:00 crc kubenswrapper[4755]: I0317 01:12:00.180150 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561832-vd84x" Mar 17 01:12:00 crc kubenswrapper[4755]: I0317 01:12:00.184719 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:12:00 crc kubenswrapper[4755]: I0317 01:12:00.184794 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:12:00 crc kubenswrapper[4755]: I0317 01:12:00.185336 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:12:00 crc kubenswrapper[4755]: I0317 01:12:00.215539 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561832-vd84x"] Mar 17 01:12:00 crc kubenswrapper[4755]: I0317 01:12:00.328307 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcnfs\" (UniqueName: \"kubernetes.io/projected/5e1dcd54-6ffe-43bb-8676-fc0b90687777-kube-api-access-vcnfs\") pod \"auto-csr-approver-29561832-vd84x\" (UID: \"5e1dcd54-6ffe-43bb-8676-fc0b90687777\") " pod="openshift-infra/auto-csr-approver-29561832-vd84x" Mar 17 01:12:00 crc kubenswrapper[4755]: I0317 01:12:00.431547 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vcnfs\" (UniqueName: \"kubernetes.io/projected/5e1dcd54-6ffe-43bb-8676-fc0b90687777-kube-api-access-vcnfs\") pod \"auto-csr-approver-29561832-vd84x\" (UID: \"5e1dcd54-6ffe-43bb-8676-fc0b90687777\") " pod="openshift-infra/auto-csr-approver-29561832-vd84x" Mar 17 01:12:00 crc kubenswrapper[4755]: I0317 01:12:00.456421 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcnfs\" (UniqueName: \"kubernetes.io/projected/5e1dcd54-6ffe-43bb-8676-fc0b90687777-kube-api-access-vcnfs\") pod \"auto-csr-approver-29561832-vd84x\" (UID: \"5e1dcd54-6ffe-43bb-8676-fc0b90687777\") " pod="openshift-infra/auto-csr-approver-29561832-vd84x" Mar 17 01:12:00 crc kubenswrapper[4755]: I0317 01:12:00.511929 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561832-vd84x" Mar 17 01:12:01 crc kubenswrapper[4755]: I0317 01:12:01.140467 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561832-vd84x"] Mar 17 01:12:01 crc kubenswrapper[4755]: W0317 01:12:01.143860 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e1dcd54_6ffe_43bb_8676_fc0b90687777.slice/crio-bbb3112fc4ad5328a0684d56dd51abb176411a980a3c5b870b994eba92fdc33c WatchSource:0}: Error finding container bbb3112fc4ad5328a0684d56dd51abb176411a980a3c5b870b994eba92fdc33c: Status 404 returned error can't find the container with id bbb3112fc4ad5328a0684d56dd51abb176411a980a3c5b870b994eba92fdc33c Mar 17 01:12:01 crc kubenswrapper[4755]: I0317 01:12:01.567460 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561832-vd84x" event={"ID":"5e1dcd54-6ffe-43bb-8676-fc0b90687777","Type":"ContainerStarted","Data":"bbb3112fc4ad5328a0684d56dd51abb176411a980a3c5b870b994eba92fdc33c"} Mar 17 01:12:03 crc kubenswrapper[4755]: I0317 01:12:03.592918 4755 generic.go:334] 
"Generic (PLEG): container finished" podID="5e1dcd54-6ffe-43bb-8676-fc0b90687777" containerID="7ff0428730410957083750a6674ff851d2d4e0e7fb5eaee1188b2154838b0dc8" exitCode=0 Mar 17 01:12:03 crc kubenswrapper[4755]: I0317 01:12:03.592985 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561832-vd84x" event={"ID":"5e1dcd54-6ffe-43bb-8676-fc0b90687777","Type":"ContainerDied","Data":"7ff0428730410957083750a6674ff851d2d4e0e7fb5eaee1188b2154838b0dc8"} Mar 17 01:12:05 crc kubenswrapper[4755]: I0317 01:12:05.045156 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561832-vd84x" Mar 17 01:12:05 crc kubenswrapper[4755]: I0317 01:12:05.136224 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcnfs\" (UniqueName: \"kubernetes.io/projected/5e1dcd54-6ffe-43bb-8676-fc0b90687777-kube-api-access-vcnfs\") pod \"5e1dcd54-6ffe-43bb-8676-fc0b90687777\" (UID: \"5e1dcd54-6ffe-43bb-8676-fc0b90687777\") " Mar 17 01:12:05 crc kubenswrapper[4755]: I0317 01:12:05.154371 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e1dcd54-6ffe-43bb-8676-fc0b90687777-kube-api-access-vcnfs" (OuterVolumeSpecName: "kube-api-access-vcnfs") pod "5e1dcd54-6ffe-43bb-8676-fc0b90687777" (UID: "5e1dcd54-6ffe-43bb-8676-fc0b90687777"). InnerVolumeSpecName "kube-api-access-vcnfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:12:05 crc kubenswrapper[4755]: I0317 01:12:05.239323 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcnfs\" (UniqueName: \"kubernetes.io/projected/5e1dcd54-6ffe-43bb-8676-fc0b90687777-kube-api-access-vcnfs\") on node \"crc\" DevicePath \"\"" Mar 17 01:12:05 crc kubenswrapper[4755]: I0317 01:12:05.614041 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561832-vd84x" event={"ID":"5e1dcd54-6ffe-43bb-8676-fc0b90687777","Type":"ContainerDied","Data":"bbb3112fc4ad5328a0684d56dd51abb176411a980a3c5b870b994eba92fdc33c"} Mar 17 01:12:05 crc kubenswrapper[4755]: I0317 01:12:05.614082 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbb3112fc4ad5328a0684d56dd51abb176411a980a3c5b870b994eba92fdc33c" Mar 17 01:12:05 crc kubenswrapper[4755]: I0317 01:12:05.614088 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561832-vd84x" Mar 17 01:12:06 crc kubenswrapper[4755]: I0317 01:12:06.145483 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561826-95sml"] Mar 17 01:12:06 crc kubenswrapper[4755]: I0317 01:12:06.157412 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561826-95sml"] Mar 17 01:12:06 crc kubenswrapper[4755]: I0317 01:12:06.273911 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75040ebd-e84b-49ff-8616-3ee8c3d34332" path="/var/lib/kubelet/pods/75040ebd-e84b-49ff-8616-3ee8c3d34332/volumes" Mar 17 01:12:28 crc kubenswrapper[4755]: I0317 01:12:28.664928 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 17 01:12:28 crc kubenswrapper[4755]: I0317 01:12:28.665611 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:12:58 crc kubenswrapper[4755]: I0317 01:12:58.665789 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:12:58 crc kubenswrapper[4755]: I0317 01:12:58.666554 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:13:00 crc kubenswrapper[4755]: I0317 01:13:00.974019 4755 scope.go:117] "RemoveContainer" containerID="934313b54431c69157e2d7067ed74f2af94715498e892f11f6ffbc0f0493e01a" Mar 17 01:13:13 crc kubenswrapper[4755]: I0317 01:13:13.804400 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s6jmc"] Mar 17 01:13:13 crc kubenswrapper[4755]: E0317 01:13:13.806110 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e1dcd54-6ffe-43bb-8676-fc0b90687777" containerName="oc" Mar 17 01:13:13 crc kubenswrapper[4755]: I0317 01:13:13.806136 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e1dcd54-6ffe-43bb-8676-fc0b90687777" containerName="oc" Mar 17 01:13:13 crc kubenswrapper[4755]: I0317 01:13:13.806521 4755 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5e1dcd54-6ffe-43bb-8676-fc0b90687777" containerName="oc" Mar 17 01:13:13 crc kubenswrapper[4755]: I0317 01:13:13.809136 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s6jmc" Mar 17 01:13:13 crc kubenswrapper[4755]: I0317 01:13:13.834300 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s6jmc"] Mar 17 01:13:13 crc kubenswrapper[4755]: I0317 01:13:13.900614 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9sdp\" (UniqueName: \"kubernetes.io/projected/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-kube-api-access-c9sdp\") pod \"community-operators-s6jmc\" (UID: \"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f\") " pod="openshift-marketplace/community-operators-s6jmc" Mar 17 01:13:13 crc kubenswrapper[4755]: I0317 01:13:13.902557 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-catalog-content\") pod \"community-operators-s6jmc\" (UID: \"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f\") " pod="openshift-marketplace/community-operators-s6jmc" Mar 17 01:13:13 crc kubenswrapper[4755]: I0317 01:13:13.902689 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-utilities\") pod \"community-operators-s6jmc\" (UID: \"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f\") " pod="openshift-marketplace/community-operators-s6jmc" Mar 17 01:13:14 crc kubenswrapper[4755]: I0317 01:13:14.005068 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-catalog-content\") pod 
\"community-operators-s6jmc\" (UID: \"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f\") " pod="openshift-marketplace/community-operators-s6jmc" Mar 17 01:13:14 crc kubenswrapper[4755]: I0317 01:13:14.005208 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-utilities\") pod \"community-operators-s6jmc\" (UID: \"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f\") " pod="openshift-marketplace/community-operators-s6jmc" Mar 17 01:13:14 crc kubenswrapper[4755]: I0317 01:13:14.005376 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9sdp\" (UniqueName: \"kubernetes.io/projected/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-kube-api-access-c9sdp\") pod \"community-operators-s6jmc\" (UID: \"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f\") " pod="openshift-marketplace/community-operators-s6jmc" Mar 17 01:13:14 crc kubenswrapper[4755]: I0317 01:13:14.006369 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-catalog-content\") pod \"community-operators-s6jmc\" (UID: \"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f\") " pod="openshift-marketplace/community-operators-s6jmc" Mar 17 01:13:14 crc kubenswrapper[4755]: I0317 01:13:14.006647 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-utilities\") pod \"community-operators-s6jmc\" (UID: \"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f\") " pod="openshift-marketplace/community-operators-s6jmc" Mar 17 01:13:14 crc kubenswrapper[4755]: I0317 01:13:14.026241 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9sdp\" (UniqueName: \"kubernetes.io/projected/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-kube-api-access-c9sdp\") pod \"community-operators-s6jmc\" (UID: 
\"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f\") " pod="openshift-marketplace/community-operators-s6jmc" Mar 17 01:13:14 crc kubenswrapper[4755]: I0317 01:13:14.147197 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s6jmc" Mar 17 01:13:14 crc kubenswrapper[4755]: I0317 01:13:14.644776 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s6jmc"] Mar 17 01:13:14 crc kubenswrapper[4755]: W0317 01:13:14.649822 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0ccf5f4_ea17_49a3_8fcd_0a11eee6944f.slice/crio-60175e4337afe55cc697f7bac540e35b31076acfcae407acb4caea5aa2c86d43 WatchSource:0}: Error finding container 60175e4337afe55cc697f7bac540e35b31076acfcae407acb4caea5aa2c86d43: Status 404 returned error can't find the container with id 60175e4337afe55cc697f7bac540e35b31076acfcae407acb4caea5aa2c86d43 Mar 17 01:13:15 crc kubenswrapper[4755]: I0317 01:13:15.570885 4755 generic.go:334] "Generic (PLEG): container finished" podID="c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f" containerID="ed8f1dcdd29641221ad1d0a60951dc4c84c2c031bed11861f32ae4c1ba9d4654" exitCode=0 Mar 17 01:13:15 crc kubenswrapper[4755]: I0317 01:13:15.570978 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6jmc" event={"ID":"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f","Type":"ContainerDied","Data":"ed8f1dcdd29641221ad1d0a60951dc4c84c2c031bed11861f32ae4c1ba9d4654"} Mar 17 01:13:15 crc kubenswrapper[4755]: I0317 01:13:15.571354 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6jmc" event={"ID":"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f","Type":"ContainerStarted","Data":"60175e4337afe55cc697f7bac540e35b31076acfcae407acb4caea5aa2c86d43"} Mar 17 01:13:15 crc kubenswrapper[4755]: I0317 01:13:15.572999 4755 provider.go:102] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:13:16 crc kubenswrapper[4755]: I0317 01:13:16.590622 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6jmc" event={"ID":"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f","Type":"ContainerStarted","Data":"3938ff04123af62b73a72aeae630b9db1a58ae3f43e83749fe1025d3897b966b"} Mar 17 01:13:18 crc kubenswrapper[4755]: I0317 01:13:18.627064 4755 generic.go:334] "Generic (PLEG): container finished" podID="c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f" containerID="3938ff04123af62b73a72aeae630b9db1a58ae3f43e83749fe1025d3897b966b" exitCode=0 Mar 17 01:13:18 crc kubenswrapper[4755]: I0317 01:13:18.627110 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6jmc" event={"ID":"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f","Type":"ContainerDied","Data":"3938ff04123af62b73a72aeae630b9db1a58ae3f43e83749fe1025d3897b966b"} Mar 17 01:13:19 crc kubenswrapper[4755]: I0317 01:13:19.643823 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6jmc" event={"ID":"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f","Type":"ContainerStarted","Data":"766a3e5a88009d49c321e4cb8c905cce65e1fee6c6f8c5b6c28e0e5f5708d7e1"} Mar 17 01:13:19 crc kubenswrapper[4755]: I0317 01:13:19.685325 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s6jmc" podStartSLOduration=2.926198535 podStartE2EDuration="6.685301058s" podCreationTimestamp="2026-03-17 01:13:13 +0000 UTC" firstStartedPulling="2026-03-17 01:13:15.57270533 +0000 UTC m=+3070.332157613" lastFinishedPulling="2026-03-17 01:13:19.331807813 +0000 UTC m=+3074.091260136" observedRunningTime="2026-03-17 01:13:19.676481848 +0000 UTC m=+3074.435934141" watchObservedRunningTime="2026-03-17 01:13:19.685301058 +0000 UTC m=+3074.444753351" Mar 17 01:13:20 crc kubenswrapper[4755]: I0317 01:13:20.590703 4755 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-chf29"] Mar 17 01:13:20 crc kubenswrapper[4755]: I0317 01:13:20.594685 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-chf29" Mar 17 01:13:20 crc kubenswrapper[4755]: I0317 01:13:20.607787 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-chf29"] Mar 17 01:13:20 crc kubenswrapper[4755]: I0317 01:13:20.992372 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1add56c9-cfd8-4fa3-b532-e4b952f36683-catalog-content\") pod \"redhat-operators-chf29\" (UID: \"1add56c9-cfd8-4fa3-b532-e4b952f36683\") " pod="openshift-marketplace/redhat-operators-chf29" Mar 17 01:13:20 crc kubenswrapper[4755]: I0317 01:13:20.992702 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfpgj\" (UniqueName: \"kubernetes.io/projected/1add56c9-cfd8-4fa3-b532-e4b952f36683-kube-api-access-lfpgj\") pod \"redhat-operators-chf29\" (UID: \"1add56c9-cfd8-4fa3-b532-e4b952f36683\") " pod="openshift-marketplace/redhat-operators-chf29" Mar 17 01:13:20 crc kubenswrapper[4755]: I0317 01:13:20.992790 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1add56c9-cfd8-4fa3-b532-e4b952f36683-utilities\") pod \"redhat-operators-chf29\" (UID: \"1add56c9-cfd8-4fa3-b532-e4b952f36683\") " pod="openshift-marketplace/redhat-operators-chf29" Mar 17 01:13:21 crc kubenswrapper[4755]: I0317 01:13:21.093993 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfpgj\" (UniqueName: \"kubernetes.io/projected/1add56c9-cfd8-4fa3-b532-e4b952f36683-kube-api-access-lfpgj\") pod \"redhat-operators-chf29\" (UID: 
\"1add56c9-cfd8-4fa3-b532-e4b952f36683\") " pod="openshift-marketplace/redhat-operators-chf29" Mar 17 01:13:21 crc kubenswrapper[4755]: I0317 01:13:21.094154 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1add56c9-cfd8-4fa3-b532-e4b952f36683-utilities\") pod \"redhat-operators-chf29\" (UID: \"1add56c9-cfd8-4fa3-b532-e4b952f36683\") " pod="openshift-marketplace/redhat-operators-chf29" Mar 17 01:13:21 crc kubenswrapper[4755]: I0317 01:13:21.094388 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1add56c9-cfd8-4fa3-b532-e4b952f36683-catalog-content\") pod \"redhat-operators-chf29\" (UID: \"1add56c9-cfd8-4fa3-b532-e4b952f36683\") " pod="openshift-marketplace/redhat-operators-chf29" Mar 17 01:13:21 crc kubenswrapper[4755]: I0317 01:13:21.094680 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1add56c9-cfd8-4fa3-b532-e4b952f36683-utilities\") pod \"redhat-operators-chf29\" (UID: \"1add56c9-cfd8-4fa3-b532-e4b952f36683\") " pod="openshift-marketplace/redhat-operators-chf29" Mar 17 01:13:21 crc kubenswrapper[4755]: I0317 01:13:21.094905 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1add56c9-cfd8-4fa3-b532-e4b952f36683-catalog-content\") pod \"redhat-operators-chf29\" (UID: \"1add56c9-cfd8-4fa3-b532-e4b952f36683\") " pod="openshift-marketplace/redhat-operators-chf29" Mar 17 01:13:21 crc kubenswrapper[4755]: I0317 01:13:21.115430 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfpgj\" (UniqueName: \"kubernetes.io/projected/1add56c9-cfd8-4fa3-b532-e4b952f36683-kube-api-access-lfpgj\") pod \"redhat-operators-chf29\" (UID: \"1add56c9-cfd8-4fa3-b532-e4b952f36683\") " 
pod="openshift-marketplace/redhat-operators-chf29" Mar 17 01:13:21 crc kubenswrapper[4755]: I0317 01:13:21.231822 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-chf29" Mar 17 01:13:21 crc kubenswrapper[4755]: I0317 01:13:21.730224 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-chf29"] Mar 17 01:13:22 crc kubenswrapper[4755]: I0317 01:13:22.703738 4755 generic.go:334] "Generic (PLEG): container finished" podID="1add56c9-cfd8-4fa3-b532-e4b952f36683" containerID="177401fd532e19ae9c1f93a36d324a62acb8eb9638da12f8ffb05c41ef7be30f" exitCode=0 Mar 17 01:13:22 crc kubenswrapper[4755]: I0317 01:13:22.703811 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-chf29" event={"ID":"1add56c9-cfd8-4fa3-b532-e4b952f36683","Type":"ContainerDied","Data":"177401fd532e19ae9c1f93a36d324a62acb8eb9638da12f8ffb05c41ef7be30f"} Mar 17 01:13:22 crc kubenswrapper[4755]: I0317 01:13:22.704106 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-chf29" event={"ID":"1add56c9-cfd8-4fa3-b532-e4b952f36683","Type":"ContainerStarted","Data":"5a5fe55e03ae993dd62a1ef384c375a34f66befbdd40943338ab2f594e9a4c54"} Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.148212 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s6jmc" Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.148858 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s6jmc" Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.204238 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s6jmc" Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.658401 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.694179 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.717522 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.727847 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.737314 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.746049 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.754906 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.764574 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2dm92"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.771196 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.774920 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s6jmc" Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.782105 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 
01:13:24.791871 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-gzrxs"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.802474 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t5xqn"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.810103 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.817777 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-npjh8"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.825595 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.832852 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.840006 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.847725 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.856556 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xz89w"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.865174 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.873261 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49bxz"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.881214 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5lm87"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.888937 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pc8zp"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.896473 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xkzcs"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.903927 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2sx2l"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.911847 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-nngr2"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.919871 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5lm87"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.930310 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-pxvwk"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.937904 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xwws"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.947332 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xzw5g"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.956233 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lnzl9"] Mar 17 01:13:24 crc kubenswrapper[4755]: I0317 01:13:24.966741 4755 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5sfwb"] Mar 17 01:13:25 crc kubenswrapper[4755]: I0317 01:13:25.359327 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s6jmc"] Mar 17 01:13:26 crc kubenswrapper[4755]: I0317 01:13:26.261064 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b682c2e-a6e2-478b-9679-5c2aaf416857" path="/var/lib/kubelet/pods/1b682c2e-a6e2-478b-9679-5c2aaf416857/volumes" Mar 17 01:13:26 crc kubenswrapper[4755]: I0317 01:13:26.262538 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="407f6cc9-ea7c-455d-90bb-78266b1e6783" path="/var/lib/kubelet/pods/407f6cc9-ea7c-455d-90bb-78266b1e6783/volumes" Mar 17 01:13:26 crc kubenswrapper[4755]: I0317 01:13:26.263108 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e40f739-2496-4cd1-9d10-ecc61d250a1f" path="/var/lib/kubelet/pods/4e40f739-2496-4cd1-9d10-ecc61d250a1f/volumes" Mar 17 01:13:26 crc kubenswrapper[4755]: I0317 01:13:26.264323 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e8b58f4-3072-450a-afae-2d18d9f34848" path="/var/lib/kubelet/pods/5e8b58f4-3072-450a-afae-2d18d9f34848/volumes" Mar 17 01:13:26 crc kubenswrapper[4755]: I0317 01:13:26.265065 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6294dd29-7825-45f8-a9e2-f72f0b56ead8" path="/var/lib/kubelet/pods/6294dd29-7825-45f8-a9e2-f72f0b56ead8/volumes" Mar 17 01:13:26 crc kubenswrapper[4755]: I0317 01:13:26.265756 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="633d8f49-d4ce-475d-841e-c5ca7261f61a" path="/var/lib/kubelet/pods/633d8f49-d4ce-475d-841e-c5ca7261f61a/volumes" Mar 17 01:13:26 crc kubenswrapper[4755]: I0317 01:13:26.266401 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77388131-a3d3-451a-935d-2181b1fe1216" 
path="/var/lib/kubelet/pods/77388131-a3d3-451a-935d-2181b1fe1216/volumes" Mar 17 01:13:26 crc kubenswrapper[4755]: I0317 01:13:26.268231 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c" path="/var/lib/kubelet/pods/7bcbd6f4-73f7-4c3a-8fb1-f3727d9bd57c/volumes" Mar 17 01:13:26 crc kubenswrapper[4755]: I0317 01:13:26.268871 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="863b8e83-25d4-4ca9-b2c8-f4cea5ec3287" path="/var/lib/kubelet/pods/863b8e83-25d4-4ca9-b2c8-f4cea5ec3287/volumes" Mar 17 01:13:26 crc kubenswrapper[4755]: I0317 01:13:26.269586 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88469a2b-b923-4a0c-ade2-fe4649316da7" path="/var/lib/kubelet/pods/88469a2b-b923-4a0c-ade2-fe4649316da7/volumes" Mar 17 01:13:26 crc kubenswrapper[4755]: I0317 01:13:26.270665 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a218d46-6cdf-4d9e-831a-9d2af4051dcf" path="/var/lib/kubelet/pods/8a218d46-6cdf-4d9e-831a-9d2af4051dcf/volumes" Mar 17 01:13:26 crc kubenswrapper[4755]: I0317 01:13:26.271240 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9" path="/var/lib/kubelet/pods/93e9f36b-6ca4-446c-b9a7-ad0dcfb286d9/volumes" Mar 17 01:13:26 crc kubenswrapper[4755]: I0317 01:13:26.271821 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e18a331f-c8bb-46a9-ae90-38ffc6104a4d" path="/var/lib/kubelet/pods/e18a331f-c8bb-46a9-ae90-38ffc6104a4d/volumes" Mar 17 01:13:26 crc kubenswrapper[4755]: I0317 01:13:26.272384 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e404edba-8f9a-4d92-a62c-afbc8fe269b5" path="/var/lib/kubelet/pods/e404edba-8f9a-4d92-a62c-afbc8fe269b5/volumes" Mar 17 01:13:26 crc kubenswrapper[4755]: I0317 01:13:26.273413 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eecc6a07-bab0-487c-8948-3ed46324a72f" 
path="/var/lib/kubelet/pods/eecc6a07-bab0-487c-8948-3ed46324a72f/volumes" Mar 17 01:13:26 crc kubenswrapper[4755]: I0317 01:13:26.274156 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffc27979-4c48-4516-a6ab-f78e066dcc17" path="/var/lib/kubelet/pods/ffc27979-4c48-4516-a6ab-f78e066dcc17/volumes" Mar 17 01:13:26 crc kubenswrapper[4755]: I0317 01:13:26.758554 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s6jmc" podUID="c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f" containerName="registry-server" containerID="cri-o://766a3e5a88009d49c321e4cb8c905cce65e1fee6c6f8c5b6c28e0e5f5708d7e1" gracePeriod=2 Mar 17 01:13:27 crc kubenswrapper[4755]: I0317 01:13:27.245256 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s6jmc" Mar 17 01:13:27 crc kubenswrapper[4755]: I0317 01:13:27.362331 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-utilities\") pod \"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f\" (UID: \"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f\") " Mar 17 01:13:27 crc kubenswrapper[4755]: I0317 01:13:27.362515 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-catalog-content\") pod \"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f\" (UID: \"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f\") " Mar 17 01:13:27 crc kubenswrapper[4755]: I0317 01:13:27.362817 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9sdp\" (UniqueName: \"kubernetes.io/projected/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-kube-api-access-c9sdp\") pod \"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f\" (UID: \"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f\") " Mar 17 01:13:27 crc 
kubenswrapper[4755]: I0317 01:13:27.363529 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-utilities" (OuterVolumeSpecName: "utilities") pod "c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f" (UID: "c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:13:27 crc kubenswrapper[4755]: I0317 01:13:27.364862 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:27 crc kubenswrapper[4755]: I0317 01:13:27.377959 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-kube-api-access-c9sdp" (OuterVolumeSpecName: "kube-api-access-c9sdp") pod "c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f" (UID: "c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f"). InnerVolumeSpecName "kube-api-access-c9sdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:13:27 crc kubenswrapper[4755]: I0317 01:13:27.452353 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f" (UID: "c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:13:27 crc kubenswrapper[4755]: I0317 01:13:27.467190 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9sdp\" (UniqueName: \"kubernetes.io/projected/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-kube-api-access-c9sdp\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:27 crc kubenswrapper[4755]: I0317 01:13:27.467220 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:27 crc kubenswrapper[4755]: I0317 01:13:27.775308 4755 generic.go:334] "Generic (PLEG): container finished" podID="c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f" containerID="766a3e5a88009d49c321e4cb8c905cce65e1fee6c6f8c5b6c28e0e5f5708d7e1" exitCode=0 Mar 17 01:13:27 crc kubenswrapper[4755]: I0317 01:13:27.775359 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6jmc" event={"ID":"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f","Type":"ContainerDied","Data":"766a3e5a88009d49c321e4cb8c905cce65e1fee6c6f8c5b6c28e0e5f5708d7e1"} Mar 17 01:13:27 crc kubenswrapper[4755]: I0317 01:13:27.775726 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6jmc" event={"ID":"c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f","Type":"ContainerDied","Data":"60175e4337afe55cc697f7bac540e35b31076acfcae407acb4caea5aa2c86d43"} Mar 17 01:13:27 crc kubenswrapper[4755]: I0317 01:13:27.775750 4755 scope.go:117] "RemoveContainer" containerID="766a3e5a88009d49c321e4cb8c905cce65e1fee6c6f8c5b6c28e0e5f5708d7e1" Mar 17 01:13:27 crc kubenswrapper[4755]: I0317 01:13:27.775427 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s6jmc" Mar 17 01:13:27 crc kubenswrapper[4755]: I0317 01:13:27.811781 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s6jmc"] Mar 17 01:13:27 crc kubenswrapper[4755]: I0317 01:13:27.821273 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s6jmc"] Mar 17 01:13:28 crc kubenswrapper[4755]: I0317 01:13:28.276100 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f" path="/var/lib/kubelet/pods/c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f/volumes" Mar 17 01:13:28 crc kubenswrapper[4755]: I0317 01:13:28.665711 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:13:28 crc kubenswrapper[4755]: I0317 01:13:28.665789 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:13:28 crc kubenswrapper[4755]: I0317 01:13:28.665867 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 01:13:28 crc kubenswrapper[4755]: I0317 01:13:28.667096 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:13:28 crc kubenswrapper[4755]: I0317 01:13:28.667212 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" gracePeriod=600 Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.113361 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz"] Mar 17 01:13:29 crc kubenswrapper[4755]: E0317 01:13:29.114029 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f" containerName="extract-utilities" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.114042 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f" containerName="extract-utilities" Mar 17 01:13:29 crc kubenswrapper[4755]: E0317 01:13:29.114069 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f" containerName="extract-content" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.114075 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f" containerName="extract-content" Mar 17 01:13:29 crc kubenswrapper[4755]: E0317 01:13:29.114093 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f" containerName="registry-server" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.114098 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f" containerName="registry-server" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.114287 4755 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c0ccf5f4-ea17-49a3-8fcd-0a11eee6944f" containerName="registry-server" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.115054 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.117026 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.117963 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.118131 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.118305 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.118807 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.123487 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz"] Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.204350 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.204618 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.204734 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.204822 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.204926 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjfmq\" (UniqueName: \"kubernetes.io/projected/6f2a3043-b45c-43ea-a6fa-de300dee0390-kube-api-access-fjfmq\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.306345 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjfmq\" (UniqueName: \"kubernetes.io/projected/6f2a3043-b45c-43ea-a6fa-de300dee0390-kube-api-access-fjfmq\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz\" (UID: 
\"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.306424 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.306507 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.306542 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.306600 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.312770 4755 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.313192 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.322716 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.325986 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.333828 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjfmq\" (UniqueName: \"kubernetes.io/projected/6f2a3043-b45c-43ea-a6fa-de300dee0390-kube-api-access-fjfmq\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.442467 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.805854 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" exitCode=0 Mar 17 01:13:29 crc kubenswrapper[4755]: I0317 01:13:29.805939 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a"} Mar 17 01:13:33 crc kubenswrapper[4755]: I0317 01:13:33.790151 4755 scope.go:117] "RemoveContainer" containerID="3938ff04123af62b73a72aeae630b9db1a58ae3f43e83749fe1025d3897b966b" Mar 17 01:13:33 crc kubenswrapper[4755]: E0317 01:13:33.884328 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:13:33 crc kubenswrapper[4755]: I0317 01:13:33.993560 4755 scope.go:117] "RemoveContainer" containerID="ed8f1dcdd29641221ad1d0a60951dc4c84c2c031bed11861f32ae4c1ba9d4654" Mar 17 01:13:34 crc kubenswrapper[4755]: I0317 01:13:34.045068 4755 scope.go:117] "RemoveContainer" containerID="766a3e5a88009d49c321e4cb8c905cce65e1fee6c6f8c5b6c28e0e5f5708d7e1" Mar 17 01:13:34 crc kubenswrapper[4755]: E0317 01:13:34.045510 4755 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"766a3e5a88009d49c321e4cb8c905cce65e1fee6c6f8c5b6c28e0e5f5708d7e1\": container with ID starting with 766a3e5a88009d49c321e4cb8c905cce65e1fee6c6f8c5b6c28e0e5f5708d7e1 not found: ID does not exist" containerID="766a3e5a88009d49c321e4cb8c905cce65e1fee6c6f8c5b6c28e0e5f5708d7e1" Mar 17 01:13:34 crc kubenswrapper[4755]: I0317 01:13:34.045567 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"766a3e5a88009d49c321e4cb8c905cce65e1fee6c6f8c5b6c28e0e5f5708d7e1"} err="failed to get container status \"766a3e5a88009d49c321e4cb8c905cce65e1fee6c6f8c5b6c28e0e5f5708d7e1\": rpc error: code = NotFound desc = could not find container \"766a3e5a88009d49c321e4cb8c905cce65e1fee6c6f8c5b6c28e0e5f5708d7e1\": container with ID starting with 766a3e5a88009d49c321e4cb8c905cce65e1fee6c6f8c5b6c28e0e5f5708d7e1 not found: ID does not exist" Mar 17 01:13:34 crc kubenswrapper[4755]: I0317 01:13:34.045600 4755 scope.go:117] "RemoveContainer" containerID="3938ff04123af62b73a72aeae630b9db1a58ae3f43e83749fe1025d3897b966b" Mar 17 01:13:34 crc kubenswrapper[4755]: E0317 01:13:34.048380 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3938ff04123af62b73a72aeae630b9db1a58ae3f43e83749fe1025d3897b966b\": container with ID starting with 3938ff04123af62b73a72aeae630b9db1a58ae3f43e83749fe1025d3897b966b not found: ID does not exist" containerID="3938ff04123af62b73a72aeae630b9db1a58ae3f43e83749fe1025d3897b966b" Mar 17 01:13:34 crc kubenswrapper[4755]: I0317 01:13:34.048422 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3938ff04123af62b73a72aeae630b9db1a58ae3f43e83749fe1025d3897b966b"} err="failed to get container status \"3938ff04123af62b73a72aeae630b9db1a58ae3f43e83749fe1025d3897b966b\": rpc error: code = NotFound desc = could not find container 
\"3938ff04123af62b73a72aeae630b9db1a58ae3f43e83749fe1025d3897b966b\": container with ID starting with 3938ff04123af62b73a72aeae630b9db1a58ae3f43e83749fe1025d3897b966b not found: ID does not exist" Mar 17 01:13:34 crc kubenswrapper[4755]: I0317 01:13:34.048493 4755 scope.go:117] "RemoveContainer" containerID="ed8f1dcdd29641221ad1d0a60951dc4c84c2c031bed11861f32ae4c1ba9d4654" Mar 17 01:13:34 crc kubenswrapper[4755]: E0317 01:13:34.050057 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed8f1dcdd29641221ad1d0a60951dc4c84c2c031bed11861f32ae4c1ba9d4654\": container with ID starting with ed8f1dcdd29641221ad1d0a60951dc4c84c2c031bed11861f32ae4c1ba9d4654 not found: ID does not exist" containerID="ed8f1dcdd29641221ad1d0a60951dc4c84c2c031bed11861f32ae4c1ba9d4654" Mar 17 01:13:34 crc kubenswrapper[4755]: I0317 01:13:34.050094 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed8f1dcdd29641221ad1d0a60951dc4c84c2c031bed11861f32ae4c1ba9d4654"} err="failed to get container status \"ed8f1dcdd29641221ad1d0a60951dc4c84c2c031bed11861f32ae4c1ba9d4654\": rpc error: code = NotFound desc = could not find container \"ed8f1dcdd29641221ad1d0a60951dc4c84c2c031bed11861f32ae4c1ba9d4654\": container with ID starting with ed8f1dcdd29641221ad1d0a60951dc4c84c2c031bed11861f32ae4c1ba9d4654 not found: ID does not exist" Mar 17 01:13:34 crc kubenswrapper[4755]: I0317 01:13:34.050123 4755 scope.go:117] "RemoveContainer" containerID="724d2ea8ccfc68fca06c3de8383346b840d2506d8f185a3e871d51ca50c11915" Mar 17 01:13:34 crc kubenswrapper[4755]: I0317 01:13:34.430512 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz"] Mar 17 01:13:34 crc kubenswrapper[4755]: W0317 01:13:34.430663 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f2a3043_b45c_43ea_a6fa_de300dee0390.slice/crio-83e1cedc5893337972553ddb6a80c737d332c6c29d6db52d2d378deee63f4886 WatchSource:0}: Error finding container 83e1cedc5893337972553ddb6a80c737d332c6c29d6db52d2d378deee63f4886: Status 404 returned error can't find the container with id 83e1cedc5893337972553ddb6a80c737d332c6c29d6db52d2d378deee63f4886 Mar 17 01:13:34 crc kubenswrapper[4755]: I0317 01:13:34.876026 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-chf29" event={"ID":"1add56c9-cfd8-4fa3-b532-e4b952f36683","Type":"ContainerStarted","Data":"94e073dfcf094578baf110dd7dda38a84f989163635efe0696633c14d85eb6a7"} Mar 17 01:13:34 crc kubenswrapper[4755]: I0317 01:13:34.881941 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:13:34 crc kubenswrapper[4755]: E0317 01:13:34.882193 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:13:34 crc kubenswrapper[4755]: I0317 01:13:34.883794 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" event={"ID":"6f2a3043-b45c-43ea-a6fa-de300dee0390","Type":"ContainerStarted","Data":"83e1cedc5893337972553ddb6a80c737d332c6c29d6db52d2d378deee63f4886"} Mar 17 01:13:36 crc kubenswrapper[4755]: I0317 01:13:36.912767 4755 generic.go:334] "Generic (PLEG): container finished" podID="1add56c9-cfd8-4fa3-b532-e4b952f36683" containerID="94e073dfcf094578baf110dd7dda38a84f989163635efe0696633c14d85eb6a7" 
exitCode=0 Mar 17 01:13:36 crc kubenswrapper[4755]: I0317 01:13:36.912854 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-chf29" event={"ID":"1add56c9-cfd8-4fa3-b532-e4b952f36683","Type":"ContainerDied","Data":"94e073dfcf094578baf110dd7dda38a84f989163635efe0696633c14d85eb6a7"} Mar 17 01:13:36 crc kubenswrapper[4755]: I0317 01:13:36.918120 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" event={"ID":"6f2a3043-b45c-43ea-a6fa-de300dee0390","Type":"ContainerStarted","Data":"9c4d6d71cd0d20931746445664d2f3c933c05e3f1f4a5a2430b0a9bbffa32cb4"} Mar 17 01:13:36 crc kubenswrapper[4755]: I0317 01:13:36.988511 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" podStartSLOduration=6.802774502 podStartE2EDuration="7.988470056s" podCreationTimestamp="2026-03-17 01:13:29 +0000 UTC" firstStartedPulling="2026-03-17 01:13:34.433839828 +0000 UTC m=+3089.193292151" lastFinishedPulling="2026-03-17 01:13:35.619535412 +0000 UTC m=+3090.378987705" observedRunningTime="2026-03-17 01:13:36.969659674 +0000 UTC m=+3091.729111997" watchObservedRunningTime="2026-03-17 01:13:36.988470056 +0000 UTC m=+3091.747922389" Mar 17 01:13:37 crc kubenswrapper[4755]: I0317 01:13:37.936675 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-chf29" event={"ID":"1add56c9-cfd8-4fa3-b532-e4b952f36683","Type":"ContainerStarted","Data":"6ff55b93f6e623168d3524e9d50099337ee2ff1836705dda3a93403af1e4d629"} Mar 17 01:13:37 crc kubenswrapper[4755]: I0317 01:13:37.959743 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-chf29" podStartSLOduration=3.172961595 podStartE2EDuration="17.959719454s" podCreationTimestamp="2026-03-17 01:13:20 +0000 UTC" firstStartedPulling="2026-03-17 01:13:22.706267825 
+0000 UTC m=+3077.465720138" lastFinishedPulling="2026-03-17 01:13:37.493025684 +0000 UTC m=+3092.252477997" observedRunningTime="2026-03-17 01:13:37.957040101 +0000 UTC m=+3092.716492414" watchObservedRunningTime="2026-03-17 01:13:37.959719454 +0000 UTC m=+3092.719171747" Mar 17 01:13:41 crc kubenswrapper[4755]: I0317 01:13:41.232389 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-chf29" Mar 17 01:13:41 crc kubenswrapper[4755]: I0317 01:13:41.232773 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-chf29" Mar 17 01:13:42 crc kubenswrapper[4755]: I0317 01:13:42.284729 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-chf29" podUID="1add56c9-cfd8-4fa3-b532-e4b952f36683" containerName="registry-server" probeResult="failure" output=< Mar 17 01:13:42 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 01:13:42 crc kubenswrapper[4755]: > Mar 17 01:13:48 crc kubenswrapper[4755]: I0317 01:13:48.067280 4755 generic.go:334] "Generic (PLEG): container finished" podID="6f2a3043-b45c-43ea-a6fa-de300dee0390" containerID="9c4d6d71cd0d20931746445664d2f3c933c05e3f1f4a5a2430b0a9bbffa32cb4" exitCode=0 Mar 17 01:13:48 crc kubenswrapper[4755]: I0317 01:13:48.067502 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" event={"ID":"6f2a3043-b45c-43ea-a6fa-de300dee0390","Type":"ContainerDied","Data":"9c4d6d71cd0d20931746445664d2f3c933c05e3f1f4a5a2430b0a9bbffa32cb4"} Mar 17 01:13:49 crc kubenswrapper[4755]: I0317 01:13:49.248378 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:13:49 crc kubenswrapper[4755]: E0317 01:13:49.248931 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:13:49 crc kubenswrapper[4755]: I0317 01:13:49.594541 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:49 crc kubenswrapper[4755]: I0317 01:13:49.690293 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-ssh-key-openstack-edpm-ipam\") pod \"6f2a3043-b45c-43ea-a6fa-de300dee0390\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " Mar 17 01:13:49 crc kubenswrapper[4755]: I0317 01:13:49.690802 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-repo-setup-combined-ca-bundle\") pod \"6f2a3043-b45c-43ea-a6fa-de300dee0390\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " Mar 17 01:13:49 crc kubenswrapper[4755]: I0317 01:13:49.691139 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjfmq\" (UniqueName: \"kubernetes.io/projected/6f2a3043-b45c-43ea-a6fa-de300dee0390-kube-api-access-fjfmq\") pod \"6f2a3043-b45c-43ea-a6fa-de300dee0390\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " Mar 17 01:13:49 crc kubenswrapper[4755]: I0317 01:13:49.691417 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-inventory\") pod \"6f2a3043-b45c-43ea-a6fa-de300dee0390\" (UID: 
\"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " Mar 17 01:13:49 crc kubenswrapper[4755]: I0317 01:13:49.691744 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-ceph\") pod \"6f2a3043-b45c-43ea-a6fa-de300dee0390\" (UID: \"6f2a3043-b45c-43ea-a6fa-de300dee0390\") " Mar 17 01:13:49 crc kubenswrapper[4755]: I0317 01:13:49.697669 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-ceph" (OuterVolumeSpecName: "ceph") pod "6f2a3043-b45c-43ea-a6fa-de300dee0390" (UID: "6f2a3043-b45c-43ea-a6fa-de300dee0390"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:13:49 crc kubenswrapper[4755]: I0317 01:13:49.698082 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f2a3043-b45c-43ea-a6fa-de300dee0390-kube-api-access-fjfmq" (OuterVolumeSpecName: "kube-api-access-fjfmq") pod "6f2a3043-b45c-43ea-a6fa-de300dee0390" (UID: "6f2a3043-b45c-43ea-a6fa-de300dee0390"). InnerVolumeSpecName "kube-api-access-fjfmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:13:49 crc kubenswrapper[4755]: I0317 01:13:49.698485 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6f2a3043-b45c-43ea-a6fa-de300dee0390" (UID: "6f2a3043-b45c-43ea-a6fa-de300dee0390"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:13:49 crc kubenswrapper[4755]: I0317 01:13:49.724801 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-inventory" (OuterVolumeSpecName: "inventory") pod "6f2a3043-b45c-43ea-a6fa-de300dee0390" (UID: "6f2a3043-b45c-43ea-a6fa-de300dee0390"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:13:49 crc kubenswrapper[4755]: I0317 01:13:49.736042 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6f2a3043-b45c-43ea-a6fa-de300dee0390" (UID: "6f2a3043-b45c-43ea-a6fa-de300dee0390"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:13:49 crc kubenswrapper[4755]: I0317 01:13:49.795040 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:49 crc kubenswrapper[4755]: I0317 01:13:49.795090 4755 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:49 crc kubenswrapper[4755]: I0317 01:13:49.795110 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjfmq\" (UniqueName: \"kubernetes.io/projected/6f2a3043-b45c-43ea-a6fa-de300dee0390-kube-api-access-fjfmq\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:49 crc kubenswrapper[4755]: I0317 01:13:49.795134 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:49 crc kubenswrapper[4755]: I0317 01:13:49.795154 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6f2a3043-b45c-43ea-a6fa-de300dee0390-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.104385 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" event={"ID":"6f2a3043-b45c-43ea-a6fa-de300dee0390","Type":"ContainerDied","Data":"83e1cedc5893337972553ddb6a80c737d332c6c29d6db52d2d378deee63f4886"} Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.104430 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83e1cedc5893337972553ddb6a80c737d332c6c29d6db52d2d378deee63f4886" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.104507 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.206687 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp"] Mar 17 01:13:50 crc kubenswrapper[4755]: E0317 01:13:50.207381 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f2a3043-b45c-43ea-a6fa-de300dee0390" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.207420 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f2a3043-b45c-43ea-a6fa-de300dee0390" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.207888 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f2a3043-b45c-43ea-a6fa-de300dee0390" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.209193 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.212676 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.212733 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.212858 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.212890 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.212941 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.260863 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp"] Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.307066 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.307151 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w785g\" (UniqueName: \"kubernetes.io/projected/56db739c-5c0b-445c-bb95-d16d76daea1b-kube-api-access-w785g\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.307181 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.307394 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.307620 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.410538 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.410719 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-w785g\" (UniqueName: \"kubernetes.io/projected/56db739c-5c0b-445c-bb95-d16d76daea1b-kube-api-access-w785g\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.410819 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.410901 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.410976 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.416229 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-ssh-key-openstack-edpm-ipam\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.418493 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.418632 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.421731 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.443120 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w785g\" (UniqueName: \"kubernetes.io/projected/56db739c-5c0b-445c-bb95-d16d76daea1b-kube-api-access-w785g\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:13:50 crc kubenswrapper[4755]: I0317 01:13:50.579022 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:13:51 crc kubenswrapper[4755]: I0317 01:13:51.181328 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp"] Mar 17 01:13:51 crc kubenswrapper[4755]: I0317 01:13:51.315933 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-chf29" Mar 17 01:13:51 crc kubenswrapper[4755]: I0317 01:13:51.399236 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-chf29" Mar 17 01:13:51 crc kubenswrapper[4755]: I0317 01:13:51.607105 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-chf29"] Mar 17 01:13:51 crc kubenswrapper[4755]: I0317 01:13:51.786931 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5g7nk"] Mar 17 01:13:51 crc kubenswrapper[4755]: I0317 01:13:51.787413 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5g7nk" podUID="977e1302-92e4-4a2d-bdc1-71027bd1ac1c" containerName="registry-server" containerID="cri-o://e7b534284b3928e7d09b174d4a9a52910f4c898daf798967ca4cb035ac357358" gracePeriod=2 Mar 17 01:13:52 crc kubenswrapper[4755]: I0317 01:13:52.146252 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" event={"ID":"56db739c-5c0b-445c-bb95-d16d76daea1b","Type":"ContainerStarted","Data":"5ebce60a158e372902a2766aaf59c982f9d7fedf9c99e3e0379f342ec6e6e538"} Mar 17 01:13:52 crc kubenswrapper[4755]: I0317 01:13:52.154674 4755 generic.go:334] "Generic (PLEG): container finished" podID="977e1302-92e4-4a2d-bdc1-71027bd1ac1c" containerID="e7b534284b3928e7d09b174d4a9a52910f4c898daf798967ca4cb035ac357358" exitCode=0 Mar 17 01:13:52 crc kubenswrapper[4755]: 
I0317 01:13:52.155291 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g7nk" event={"ID":"977e1302-92e4-4a2d-bdc1-71027bd1ac1c","Type":"ContainerDied","Data":"e7b534284b3928e7d09b174d4a9a52910f4c898daf798967ca4cb035ac357358"} Mar 17 01:13:52 crc kubenswrapper[4755]: I0317 01:13:52.307901 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5g7nk" Mar 17 01:13:52 crc kubenswrapper[4755]: I0317 01:13:52.467650 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-utilities\") pod \"977e1302-92e4-4a2d-bdc1-71027bd1ac1c\" (UID: \"977e1302-92e4-4a2d-bdc1-71027bd1ac1c\") " Mar 17 01:13:52 crc kubenswrapper[4755]: I0317 01:13:52.468077 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-catalog-content\") pod \"977e1302-92e4-4a2d-bdc1-71027bd1ac1c\" (UID: \"977e1302-92e4-4a2d-bdc1-71027bd1ac1c\") " Mar 17 01:13:52 crc kubenswrapper[4755]: I0317 01:13:52.468232 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnv7w\" (UniqueName: \"kubernetes.io/projected/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-kube-api-access-lnv7w\") pod \"977e1302-92e4-4a2d-bdc1-71027bd1ac1c\" (UID: \"977e1302-92e4-4a2d-bdc1-71027bd1ac1c\") " Mar 17 01:13:52 crc kubenswrapper[4755]: I0317 01:13:52.468806 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-utilities" (OuterVolumeSpecName: "utilities") pod "977e1302-92e4-4a2d-bdc1-71027bd1ac1c" (UID: "977e1302-92e4-4a2d-bdc1-71027bd1ac1c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:13:52 crc kubenswrapper[4755]: I0317 01:13:52.473788 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-kube-api-access-lnv7w" (OuterVolumeSpecName: "kube-api-access-lnv7w") pod "977e1302-92e4-4a2d-bdc1-71027bd1ac1c" (UID: "977e1302-92e4-4a2d-bdc1-71027bd1ac1c"). InnerVolumeSpecName "kube-api-access-lnv7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:13:52 crc kubenswrapper[4755]: I0317 01:13:52.570645 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnv7w\" (UniqueName: \"kubernetes.io/projected/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-kube-api-access-lnv7w\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:52 crc kubenswrapper[4755]: I0317 01:13:52.570673 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:52 crc kubenswrapper[4755]: I0317 01:13:52.608210 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "977e1302-92e4-4a2d-bdc1-71027bd1ac1c" (UID: "977e1302-92e4-4a2d-bdc1-71027bd1ac1c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:13:52 crc kubenswrapper[4755]: I0317 01:13:52.672828 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/977e1302-92e4-4a2d-bdc1-71027bd1ac1c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:13:53 crc kubenswrapper[4755]: I0317 01:13:53.164095 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" event={"ID":"56db739c-5c0b-445c-bb95-d16d76daea1b","Type":"ContainerStarted","Data":"649963dbf73bf2350127279af4fc98e8e5c7dc36ff64afc8e7ba6ccdb547f59e"} Mar 17 01:13:53 crc kubenswrapper[4755]: I0317 01:13:53.166727 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g7nk" event={"ID":"977e1302-92e4-4a2d-bdc1-71027bd1ac1c","Type":"ContainerDied","Data":"f790818efa63722105688585dff507dec2d0535c2eea25b8a5c9a6e2f8dd819b"} Mar 17 01:13:53 crc kubenswrapper[4755]: I0317 01:13:53.166761 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5g7nk" Mar 17 01:13:53 crc kubenswrapper[4755]: I0317 01:13:53.166787 4755 scope.go:117] "RemoveContainer" containerID="e7b534284b3928e7d09b174d4a9a52910f4c898daf798967ca4cb035ac357358" Mar 17 01:13:53 crc kubenswrapper[4755]: I0317 01:13:53.182567 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" podStartSLOduration=2.550519909 podStartE2EDuration="3.182549001s" podCreationTimestamp="2026-03-17 01:13:50 +0000 UTC" firstStartedPulling="2026-03-17 01:13:51.180528437 +0000 UTC m=+3105.939980720" lastFinishedPulling="2026-03-17 01:13:51.812557489 +0000 UTC m=+3106.572009812" observedRunningTime="2026-03-17 01:13:53.178293015 +0000 UTC m=+3107.937745298" watchObservedRunningTime="2026-03-17 01:13:53.182549001 +0000 UTC m=+3107.942001284" Mar 17 01:13:53 crc kubenswrapper[4755]: I0317 01:13:53.188290 4755 scope.go:117] "RemoveContainer" containerID="ccb7245c4f7b11f637f23b8f323e90f2e38dd035813885cc4e344be479345c16" Mar 17 01:13:53 crc kubenswrapper[4755]: I0317 01:13:53.210718 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5g7nk"] Mar 17 01:13:53 crc kubenswrapper[4755]: I0317 01:13:53.212587 4755 scope.go:117] "RemoveContainer" containerID="eff83594d1ec24daf6492fb27a5471811f3679e716cf509b99cd80b71976c7d2" Mar 17 01:13:53 crc kubenswrapper[4755]: I0317 01:13:53.221326 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5g7nk"] Mar 17 01:13:54 crc kubenswrapper[4755]: I0317 01:13:54.261133 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="977e1302-92e4-4a2d-bdc1-71027bd1ac1c" path="/var/lib/kubelet/pods/977e1302-92e4-4a2d-bdc1-71027bd1ac1c/volumes" Mar 17 01:14:00 crc kubenswrapper[4755]: I0317 01:14:00.173832 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29561834-cxjnb"] Mar 17 01:14:00 crc kubenswrapper[4755]: E0317 01:14:00.175175 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977e1302-92e4-4a2d-bdc1-71027bd1ac1c" containerName="extract-content" Mar 17 01:14:00 crc kubenswrapper[4755]: I0317 01:14:00.175198 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="977e1302-92e4-4a2d-bdc1-71027bd1ac1c" containerName="extract-content" Mar 17 01:14:00 crc kubenswrapper[4755]: E0317 01:14:00.175230 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977e1302-92e4-4a2d-bdc1-71027bd1ac1c" containerName="registry-server" Mar 17 01:14:00 crc kubenswrapper[4755]: I0317 01:14:00.175242 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="977e1302-92e4-4a2d-bdc1-71027bd1ac1c" containerName="registry-server" Mar 17 01:14:00 crc kubenswrapper[4755]: E0317 01:14:00.175269 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977e1302-92e4-4a2d-bdc1-71027bd1ac1c" containerName="extract-utilities" Mar 17 01:14:00 crc kubenswrapper[4755]: I0317 01:14:00.175282 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="977e1302-92e4-4a2d-bdc1-71027bd1ac1c" containerName="extract-utilities" Mar 17 01:14:00 crc kubenswrapper[4755]: I0317 01:14:00.175674 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="977e1302-92e4-4a2d-bdc1-71027bd1ac1c" containerName="registry-server" Mar 17 01:14:00 crc kubenswrapper[4755]: I0317 01:14:00.176939 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561834-cxjnb" Mar 17 01:14:00 crc kubenswrapper[4755]: I0317 01:14:00.179572 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:14:00 crc kubenswrapper[4755]: I0317 01:14:00.179766 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:14:00 crc kubenswrapper[4755]: I0317 01:14:00.180937 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:14:00 crc kubenswrapper[4755]: I0317 01:14:00.189582 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561834-cxjnb"] Mar 17 01:14:00 crc kubenswrapper[4755]: I0317 01:14:00.371872 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvbjm\" (UniqueName: \"kubernetes.io/projected/b33bf75b-a4c4-4546-91d2-c9cc30d2b369-kube-api-access-kvbjm\") pod \"auto-csr-approver-29561834-cxjnb\" (UID: \"b33bf75b-a4c4-4546-91d2-c9cc30d2b369\") " pod="openshift-infra/auto-csr-approver-29561834-cxjnb" Mar 17 01:14:00 crc kubenswrapper[4755]: I0317 01:14:00.474282 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvbjm\" (UniqueName: \"kubernetes.io/projected/b33bf75b-a4c4-4546-91d2-c9cc30d2b369-kube-api-access-kvbjm\") pod \"auto-csr-approver-29561834-cxjnb\" (UID: \"b33bf75b-a4c4-4546-91d2-c9cc30d2b369\") " pod="openshift-infra/auto-csr-approver-29561834-cxjnb" Mar 17 01:14:00 crc kubenswrapper[4755]: I0317 01:14:00.496495 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvbjm\" (UniqueName: \"kubernetes.io/projected/b33bf75b-a4c4-4546-91d2-c9cc30d2b369-kube-api-access-kvbjm\") pod \"auto-csr-approver-29561834-cxjnb\" (UID: \"b33bf75b-a4c4-4546-91d2-c9cc30d2b369\") " 
pod="openshift-infra/auto-csr-approver-29561834-cxjnb" Mar 17 01:14:00 crc kubenswrapper[4755]: I0317 01:14:00.533284 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561834-cxjnb" Mar 17 01:14:00 crc kubenswrapper[4755]: I0317 01:14:00.879294 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561834-cxjnb"] Mar 17 01:14:01 crc kubenswrapper[4755]: I0317 01:14:01.091805 4755 scope.go:117] "RemoveContainer" containerID="4ffb41207329bc493c506d401a34a469e551a8f2e864aa30c6cfd892d7c7b349" Mar 17 01:14:01 crc kubenswrapper[4755]: I0317 01:14:01.179683 4755 scope.go:117] "RemoveContainer" containerID="41f7a57d57a0d6adecf75db40817a16821640e3614f4d1b0e21ad7474962c0c0" Mar 17 01:14:01 crc kubenswrapper[4755]: I0317 01:14:01.255937 4755 scope.go:117] "RemoveContainer" containerID="ae0ff5b23c838feb48eb87c5989c59c4b3c4f15e4c0828ba54fe1b9eb157963a" Mar 17 01:14:01 crc kubenswrapper[4755]: I0317 01:14:01.292652 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561834-cxjnb" event={"ID":"b33bf75b-a4c4-4546-91d2-c9cc30d2b369","Type":"ContainerStarted","Data":"bb6694862e9202ef37f4eba2168d35bc7fe12629427efd54a54ee382f8ca8b7f"} Mar 17 01:14:01 crc kubenswrapper[4755]: I0317 01:14:01.345364 4755 scope.go:117] "RemoveContainer" containerID="b4a18d48b0a0f1a14180cceee5940b187e1df76f2cc4fe30702a630f9fd21c54" Mar 17 01:14:01 crc kubenswrapper[4755]: I0317 01:14:01.379773 4755 scope.go:117] "RemoveContainer" containerID="347c2934d590ee41f747b23171637ace7c126262dfb4b8af2c54f0792fac2295" Mar 17 01:14:01 crc kubenswrapper[4755]: I0317 01:14:01.456266 4755 scope.go:117] "RemoveContainer" containerID="345fb61befa7824b0566f4f85a41736d42d11b440c3edab7342208a2cb2a1d01" Mar 17 01:14:01 crc kubenswrapper[4755]: I0317 01:14:01.495361 4755 scope.go:117] "RemoveContainer" 
containerID="d6bea1b8db6d043636a77e356d58b2f74f2aa3ece2c18eac6a510df385d2f3be" Mar 17 01:14:01 crc kubenswrapper[4755]: I0317 01:14:01.645298 4755 scope.go:117] "RemoveContainer" containerID="b0836e19b8d58cf1fe73b1c06ab90fdad4a0cd76c6939e5225388f26d90e6398" Mar 17 01:14:01 crc kubenswrapper[4755]: I0317 01:14:01.695512 4755 scope.go:117] "RemoveContainer" containerID="52eab38fda4ad018fbd1b698102709d31c39c1b7e40a0fcd138bd88507a90303" Mar 17 01:14:01 crc kubenswrapper[4755]: I0317 01:14:01.743532 4755 scope.go:117] "RemoveContainer" containerID="866e087d43d05d588012b8e6376af6b4d5ad3069951f70a635c041b007202924" Mar 17 01:14:01 crc kubenswrapper[4755]: I0317 01:14:01.777797 4755 scope.go:117] "RemoveContainer" containerID="2416256bb82ee86e5f695036809e20960838ad5000395eccaf09ee8c3bf17784" Mar 17 01:14:01 crc kubenswrapper[4755]: I0317 01:14:01.838684 4755 scope.go:117] "RemoveContainer" containerID="1cba464c62b67c2b8db0cd83e6d9ae1064bdb85f2cf05f2390a003292d6f12b1" Mar 17 01:14:01 crc kubenswrapper[4755]: I0317 01:14:01.878772 4755 scope.go:117] "RemoveContainer" containerID="2fcb85a0d27bb802961885c8232dcebfb4bb33e80a312cd5f49697b248b9d89f" Mar 17 01:14:01 crc kubenswrapper[4755]: I0317 01:14:01.936822 4755 scope.go:117] "RemoveContainer" containerID="fb1fdf85cce1c6d66554fe1daa52304cdabefce08318c8ac6e59cd67c50b869f" Mar 17 01:14:01 crc kubenswrapper[4755]: I0317 01:14:01.977656 4755 scope.go:117] "RemoveContainer" containerID="d782e16dd6d27e698e8123e6f52a7702660886aaec567ba7f16b095fc6615008" Mar 17 01:14:03 crc kubenswrapper[4755]: I0317 01:14:03.250842 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:14:03 crc kubenswrapper[4755]: E0317 01:14:03.251670 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:14:03 crc kubenswrapper[4755]: I0317 01:14:03.330622 4755 generic.go:334] "Generic (PLEG): container finished" podID="b33bf75b-a4c4-4546-91d2-c9cc30d2b369" containerID="3b7a76c12a1d6425881cf17712aa28a841af43918dc7b246c41bdb82c4dae977" exitCode=0 Mar 17 01:14:03 crc kubenswrapper[4755]: I0317 01:14:03.330677 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561834-cxjnb" event={"ID":"b33bf75b-a4c4-4546-91d2-c9cc30d2b369","Type":"ContainerDied","Data":"3b7a76c12a1d6425881cf17712aa28a841af43918dc7b246c41bdb82c4dae977"} Mar 17 01:14:04 crc kubenswrapper[4755]: I0317 01:14:04.820421 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561834-cxjnb" Mar 17 01:14:04 crc kubenswrapper[4755]: I0317 01:14:04.999429 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvbjm\" (UniqueName: \"kubernetes.io/projected/b33bf75b-a4c4-4546-91d2-c9cc30d2b369-kube-api-access-kvbjm\") pod \"b33bf75b-a4c4-4546-91d2-c9cc30d2b369\" (UID: \"b33bf75b-a4c4-4546-91d2-c9cc30d2b369\") " Mar 17 01:14:05 crc kubenswrapper[4755]: I0317 01:14:05.005166 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b33bf75b-a4c4-4546-91d2-c9cc30d2b369-kube-api-access-kvbjm" (OuterVolumeSpecName: "kube-api-access-kvbjm") pod "b33bf75b-a4c4-4546-91d2-c9cc30d2b369" (UID: "b33bf75b-a4c4-4546-91d2-c9cc30d2b369"). InnerVolumeSpecName "kube-api-access-kvbjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:14:05 crc kubenswrapper[4755]: I0317 01:14:05.102992 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvbjm\" (UniqueName: \"kubernetes.io/projected/b33bf75b-a4c4-4546-91d2-c9cc30d2b369-kube-api-access-kvbjm\") on node \"crc\" DevicePath \"\"" Mar 17 01:14:05 crc kubenswrapper[4755]: I0317 01:14:05.365156 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561834-cxjnb" event={"ID":"b33bf75b-a4c4-4546-91d2-c9cc30d2b369","Type":"ContainerDied","Data":"bb6694862e9202ef37f4eba2168d35bc7fe12629427efd54a54ee382f8ca8b7f"} Mar 17 01:14:05 crc kubenswrapper[4755]: I0317 01:14:05.365210 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb6694862e9202ef37f4eba2168d35bc7fe12629427efd54a54ee382f8ca8b7f" Mar 17 01:14:05 crc kubenswrapper[4755]: I0317 01:14:05.365309 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561834-cxjnb" Mar 17 01:14:05 crc kubenswrapper[4755]: I0317 01:14:05.931900 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561828-2s4zh"] Mar 17 01:14:05 crc kubenswrapper[4755]: I0317 01:14:05.944158 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561828-2s4zh"] Mar 17 01:14:06 crc kubenswrapper[4755]: I0317 01:14:06.271139 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="483d1eaa-decc-40bc-844a-862fddf0d986" path="/var/lib/kubelet/pods/483d1eaa-decc-40bc-844a-862fddf0d986/volumes" Mar 17 01:14:16 crc kubenswrapper[4755]: I0317 01:14:16.260472 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:14:16 crc kubenswrapper[4755]: E0317 01:14:16.261551 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:14:28 crc kubenswrapper[4755]: I0317 01:14:28.248229 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:14:28 crc kubenswrapper[4755]: E0317 01:14:28.249371 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:14:43 crc kubenswrapper[4755]: I0317 01:14:43.249121 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:14:43 crc kubenswrapper[4755]: E0317 01:14:43.250207 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:14:55 crc kubenswrapper[4755]: I0317 01:14:55.248419 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:14:55 crc kubenswrapper[4755]: E0317 01:14:55.249497 4755 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:15:00 crc kubenswrapper[4755]: I0317 01:15:00.179302 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z"] Mar 17 01:15:00 crc kubenswrapper[4755]: E0317 01:15:00.180716 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b33bf75b-a4c4-4546-91d2-c9cc30d2b369" containerName="oc" Mar 17 01:15:00 crc kubenswrapper[4755]: I0317 01:15:00.180742 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b33bf75b-a4c4-4546-91d2-c9cc30d2b369" containerName="oc" Mar 17 01:15:00 crc kubenswrapper[4755]: I0317 01:15:00.181116 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b33bf75b-a4c4-4546-91d2-c9cc30d2b369" containerName="oc" Mar 17 01:15:00 crc kubenswrapper[4755]: I0317 01:15:00.182426 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z" Mar 17 01:15:00 crc kubenswrapper[4755]: I0317 01:15:00.186120 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 01:15:00 crc kubenswrapper[4755]: I0317 01:15:00.186428 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 01:15:00 crc kubenswrapper[4755]: I0317 01:15:00.193816 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z"] Mar 17 01:15:00 crc kubenswrapper[4755]: I0317 01:15:00.321483 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93a6317f-9359-4782-ac20-e4315f37a32e-secret-volume\") pod \"collect-profiles-29561835-ck95z\" (UID: \"93a6317f-9359-4782-ac20-e4315f37a32e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z" Mar 17 01:15:00 crc kubenswrapper[4755]: I0317 01:15:00.322322 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93a6317f-9359-4782-ac20-e4315f37a32e-config-volume\") pod \"collect-profiles-29561835-ck95z\" (UID: \"93a6317f-9359-4782-ac20-e4315f37a32e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z" Mar 17 01:15:00 crc kubenswrapper[4755]: I0317 01:15:00.323576 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc8lc\" (UniqueName: \"kubernetes.io/projected/93a6317f-9359-4782-ac20-e4315f37a32e-kube-api-access-mc8lc\") pod \"collect-profiles-29561835-ck95z\" (UID: \"93a6317f-9359-4782-ac20-e4315f37a32e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z" Mar 17 01:15:00 crc kubenswrapper[4755]: I0317 01:15:00.425893 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc8lc\" (UniqueName: \"kubernetes.io/projected/93a6317f-9359-4782-ac20-e4315f37a32e-kube-api-access-mc8lc\") pod \"collect-profiles-29561835-ck95z\" (UID: \"93a6317f-9359-4782-ac20-e4315f37a32e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z" Mar 17 01:15:00 crc kubenswrapper[4755]: I0317 01:15:00.426098 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93a6317f-9359-4782-ac20-e4315f37a32e-secret-volume\") pod \"collect-profiles-29561835-ck95z\" (UID: \"93a6317f-9359-4782-ac20-e4315f37a32e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z" Mar 17 01:15:00 crc kubenswrapper[4755]: I0317 01:15:00.426311 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93a6317f-9359-4782-ac20-e4315f37a32e-config-volume\") pod \"collect-profiles-29561835-ck95z\" (UID: \"93a6317f-9359-4782-ac20-e4315f37a32e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z" Mar 17 01:15:00 crc kubenswrapper[4755]: I0317 01:15:00.428053 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93a6317f-9359-4782-ac20-e4315f37a32e-config-volume\") pod \"collect-profiles-29561835-ck95z\" (UID: \"93a6317f-9359-4782-ac20-e4315f37a32e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z" Mar 17 01:15:00 crc kubenswrapper[4755]: I0317 01:15:00.433101 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/93a6317f-9359-4782-ac20-e4315f37a32e-secret-volume\") pod \"collect-profiles-29561835-ck95z\" (UID: \"93a6317f-9359-4782-ac20-e4315f37a32e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z" Mar 17 01:15:00 crc kubenswrapper[4755]: I0317 01:15:00.444326 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc8lc\" (UniqueName: \"kubernetes.io/projected/93a6317f-9359-4782-ac20-e4315f37a32e-kube-api-access-mc8lc\") pod \"collect-profiles-29561835-ck95z\" (UID: \"93a6317f-9359-4782-ac20-e4315f37a32e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z" Mar 17 01:15:00 crc kubenswrapper[4755]: I0317 01:15:00.535883 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z" Mar 17 01:15:01 crc kubenswrapper[4755]: I0317 01:15:01.059615 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z"] Mar 17 01:15:01 crc kubenswrapper[4755]: I0317 01:15:01.166538 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z" event={"ID":"93a6317f-9359-4782-ac20-e4315f37a32e","Type":"ContainerStarted","Data":"2fe6284e1ee309bb721b486f025a5d4e78dbc3a0b61a4d0f5ee4dff29358fbf9"} Mar 17 01:15:02 crc kubenswrapper[4755]: I0317 01:15:02.231873 4755 generic.go:334] "Generic (PLEG): container finished" podID="93a6317f-9359-4782-ac20-e4315f37a32e" containerID="debc59f39976a4d9a1da346ac8b2ce9129d3489e99b37af27d558842e1ab5f33" exitCode=0 Mar 17 01:15:02 crc kubenswrapper[4755]: I0317 01:15:02.232139 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z" 
event={"ID":"93a6317f-9359-4782-ac20-e4315f37a32e","Type":"ContainerDied","Data":"debc59f39976a4d9a1da346ac8b2ce9129d3489e99b37af27d558842e1ab5f33"} Mar 17 01:15:02 crc kubenswrapper[4755]: I0317 01:15:02.407791 4755 scope.go:117] "RemoveContainer" containerID="5c6f062ac79dea1043f91101b877d379afa54bdc9fc0e469eaf96787ff96ec2b" Mar 17 01:15:02 crc kubenswrapper[4755]: I0317 01:15:02.435027 4755 scope.go:117] "RemoveContainer" containerID="624e7b645e0d05c894a78c96d0f4900e97dbfbd9d97288367172da04c7f9ae75" Mar 17 01:15:03 crc kubenswrapper[4755]: I0317 01:15:03.676946 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z" Mar 17 01:15:03 crc kubenswrapper[4755]: I0317 01:15:03.724924 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93a6317f-9359-4782-ac20-e4315f37a32e-secret-volume\") pod \"93a6317f-9359-4782-ac20-e4315f37a32e\" (UID: \"93a6317f-9359-4782-ac20-e4315f37a32e\") " Mar 17 01:15:03 crc kubenswrapper[4755]: I0317 01:15:03.724994 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93a6317f-9359-4782-ac20-e4315f37a32e-config-volume\") pod \"93a6317f-9359-4782-ac20-e4315f37a32e\" (UID: \"93a6317f-9359-4782-ac20-e4315f37a32e\") " Mar 17 01:15:03 crc kubenswrapper[4755]: I0317 01:15:03.725168 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc8lc\" (UniqueName: \"kubernetes.io/projected/93a6317f-9359-4782-ac20-e4315f37a32e-kube-api-access-mc8lc\") pod \"93a6317f-9359-4782-ac20-e4315f37a32e\" (UID: \"93a6317f-9359-4782-ac20-e4315f37a32e\") " Mar 17 01:15:03 crc kubenswrapper[4755]: I0317 01:15:03.725908 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/93a6317f-9359-4782-ac20-e4315f37a32e-config-volume" (OuterVolumeSpecName: "config-volume") pod "93a6317f-9359-4782-ac20-e4315f37a32e" (UID: "93a6317f-9359-4782-ac20-e4315f37a32e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:15:03 crc kubenswrapper[4755]: I0317 01:15:03.726213 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93a6317f-9359-4782-ac20-e4315f37a32e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:03 crc kubenswrapper[4755]: I0317 01:15:03.732151 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93a6317f-9359-4782-ac20-e4315f37a32e-kube-api-access-mc8lc" (OuterVolumeSpecName: "kube-api-access-mc8lc") pod "93a6317f-9359-4782-ac20-e4315f37a32e" (UID: "93a6317f-9359-4782-ac20-e4315f37a32e"). InnerVolumeSpecName "kube-api-access-mc8lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:15:03 crc kubenswrapper[4755]: I0317 01:15:03.732918 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a6317f-9359-4782-ac20-e4315f37a32e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "93a6317f-9359-4782-ac20-e4315f37a32e" (UID: "93a6317f-9359-4782-ac20-e4315f37a32e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:15:03 crc kubenswrapper[4755]: I0317 01:15:03.828837 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93a6317f-9359-4782-ac20-e4315f37a32e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:03 crc kubenswrapper[4755]: I0317 01:15:03.828917 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc8lc\" (UniqueName: \"kubernetes.io/projected/93a6317f-9359-4782-ac20-e4315f37a32e-kube-api-access-mc8lc\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:04 crc kubenswrapper[4755]: I0317 01:15:04.259020 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z" Mar 17 01:15:04 crc kubenswrapper[4755]: I0317 01:15:04.266755 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z" event={"ID":"93a6317f-9359-4782-ac20-e4315f37a32e","Type":"ContainerDied","Data":"2fe6284e1ee309bb721b486f025a5d4e78dbc3a0b61a4d0f5ee4dff29358fbf9"} Mar 17 01:15:04 crc kubenswrapper[4755]: I0317 01:15:04.266817 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fe6284e1ee309bb721b486f025a5d4e78dbc3a0b61a4d0f5ee4dff29358fbf9" Mar 17 01:15:04 crc kubenswrapper[4755]: I0317 01:15:04.783186 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"] Mar 17 01:15:04 crc kubenswrapper[4755]: I0317 01:15:04.795011 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561790-5bdnm"] Mar 17 01:15:06 crc kubenswrapper[4755]: I0317 01:15:06.261699 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:15:06 crc kubenswrapper[4755]: E0317 
01:15:06.264530 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:15:06 crc kubenswrapper[4755]: I0317 01:15:06.268737 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5a3a21f-65f3-4591-9b48-b640c5e264be" path="/var/lib/kubelet/pods/d5a3a21f-65f3-4591-9b48-b640c5e264be/volumes" Mar 17 01:15:18 crc kubenswrapper[4755]: I0317 01:15:18.248699 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:15:18 crc kubenswrapper[4755]: E0317 01:15:18.250005 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:15:29 crc kubenswrapper[4755]: I0317 01:15:29.248959 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:15:29 crc kubenswrapper[4755]: E0317 01:15:29.249616 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" 
podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:15:43 crc kubenswrapper[4755]: I0317 01:15:43.248186 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:15:43 crc kubenswrapper[4755]: E0317 01:15:43.248805 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:15:46 crc kubenswrapper[4755]: I0317 01:15:46.773523 4755 generic.go:334] "Generic (PLEG): container finished" podID="56db739c-5c0b-445c-bb95-d16d76daea1b" containerID="649963dbf73bf2350127279af4fc98e8e5c7dc36ff64afc8e7ba6ccdb547f59e" exitCode=0 Mar 17 01:15:46 crc kubenswrapper[4755]: I0317 01:15:46.773655 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" event={"ID":"56db739c-5c0b-445c-bb95-d16d76daea1b","Type":"ContainerDied","Data":"649963dbf73bf2350127279af4fc98e8e5c7dc36ff64afc8e7ba6ccdb547f59e"} Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.372385 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.489380 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w785g\" (UniqueName: \"kubernetes.io/projected/56db739c-5c0b-445c-bb95-d16d76daea1b-kube-api-access-w785g\") pod \"56db739c-5c0b-445c-bb95-d16d76daea1b\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.489574 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-ssh-key-openstack-edpm-ipam\") pod \"56db739c-5c0b-445c-bb95-d16d76daea1b\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.489658 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-inventory\") pod \"56db739c-5c0b-445c-bb95-d16d76daea1b\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.489684 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-bootstrap-combined-ca-bundle\") pod \"56db739c-5c0b-445c-bb95-d16d76daea1b\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.489716 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-ceph\") pod \"56db739c-5c0b-445c-bb95-d16d76daea1b\" (UID: \"56db739c-5c0b-445c-bb95-d16d76daea1b\") " Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.496499 4755 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56db739c-5c0b-445c-bb95-d16d76daea1b-kube-api-access-w785g" (OuterVolumeSpecName: "kube-api-access-w785g") pod "56db739c-5c0b-445c-bb95-d16d76daea1b" (UID: "56db739c-5c0b-445c-bb95-d16d76daea1b"). InnerVolumeSpecName "kube-api-access-w785g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.496797 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-ceph" (OuterVolumeSpecName: "ceph") pod "56db739c-5c0b-445c-bb95-d16d76daea1b" (UID: "56db739c-5c0b-445c-bb95-d16d76daea1b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.498417 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "56db739c-5c0b-445c-bb95-d16d76daea1b" (UID: "56db739c-5c0b-445c-bb95-d16d76daea1b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.527997 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-inventory" (OuterVolumeSpecName: "inventory") pod "56db739c-5c0b-445c-bb95-d16d76daea1b" (UID: "56db739c-5c0b-445c-bb95-d16d76daea1b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.542008 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "56db739c-5c0b-445c-bb95-d16d76daea1b" (UID: "56db739c-5c0b-445c-bb95-d16d76daea1b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.592045 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.592073 4755 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.592083 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.592091 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w785g\" (UniqueName: \"kubernetes.io/projected/56db739c-5c0b-445c-bb95-d16d76daea1b-kube-api-access-w785g\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.592102 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56db739c-5c0b-445c-bb95-d16d76daea1b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.796234 4755 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" event={"ID":"56db739c-5c0b-445c-bb95-d16d76daea1b","Type":"ContainerDied","Data":"5ebce60a158e372902a2766aaf59c982f9d7fedf9c99e3e0379f342ec6e6e538"} Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.796271 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ebce60a158e372902a2766aaf59c982f9d7fedf9c99e3e0379f342ec6e6e538" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.796285 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.949000 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z"] Mar 17 01:15:48 crc kubenswrapper[4755]: E0317 01:15:48.949745 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a6317f-9359-4782-ac20-e4315f37a32e" containerName="collect-profiles" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.949768 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a6317f-9359-4782-ac20-e4315f37a32e" containerName="collect-profiles" Mar 17 01:15:48 crc kubenswrapper[4755]: E0317 01:15:48.949819 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56db739c-5c0b-445c-bb95-d16d76daea1b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.949829 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="56db739c-5c0b-445c-bb95-d16d76daea1b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.950071 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="93a6317f-9359-4782-ac20-e4315f37a32e" containerName="collect-profiles" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.950111 4755 
memory_manager.go:354] "RemoveStaleState removing state" podUID="56db739c-5c0b-445c-bb95-d16d76daea1b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.951140 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.957593 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.957630 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.957828 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.957921 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.958467 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:15:48 crc kubenswrapper[4755]: I0317 01:15:48.985256 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z"] Mar 17 01:15:49 crc kubenswrapper[4755]: I0317 01:15:49.008840 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z\" (UID: \"1b83398a-b089-4a14-9432-5154d7cd107c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" Mar 17 01:15:49 crc kubenswrapper[4755]: I0317 
01:15:49.008898 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z\" (UID: \"1b83398a-b089-4a14-9432-5154d7cd107c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" Mar 17 01:15:49 crc kubenswrapper[4755]: I0317 01:15:49.008943 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vstkq\" (UniqueName: \"kubernetes.io/projected/1b83398a-b089-4a14-9432-5154d7cd107c-kube-api-access-vstkq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z\" (UID: \"1b83398a-b089-4a14-9432-5154d7cd107c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" Mar 17 01:15:49 crc kubenswrapper[4755]: I0317 01:15:49.009165 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z\" (UID: \"1b83398a-b089-4a14-9432-5154d7cd107c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" Mar 17 01:15:49 crc kubenswrapper[4755]: I0317 01:15:49.111425 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z\" (UID: \"1b83398a-b089-4a14-9432-5154d7cd107c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" Mar 17 01:15:49 crc kubenswrapper[4755]: I0317 01:15:49.111564 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vstkq\" (UniqueName: 
\"kubernetes.io/projected/1b83398a-b089-4a14-9432-5154d7cd107c-kube-api-access-vstkq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z\" (UID: \"1b83398a-b089-4a14-9432-5154d7cd107c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" Mar 17 01:15:49 crc kubenswrapper[4755]: I0317 01:15:49.111687 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z\" (UID: \"1b83398a-b089-4a14-9432-5154d7cd107c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" Mar 17 01:15:49 crc kubenswrapper[4755]: I0317 01:15:49.111981 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z\" (UID: \"1b83398a-b089-4a14-9432-5154d7cd107c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" Mar 17 01:15:49 crc kubenswrapper[4755]: I0317 01:15:49.117747 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z\" (UID: \"1b83398a-b089-4a14-9432-5154d7cd107c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" Mar 17 01:15:49 crc kubenswrapper[4755]: I0317 01:15:49.118129 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z\" (UID: 
\"1b83398a-b089-4a14-9432-5154d7cd107c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" Mar 17 01:15:49 crc kubenswrapper[4755]: I0317 01:15:49.118137 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z\" (UID: \"1b83398a-b089-4a14-9432-5154d7cd107c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" Mar 17 01:15:49 crc kubenswrapper[4755]: I0317 01:15:49.130164 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vstkq\" (UniqueName: \"kubernetes.io/projected/1b83398a-b089-4a14-9432-5154d7cd107c-kube-api-access-vstkq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z\" (UID: \"1b83398a-b089-4a14-9432-5154d7cd107c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" Mar 17 01:15:49 crc kubenswrapper[4755]: I0317 01:15:49.276392 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" Mar 17 01:15:49 crc kubenswrapper[4755]: I0317 01:15:49.915341 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z"] Mar 17 01:15:50 crc kubenswrapper[4755]: I0317 01:15:50.827585 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" event={"ID":"1b83398a-b089-4a14-9432-5154d7cd107c","Type":"ContainerStarted","Data":"09334367496a532bcd8cf7e0c15c5efc4c83c08abbc6d6b0ee3a52c68945ccf5"} Mar 17 01:15:51 crc kubenswrapper[4755]: I0317 01:15:51.843031 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" event={"ID":"1b83398a-b089-4a14-9432-5154d7cd107c","Type":"ContainerStarted","Data":"1959128af248650571387c6b06ae38a94eb40f697971ec98155a27c7c7720a2d"} Mar 17 01:15:51 crc kubenswrapper[4755]: I0317 01:15:51.883275 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" podStartSLOduration=3.199488036 podStartE2EDuration="3.883253442s" podCreationTimestamp="2026-03-17 01:15:48 +0000 UTC" firstStartedPulling="2026-03-17 01:15:49.910011213 +0000 UTC m=+3224.669463536" lastFinishedPulling="2026-03-17 01:15:50.593776659 +0000 UTC m=+3225.353228942" observedRunningTime="2026-03-17 01:15:51.871557346 +0000 UTC m=+3226.631009639" watchObservedRunningTime="2026-03-17 01:15:51.883253442 +0000 UTC m=+3226.642705735" Mar 17 01:15:58 crc kubenswrapper[4755]: I0317 01:15:58.248243 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:15:58 crc kubenswrapper[4755]: E0317 01:15:58.249318 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:16:00 crc kubenswrapper[4755]: I0317 01:16:00.212031 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561836-gkv2s"] Mar 17 01:16:00 crc kubenswrapper[4755]: I0317 01:16:00.214460 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561836-gkv2s" Mar 17 01:16:00 crc kubenswrapper[4755]: I0317 01:16:00.216673 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:16:00 crc kubenswrapper[4755]: I0317 01:16:00.216937 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:16:00 crc kubenswrapper[4755]: I0317 01:16:00.217030 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:16:00 crc kubenswrapper[4755]: I0317 01:16:00.228677 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561836-gkv2s"] Mar 17 01:16:00 crc kubenswrapper[4755]: I0317 01:16:00.299000 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sjzf\" (UniqueName: \"kubernetes.io/projected/90071ff4-8824-4645-9881-3ff6157ac1f9-kube-api-access-5sjzf\") pod \"auto-csr-approver-29561836-gkv2s\" (UID: \"90071ff4-8824-4645-9881-3ff6157ac1f9\") " pod="openshift-infra/auto-csr-approver-29561836-gkv2s" Mar 17 01:16:00 crc kubenswrapper[4755]: I0317 01:16:00.401007 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5sjzf\" (UniqueName: \"kubernetes.io/projected/90071ff4-8824-4645-9881-3ff6157ac1f9-kube-api-access-5sjzf\") pod \"auto-csr-approver-29561836-gkv2s\" (UID: \"90071ff4-8824-4645-9881-3ff6157ac1f9\") " pod="openshift-infra/auto-csr-approver-29561836-gkv2s" Mar 17 01:16:00 crc kubenswrapper[4755]: I0317 01:16:00.423321 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sjzf\" (UniqueName: \"kubernetes.io/projected/90071ff4-8824-4645-9881-3ff6157ac1f9-kube-api-access-5sjzf\") pod \"auto-csr-approver-29561836-gkv2s\" (UID: \"90071ff4-8824-4645-9881-3ff6157ac1f9\") " pod="openshift-infra/auto-csr-approver-29561836-gkv2s" Mar 17 01:16:00 crc kubenswrapper[4755]: I0317 01:16:00.545691 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561836-gkv2s" Mar 17 01:16:01 crc kubenswrapper[4755]: I0317 01:16:01.127174 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561836-gkv2s"] Mar 17 01:16:01 crc kubenswrapper[4755]: W0317 01:16:01.133991 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90071ff4_8824_4645_9881_3ff6157ac1f9.slice/crio-ffd57ec956766ecbda8b40e1e457044d3648c4494df97ec87780f1bc8f12e6aa WatchSource:0}: Error finding container ffd57ec956766ecbda8b40e1e457044d3648c4494df97ec87780f1bc8f12e6aa: Status 404 returned error can't find the container with id ffd57ec956766ecbda8b40e1e457044d3648c4494df97ec87780f1bc8f12e6aa Mar 17 01:16:01 crc kubenswrapper[4755]: I0317 01:16:01.952427 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561836-gkv2s" event={"ID":"90071ff4-8824-4645-9881-3ff6157ac1f9","Type":"ContainerStarted","Data":"ffd57ec956766ecbda8b40e1e457044d3648c4494df97ec87780f1bc8f12e6aa"} Mar 17 01:16:02 crc kubenswrapper[4755]: I0317 01:16:02.540063 4755 scope.go:117] 
"RemoveContainer" containerID="f4825087d18115201122d7b8eabd20ef85e8900736fe86837741eda28e4608ca" Mar 17 01:16:02 crc kubenswrapper[4755]: I0317 01:16:02.962363 4755 generic.go:334] "Generic (PLEG): container finished" podID="90071ff4-8824-4645-9881-3ff6157ac1f9" containerID="050eb673b3cf4efa16043552507b74c0e98c31c0da71f841aa5b4a234928f29b" exitCode=0 Mar 17 01:16:02 crc kubenswrapper[4755]: I0317 01:16:02.962409 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561836-gkv2s" event={"ID":"90071ff4-8824-4645-9881-3ff6157ac1f9","Type":"ContainerDied","Data":"050eb673b3cf4efa16043552507b74c0e98c31c0da71f841aa5b4a234928f29b"} Mar 17 01:16:04 crc kubenswrapper[4755]: I0317 01:16:04.465842 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561836-gkv2s" Mar 17 01:16:04 crc kubenswrapper[4755]: I0317 01:16:04.634189 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sjzf\" (UniqueName: \"kubernetes.io/projected/90071ff4-8824-4645-9881-3ff6157ac1f9-kube-api-access-5sjzf\") pod \"90071ff4-8824-4645-9881-3ff6157ac1f9\" (UID: \"90071ff4-8824-4645-9881-3ff6157ac1f9\") " Mar 17 01:16:04 crc kubenswrapper[4755]: I0317 01:16:04.640239 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90071ff4-8824-4645-9881-3ff6157ac1f9-kube-api-access-5sjzf" (OuterVolumeSpecName: "kube-api-access-5sjzf") pod "90071ff4-8824-4645-9881-3ff6157ac1f9" (UID: "90071ff4-8824-4645-9881-3ff6157ac1f9"). InnerVolumeSpecName "kube-api-access-5sjzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:16:04 crc kubenswrapper[4755]: I0317 01:16:04.736974 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sjzf\" (UniqueName: \"kubernetes.io/projected/90071ff4-8824-4645-9881-3ff6157ac1f9-kube-api-access-5sjzf\") on node \"crc\" DevicePath \"\"" Mar 17 01:16:05 crc kubenswrapper[4755]: I0317 01:16:05.038698 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561836-gkv2s" event={"ID":"90071ff4-8824-4645-9881-3ff6157ac1f9","Type":"ContainerDied","Data":"ffd57ec956766ecbda8b40e1e457044d3648c4494df97ec87780f1bc8f12e6aa"} Mar 17 01:16:05 crc kubenswrapper[4755]: I0317 01:16:05.038734 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffd57ec956766ecbda8b40e1e457044d3648c4494df97ec87780f1bc8f12e6aa" Mar 17 01:16:05 crc kubenswrapper[4755]: I0317 01:16:05.038779 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561836-gkv2s" Mar 17 01:16:05 crc kubenswrapper[4755]: I0317 01:16:05.549811 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561830-xpx96"] Mar 17 01:16:05 crc kubenswrapper[4755]: I0317 01:16:05.558140 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561830-xpx96"] Mar 17 01:16:06 crc kubenswrapper[4755]: I0317 01:16:06.263466 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0df91bec-d970-450f-8a65-d1d1d929eff9" path="/var/lib/kubelet/pods/0df91bec-d970-450f-8a65-d1d1d929eff9/volumes" Mar 17 01:16:11 crc kubenswrapper[4755]: I0317 01:16:11.248707 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:16:11 crc kubenswrapper[4755]: E0317 01:16:11.249989 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:16:21 crc kubenswrapper[4755]: I0317 01:16:21.290480 4755 generic.go:334] "Generic (PLEG): container finished" podID="1b83398a-b089-4a14-9432-5154d7cd107c" containerID="1959128af248650571387c6b06ae38a94eb40f697971ec98155a27c7c7720a2d" exitCode=0 Mar 17 01:16:21 crc kubenswrapper[4755]: I0317 01:16:21.290640 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" event={"ID":"1b83398a-b089-4a14-9432-5154d7cd107c","Type":"ContainerDied","Data":"1959128af248650571387c6b06ae38a94eb40f697971ec98155a27c7c7720a2d"} Mar 17 01:16:22 crc kubenswrapper[4755]: I0317 01:16:22.799924 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" Mar 17 01:16:22 crc kubenswrapper[4755]: I0317 01:16:22.891296 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-inventory\") pod \"1b83398a-b089-4a14-9432-5154d7cd107c\" (UID: \"1b83398a-b089-4a14-9432-5154d7cd107c\") " Mar 17 01:16:22 crc kubenswrapper[4755]: I0317 01:16:22.891397 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vstkq\" (UniqueName: \"kubernetes.io/projected/1b83398a-b089-4a14-9432-5154d7cd107c-kube-api-access-vstkq\") pod \"1b83398a-b089-4a14-9432-5154d7cd107c\" (UID: \"1b83398a-b089-4a14-9432-5154d7cd107c\") " Mar 17 01:16:22 crc kubenswrapper[4755]: I0317 01:16:22.891458 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-ssh-key-openstack-edpm-ipam\") pod \"1b83398a-b089-4a14-9432-5154d7cd107c\" (UID: \"1b83398a-b089-4a14-9432-5154d7cd107c\") " Mar 17 01:16:22 crc kubenswrapper[4755]: I0317 01:16:22.891500 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-ceph\") pod \"1b83398a-b089-4a14-9432-5154d7cd107c\" (UID: \"1b83398a-b089-4a14-9432-5154d7cd107c\") " Mar 17 01:16:22 crc kubenswrapper[4755]: I0317 01:16:22.898222 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-ceph" (OuterVolumeSpecName: "ceph") pod "1b83398a-b089-4a14-9432-5154d7cd107c" (UID: "1b83398a-b089-4a14-9432-5154d7cd107c"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:16:22 crc kubenswrapper[4755]: I0317 01:16:22.898413 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b83398a-b089-4a14-9432-5154d7cd107c-kube-api-access-vstkq" (OuterVolumeSpecName: "kube-api-access-vstkq") pod "1b83398a-b089-4a14-9432-5154d7cd107c" (UID: "1b83398a-b089-4a14-9432-5154d7cd107c"). InnerVolumeSpecName "kube-api-access-vstkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:16:22 crc kubenswrapper[4755]: I0317 01:16:22.939816 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1b83398a-b089-4a14-9432-5154d7cd107c" (UID: "1b83398a-b089-4a14-9432-5154d7cd107c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:16:22 crc kubenswrapper[4755]: I0317 01:16:22.945599 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-inventory" (OuterVolumeSpecName: "inventory") pod "1b83398a-b089-4a14-9432-5154d7cd107c" (UID: "1b83398a-b089-4a14-9432-5154d7cd107c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:16:22 crc kubenswrapper[4755]: I0317 01:16:22.994304 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:16:22 crc kubenswrapper[4755]: I0317 01:16:22.994573 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:16:22 crc kubenswrapper[4755]: I0317 01:16:22.994675 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b83398a-b089-4a14-9432-5154d7cd107c-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:16:22 crc kubenswrapper[4755]: I0317 01:16:22.994762 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vstkq\" (UniqueName: \"kubernetes.io/projected/1b83398a-b089-4a14-9432-5154d7cd107c-kube-api-access-vstkq\") on node \"crc\" DevicePath \"\"" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.314196 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" event={"ID":"1b83398a-b089-4a14-9432-5154d7cd107c","Type":"ContainerDied","Data":"09334367496a532bcd8cf7e0c15c5efc4c83c08abbc6d6b0ee3a52c68945ccf5"} Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.314239 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09334367496a532bcd8cf7e0c15c5efc4c83c08abbc6d6b0ee3a52c68945ccf5" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.314240 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.421705 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p"] Mar 17 01:16:23 crc kubenswrapper[4755]: E0317 01:16:23.422312 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90071ff4-8824-4645-9881-3ff6157ac1f9" containerName="oc" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.422333 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="90071ff4-8824-4645-9881-3ff6157ac1f9" containerName="oc" Mar 17 01:16:23 crc kubenswrapper[4755]: E0317 01:16:23.422363 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b83398a-b089-4a14-9432-5154d7cd107c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.422373 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b83398a-b089-4a14-9432-5154d7cd107c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.422601 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="90071ff4-8824-4645-9881-3ff6157ac1f9" containerName="oc" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.422642 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b83398a-b089-4a14-9432-5154d7cd107c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.423614 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.425916 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.426037 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.427350 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.428623 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.429804 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.454207 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p"] Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.505195 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcm7j\" (UniqueName: \"kubernetes.io/projected/d9b4d7d9-daed-448e-b3a8-4f528207e319-kube-api-access-lcm7j\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p\" (UID: \"d9b4d7d9-daed-448e-b3a8-4f528207e319\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.505348 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-ssh-key-openstack-edpm-ipam\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p\" (UID: \"d9b4d7d9-daed-448e-b3a8-4f528207e319\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.505517 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p\" (UID: \"d9b4d7d9-daed-448e-b3a8-4f528207e319\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.505571 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p\" (UID: \"d9b4d7d9-daed-448e-b3a8-4f528207e319\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.607976 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p\" (UID: \"d9b4d7d9-daed-448e-b3a8-4f528207e319\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.608364 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p\" (UID: \"d9b4d7d9-daed-448e-b3a8-4f528207e319\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 
01:16:23.608459 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcm7j\" (UniqueName: \"kubernetes.io/projected/d9b4d7d9-daed-448e-b3a8-4f528207e319-kube-api-access-lcm7j\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p\" (UID: \"d9b4d7d9-daed-448e-b3a8-4f528207e319\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.608612 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p\" (UID: \"d9b4d7d9-daed-448e-b3a8-4f528207e319\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.612998 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p\" (UID: \"d9b4d7d9-daed-448e-b3a8-4f528207e319\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.614147 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p\" (UID: \"d9b4d7d9-daed-448e-b3a8-4f528207e319\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.614214 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p\" (UID: \"d9b4d7d9-daed-448e-b3a8-4f528207e319\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.631129 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcm7j\" (UniqueName: \"kubernetes.io/projected/d9b4d7d9-daed-448e-b3a8-4f528207e319-kube-api-access-lcm7j\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p\" (UID: \"d9b4d7d9-daed-448e-b3a8-4f528207e319\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" Mar 17 01:16:23 crc kubenswrapper[4755]: I0317 01:16:23.742134 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" Mar 17 01:16:24 crc kubenswrapper[4755]: I0317 01:16:24.329314 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p"] Mar 17 01:16:25 crc kubenswrapper[4755]: I0317 01:16:25.345009 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" event={"ID":"d9b4d7d9-daed-448e-b3a8-4f528207e319","Type":"ContainerStarted","Data":"bd140227c77562fdc660f8737aa11ba1128192a8784a3f24655a263592b9eb42"} Mar 17 01:16:25 crc kubenswrapper[4755]: I0317 01:16:25.345631 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" event={"ID":"d9b4d7d9-daed-448e-b3a8-4f528207e319","Type":"ContainerStarted","Data":"9c27ef8e6debe74e377c62271afe7b5827681d81c5653e56493edc3c05d0c350"} Mar 17 01:16:25 crc kubenswrapper[4755]: I0317 01:16:25.379249 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" podStartSLOduration=1.878935225 podStartE2EDuration="2.379226231s" podCreationTimestamp="2026-03-17 01:16:23 +0000 UTC" firstStartedPulling="2026-03-17 01:16:24.326625783 +0000 UTC m=+3259.086078106" lastFinishedPulling="2026-03-17 01:16:24.826916829 +0000 UTC m=+3259.586369112" observedRunningTime="2026-03-17 01:16:25.36773694 +0000 UTC m=+3260.127189233" watchObservedRunningTime="2026-03-17 01:16:25.379226231 +0000 UTC m=+3260.138678514" Mar 17 01:16:26 crc kubenswrapper[4755]: I0317 01:16:26.254259 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:16:26 crc kubenswrapper[4755]: E0317 01:16:26.254911 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:16:31 crc kubenswrapper[4755]: I0317 01:16:31.414252 4755 generic.go:334] "Generic (PLEG): container finished" podID="d9b4d7d9-daed-448e-b3a8-4f528207e319" containerID="bd140227c77562fdc660f8737aa11ba1128192a8784a3f24655a263592b9eb42" exitCode=0 Mar 17 01:16:31 crc kubenswrapper[4755]: I0317 01:16:31.414360 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" event={"ID":"d9b4d7d9-daed-448e-b3a8-4f528207e319","Type":"ContainerDied","Data":"bd140227c77562fdc660f8737aa11ba1128192a8784a3f24655a263592b9eb42"} Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.004304 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.192602 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-inventory\") pod \"d9b4d7d9-daed-448e-b3a8-4f528207e319\" (UID: \"d9b4d7d9-daed-448e-b3a8-4f528207e319\") " Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.193744 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-ssh-key-openstack-edpm-ipam\") pod \"d9b4d7d9-daed-448e-b3a8-4f528207e319\" (UID: \"d9b4d7d9-daed-448e-b3a8-4f528207e319\") " Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.194144 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcm7j\" (UniqueName: \"kubernetes.io/projected/d9b4d7d9-daed-448e-b3a8-4f528207e319-kube-api-access-lcm7j\") pod \"d9b4d7d9-daed-448e-b3a8-4f528207e319\" (UID: \"d9b4d7d9-daed-448e-b3a8-4f528207e319\") " Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.194185 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-ceph\") pod \"d9b4d7d9-daed-448e-b3a8-4f528207e319\" (UID: \"d9b4d7d9-daed-448e-b3a8-4f528207e319\") " Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.198782 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-ceph" (OuterVolumeSpecName: "ceph") pod "d9b4d7d9-daed-448e-b3a8-4f528207e319" (UID: "d9b4d7d9-daed-448e-b3a8-4f528207e319"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.200456 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b4d7d9-daed-448e-b3a8-4f528207e319-kube-api-access-lcm7j" (OuterVolumeSpecName: "kube-api-access-lcm7j") pod "d9b4d7d9-daed-448e-b3a8-4f528207e319" (UID: "d9b4d7d9-daed-448e-b3a8-4f528207e319"). InnerVolumeSpecName "kube-api-access-lcm7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.221212 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d9b4d7d9-daed-448e-b3a8-4f528207e319" (UID: "d9b4d7d9-daed-448e-b3a8-4f528207e319"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.223929 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-inventory" (OuterVolumeSpecName: "inventory") pod "d9b4d7d9-daed-448e-b3a8-4f528207e319" (UID: "d9b4d7d9-daed-448e-b3a8-4f528207e319"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.297537 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcm7j\" (UniqueName: \"kubernetes.io/projected/d9b4d7d9-daed-448e-b3a8-4f528207e319-kube-api-access-lcm7j\") on node \"crc\" DevicePath \"\"" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.297580 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.297593 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.297606 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d9b4d7d9-daed-448e-b3a8-4f528207e319-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.436477 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" event={"ID":"d9b4d7d9-daed-448e-b3a8-4f528207e319","Type":"ContainerDied","Data":"9c27ef8e6debe74e377c62271afe7b5827681d81c5653e56493edc3c05d0c350"} Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.436513 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c27ef8e6debe74e377c62271afe7b5827681d81c5653e56493edc3c05d0c350" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.436563 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.531655 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r"] Mar 17 01:16:33 crc kubenswrapper[4755]: E0317 01:16:33.532158 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b4d7d9-daed-448e-b3a8-4f528207e319" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.532177 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b4d7d9-daed-448e-b3a8-4f528207e319" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.532357 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b4d7d9-daed-448e-b3a8-4f528207e319" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.533064 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.535471 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.536160 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.536227 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.536378 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.537567 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.539182 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r"] Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.704690 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bnr2r\" (UID: \"654af424-4add-4b0f-97a6-896204b03483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.704940 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czpbc\" (UniqueName: \"kubernetes.io/projected/654af424-4add-4b0f-97a6-896204b03483-kube-api-access-czpbc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bnr2r\" (UID: 
\"654af424-4add-4b0f-97a6-896204b03483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.705034 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bnr2r\" (UID: \"654af424-4add-4b0f-97a6-896204b03483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.705360 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bnr2r\" (UID: \"654af424-4add-4b0f-97a6-896204b03483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.807308 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bnr2r\" (UID: \"654af424-4add-4b0f-97a6-896204b03483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.807411 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bnr2r\" (UID: \"654af424-4add-4b0f-97a6-896204b03483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.807515 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czpbc\" (UniqueName: \"kubernetes.io/projected/654af424-4add-4b0f-97a6-896204b03483-kube-api-access-czpbc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bnr2r\" (UID: \"654af424-4add-4b0f-97a6-896204b03483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.807549 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bnr2r\" (UID: \"654af424-4add-4b0f-97a6-896204b03483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.811339 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bnr2r\" (UID: \"654af424-4add-4b0f-97a6-896204b03483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.811372 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bnr2r\" (UID: \"654af424-4add-4b0f-97a6-896204b03483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.811417 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bnr2r\" (UID: \"654af424-4add-4b0f-97a6-896204b03483\") 
" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.833637 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czpbc\" (UniqueName: \"kubernetes.io/projected/654af424-4add-4b0f-97a6-896204b03483-kube-api-access-czpbc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bnr2r\" (UID: \"654af424-4add-4b0f-97a6-896204b03483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" Mar 17 01:16:33 crc kubenswrapper[4755]: I0317 01:16:33.900762 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" Mar 17 01:16:34 crc kubenswrapper[4755]: I0317 01:16:34.484840 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r"] Mar 17 01:16:35 crc kubenswrapper[4755]: I0317 01:16:35.468848 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" event={"ID":"654af424-4add-4b0f-97a6-896204b03483","Type":"ContainerStarted","Data":"47ecf7b21ffee827f7b54bd60877ed9abb356badb7bfd73aecf289c23452549c"} Mar 17 01:16:35 crc kubenswrapper[4755]: I0317 01:16:35.469294 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" event={"ID":"654af424-4add-4b0f-97a6-896204b03483","Type":"ContainerStarted","Data":"7e9bc17f13f43553d984f6e909f546a6f7599af88b9f3c8caf62119cfdebff7e"} Mar 17 01:16:35 crc kubenswrapper[4755]: I0317 01:16:35.492367 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" podStartSLOduration=2.023959589 podStartE2EDuration="2.492344533s" podCreationTimestamp="2026-03-17 01:16:33 +0000 UTC" firstStartedPulling="2026-03-17 01:16:34.485534953 +0000 UTC m=+3269.244987236" 
lastFinishedPulling="2026-03-17 01:16:34.953919887 +0000 UTC m=+3269.713372180" observedRunningTime="2026-03-17 01:16:35.48558376 +0000 UTC m=+3270.245036053" watchObservedRunningTime="2026-03-17 01:16:35.492344533 +0000 UTC m=+3270.251796816" Mar 17 01:16:37 crc kubenswrapper[4755]: I0317 01:16:37.249074 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:16:37 crc kubenswrapper[4755]: E0317 01:16:37.250090 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:16:50 crc kubenswrapper[4755]: I0317 01:16:50.255380 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:16:50 crc kubenswrapper[4755]: E0317 01:16:50.256423 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:17:01 crc kubenswrapper[4755]: I0317 01:17:01.247941 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:17:01 crc kubenswrapper[4755]: E0317 01:17:01.248822 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:17:02 crc kubenswrapper[4755]: I0317 01:17:02.601372 4755 scope.go:117] "RemoveContainer" containerID="a6fd1ddf7ca55120cf3035be92c6afc4c791aa3e057a137211317642bedcbead" Mar 17 01:17:15 crc kubenswrapper[4755]: I0317 01:17:15.248092 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:17:15 crc kubenswrapper[4755]: E0317 01:17:15.249054 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:17:21 crc kubenswrapper[4755]: I0317 01:17:21.037622 4755 generic.go:334] "Generic (PLEG): container finished" podID="654af424-4add-4b0f-97a6-896204b03483" containerID="47ecf7b21ffee827f7b54bd60877ed9abb356badb7bfd73aecf289c23452549c" exitCode=0 Mar 17 01:17:21 crc kubenswrapper[4755]: I0317 01:17:21.037735 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" event={"ID":"654af424-4add-4b0f-97a6-896204b03483","Type":"ContainerDied","Data":"47ecf7b21ffee827f7b54bd60877ed9abb356badb7bfd73aecf289c23452549c"} Mar 17 01:17:22 crc kubenswrapper[4755]: I0317 01:17:22.620236 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" Mar 17 01:17:22 crc kubenswrapper[4755]: I0317 01:17:22.719934 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-ssh-key-openstack-edpm-ipam\") pod \"654af424-4add-4b0f-97a6-896204b03483\" (UID: \"654af424-4add-4b0f-97a6-896204b03483\") " Mar 17 01:17:22 crc kubenswrapper[4755]: I0317 01:17:22.720155 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-ceph\") pod \"654af424-4add-4b0f-97a6-896204b03483\" (UID: \"654af424-4add-4b0f-97a6-896204b03483\") " Mar 17 01:17:22 crc kubenswrapper[4755]: I0317 01:17:22.720261 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czpbc\" (UniqueName: \"kubernetes.io/projected/654af424-4add-4b0f-97a6-896204b03483-kube-api-access-czpbc\") pod \"654af424-4add-4b0f-97a6-896204b03483\" (UID: \"654af424-4add-4b0f-97a6-896204b03483\") " Mar 17 01:17:22 crc kubenswrapper[4755]: I0317 01:17:22.725632 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-ceph" (OuterVolumeSpecName: "ceph") pod "654af424-4add-4b0f-97a6-896204b03483" (UID: "654af424-4add-4b0f-97a6-896204b03483"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:17:22 crc kubenswrapper[4755]: I0317 01:17:22.728267 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/654af424-4add-4b0f-97a6-896204b03483-kube-api-access-czpbc" (OuterVolumeSpecName: "kube-api-access-czpbc") pod "654af424-4add-4b0f-97a6-896204b03483" (UID: "654af424-4add-4b0f-97a6-896204b03483"). InnerVolumeSpecName "kube-api-access-czpbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:17:22 crc kubenswrapper[4755]: I0317 01:17:22.757998 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "654af424-4add-4b0f-97a6-896204b03483" (UID: "654af424-4add-4b0f-97a6-896204b03483"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:17:22 crc kubenswrapper[4755]: I0317 01:17:22.821688 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-inventory\") pod \"654af424-4add-4b0f-97a6-896204b03483\" (UID: \"654af424-4add-4b0f-97a6-896204b03483\") " Mar 17 01:17:22 crc kubenswrapper[4755]: I0317 01:17:22.822989 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czpbc\" (UniqueName: \"kubernetes.io/projected/654af424-4add-4b0f-97a6-896204b03483-kube-api-access-czpbc\") on node \"crc\" DevicePath \"\"" Mar 17 01:17:22 crc kubenswrapper[4755]: I0317 01:17:22.823038 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:17:22 crc kubenswrapper[4755]: I0317 01:17:22.823064 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:17:22 crc kubenswrapper[4755]: I0317 01:17:22.867917 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-inventory" (OuterVolumeSpecName: "inventory") pod "654af424-4add-4b0f-97a6-896204b03483" (UID: 
"654af424-4add-4b0f-97a6-896204b03483"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:17:22 crc kubenswrapper[4755]: I0317 01:17:22.927988 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/654af424-4add-4b0f-97a6-896204b03483-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.075875 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" event={"ID":"654af424-4add-4b0f-97a6-896204b03483","Type":"ContainerDied","Data":"7e9bc17f13f43553d984f6e909f546a6f7599af88b9f3c8caf62119cfdebff7e"} Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.075922 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bnr2r" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.075946 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e9bc17f13f43553d984f6e909f546a6f7599af88b9f3c8caf62119cfdebff7e" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.208128 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9"] Mar 17 01:17:23 crc kubenswrapper[4755]: E0317 01:17:23.208878 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="654af424-4add-4b0f-97a6-896204b03483" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.208907 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="654af424-4add-4b0f-97a6-896204b03483" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.209333 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="654af424-4add-4b0f-97a6-896204b03483" 
containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.210562 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.212569 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.212787 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.213533 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.213648 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.213848 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.218198 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9"] Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.337139 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k42sv\" (UniqueName: \"kubernetes.io/projected/cb099246-365d-4bd7-ad54-f765ffc586cd-kube-api-access-k42sv\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9\" (UID: \"cb099246-365d-4bd7-ad54-f765ffc586cd\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.337247 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9\" (UID: \"cb099246-365d-4bd7-ad54-f765ffc586cd\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.337338 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9\" (UID: \"cb099246-365d-4bd7-ad54-f765ffc586cd\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.337431 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9\" (UID: \"cb099246-365d-4bd7-ad54-f765ffc586cd\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.440108 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9\" (UID: \"cb099246-365d-4bd7-ad54-f765ffc586cd\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.440499 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9\" (UID: 
\"cb099246-365d-4bd7-ad54-f765ffc586cd\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.440719 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k42sv\" (UniqueName: \"kubernetes.io/projected/cb099246-365d-4bd7-ad54-f765ffc586cd-kube-api-access-k42sv\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9\" (UID: \"cb099246-365d-4bd7-ad54-f765ffc586cd\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.441745 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9\" (UID: \"cb099246-365d-4bd7-ad54-f765ffc586cd\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.445112 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9\" (UID: \"cb099246-365d-4bd7-ad54-f765ffc586cd\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.447245 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9\" (UID: \"cb099246-365d-4bd7-ad54-f765ffc586cd\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.447600 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceph\" (UniqueName: \"kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9\" (UID: \"cb099246-365d-4bd7-ad54-f765ffc586cd\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.462169 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k42sv\" (UniqueName: \"kubernetes.io/projected/cb099246-365d-4bd7-ad54-f765ffc586cd-kube-api-access-k42sv\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9\" (UID: \"cb099246-365d-4bd7-ad54-f765ffc586cd\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" Mar 17 01:17:23 crc kubenswrapper[4755]: I0317 01:17:23.538776 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" Mar 17 01:17:24 crc kubenswrapper[4755]: I0317 01:17:24.149342 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9"] Mar 17 01:17:25 crc kubenswrapper[4755]: I0317 01:17:25.109823 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" event={"ID":"cb099246-365d-4bd7-ad54-f765ffc586cd","Type":"ContainerStarted","Data":"9079a9fc13767a94c87b414ebd21d414cb26d167f60aead0f5c2f1c48dfd09df"} Mar 17 01:17:25 crc kubenswrapper[4755]: I0317 01:17:25.110108 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" event={"ID":"cb099246-365d-4bd7-ad54-f765ffc586cd","Type":"ContainerStarted","Data":"176feb35f28b49e6c4de0d08a95785fc54786c0274cbd250d07f0842935e2f17"} Mar 17 01:17:25 crc kubenswrapper[4755]: I0317 01:17:25.135044 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" podStartSLOduration=1.722601729 podStartE2EDuration="2.13502099s" podCreationTimestamp="2026-03-17 01:17:23 +0000 UTC" firstStartedPulling="2026-03-17 01:17:24.15821498 +0000 UTC m=+3318.917667283" lastFinishedPulling="2026-03-17 01:17:24.570634271 +0000 UTC m=+3319.330086544" observedRunningTime="2026-03-17 01:17:25.130733784 +0000 UTC m=+3319.890186157" watchObservedRunningTime="2026-03-17 01:17:25.13502099 +0000 UTC m=+3319.894473273" Mar 17 01:17:28 crc kubenswrapper[4755]: I0317 01:17:28.248340 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:17:28 crc kubenswrapper[4755]: E0317 01:17:28.249345 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:17:30 crc kubenswrapper[4755]: I0317 01:17:30.175361 4755 generic.go:334] "Generic (PLEG): container finished" podID="cb099246-365d-4bd7-ad54-f765ffc586cd" containerID="9079a9fc13767a94c87b414ebd21d414cb26d167f60aead0f5c2f1c48dfd09df" exitCode=0 Mar 17 01:17:30 crc kubenswrapper[4755]: I0317 01:17:30.175497 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" event={"ID":"cb099246-365d-4bd7-ad54-f765ffc586cd","Type":"ContainerDied","Data":"9079a9fc13767a94c87b414ebd21d414cb26d167f60aead0f5c2f1c48dfd09df"} Mar 17 01:17:31 crc kubenswrapper[4755]: I0317 01:17:31.712361 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" Mar 17 01:17:31 crc kubenswrapper[4755]: I0317 01:17:31.834719 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-ceph\") pod \"cb099246-365d-4bd7-ad54-f765ffc586cd\" (UID: \"cb099246-365d-4bd7-ad54-f765ffc586cd\") " Mar 17 01:17:31 crc kubenswrapper[4755]: I0317 01:17:31.835076 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k42sv\" (UniqueName: \"kubernetes.io/projected/cb099246-365d-4bd7-ad54-f765ffc586cd-kube-api-access-k42sv\") pod \"cb099246-365d-4bd7-ad54-f765ffc586cd\" (UID: \"cb099246-365d-4bd7-ad54-f765ffc586cd\") " Mar 17 01:17:31 crc kubenswrapper[4755]: I0317 01:17:31.835142 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-ssh-key-openstack-edpm-ipam\") pod \"cb099246-365d-4bd7-ad54-f765ffc586cd\" (UID: \"cb099246-365d-4bd7-ad54-f765ffc586cd\") " Mar 17 01:17:31 crc kubenswrapper[4755]: I0317 01:17:31.835237 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-inventory\") pod \"cb099246-365d-4bd7-ad54-f765ffc586cd\" (UID: \"cb099246-365d-4bd7-ad54-f765ffc586cd\") " Mar 17 01:17:31 crc kubenswrapper[4755]: I0317 01:17:31.840413 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-ceph" (OuterVolumeSpecName: "ceph") pod "cb099246-365d-4bd7-ad54-f765ffc586cd" (UID: "cb099246-365d-4bd7-ad54-f765ffc586cd"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:17:31 crc kubenswrapper[4755]: I0317 01:17:31.842634 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb099246-365d-4bd7-ad54-f765ffc586cd-kube-api-access-k42sv" (OuterVolumeSpecName: "kube-api-access-k42sv") pod "cb099246-365d-4bd7-ad54-f765ffc586cd" (UID: "cb099246-365d-4bd7-ad54-f765ffc586cd"). InnerVolumeSpecName "kube-api-access-k42sv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:17:31 crc kubenswrapper[4755]: I0317 01:17:31.865406 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cb099246-365d-4bd7-ad54-f765ffc586cd" (UID: "cb099246-365d-4bd7-ad54-f765ffc586cd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:17:31 crc kubenswrapper[4755]: I0317 01:17:31.873058 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-inventory" (OuterVolumeSpecName: "inventory") pod "cb099246-365d-4bd7-ad54-f765ffc586cd" (UID: "cb099246-365d-4bd7-ad54-f765ffc586cd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:17:31 crc kubenswrapper[4755]: I0317 01:17:31.938260 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:17:31 crc kubenswrapper[4755]: I0317 01:17:31.938303 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k42sv\" (UniqueName: \"kubernetes.io/projected/cb099246-365d-4bd7-ad54-f765ffc586cd-kube-api-access-k42sv\") on node \"crc\" DevicePath \"\"" Mar 17 01:17:31 crc kubenswrapper[4755]: I0317 01:17:31.938317 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:17:31 crc kubenswrapper[4755]: I0317 01:17:31.938329 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb099246-365d-4bd7-ad54-f765ffc586cd-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.203399 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" event={"ID":"cb099246-365d-4bd7-ad54-f765ffc586cd","Type":"ContainerDied","Data":"176feb35f28b49e6c4de0d08a95785fc54786c0274cbd250d07f0842935e2f17"} Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.203508 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.203516 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="176feb35f28b49e6c4de0d08a95785fc54786c0274cbd250d07f0842935e2f17" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.319073 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2"] Mar 17 01:17:32 crc kubenswrapper[4755]: E0317 01:17:32.319786 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb099246-365d-4bd7-ad54-f765ffc586cd" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.319819 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb099246-365d-4bd7-ad54-f765ffc586cd" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.320222 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb099246-365d-4bd7-ad54-f765ffc586cd" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.321411 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.324782 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.324923 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.325010 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.325056 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.325233 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.333227 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2"] Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.453169 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2\" (UID: \"351689ec-5f29-4144-ab28-25abac57ccac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.453241 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2\" (UID: 
\"351689ec-5f29-4144-ab28-25abac57ccac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.453470 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2\" (UID: \"351689ec-5f29-4144-ab28-25abac57ccac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.453894 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwgbc\" (UniqueName: \"kubernetes.io/projected/351689ec-5f29-4144-ab28-25abac57ccac-kube-api-access-mwgbc\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2\" (UID: \"351689ec-5f29-4144-ab28-25abac57ccac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.555892 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2\" (UID: \"351689ec-5f29-4144-ab28-25abac57ccac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.555946 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2\" (UID: \"351689ec-5f29-4144-ab28-25abac57ccac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.555979 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2\" (UID: \"351689ec-5f29-4144-ab28-25abac57ccac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.556080 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwgbc\" (UniqueName: \"kubernetes.io/projected/351689ec-5f29-4144-ab28-25abac57ccac-kube-api-access-mwgbc\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2\" (UID: \"351689ec-5f29-4144-ab28-25abac57ccac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.561017 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2\" (UID: \"351689ec-5f29-4144-ab28-25abac57ccac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.562854 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2\" (UID: \"351689ec-5f29-4144-ab28-25abac57ccac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.563270 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2\" (UID: 
\"351689ec-5f29-4144-ab28-25abac57ccac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.578143 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwgbc\" (UniqueName: \"kubernetes.io/projected/351689ec-5f29-4144-ab28-25abac57ccac-kube-api-access-mwgbc\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2\" (UID: \"351689ec-5f29-4144-ab28-25abac57ccac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" Mar 17 01:17:32 crc kubenswrapper[4755]: I0317 01:17:32.652140 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" Mar 17 01:17:33 crc kubenswrapper[4755]: I0317 01:17:33.217183 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2"] Mar 17 01:17:33 crc kubenswrapper[4755]: W0317 01:17:33.221723 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod351689ec_5f29_4144_ab28_25abac57ccac.slice/crio-b7c849e36a84fbe9584c3865dfcbb36da6c86f9e85892b03d4755349ec1dc1e9 WatchSource:0}: Error finding container b7c849e36a84fbe9584c3865dfcbb36da6c86f9e85892b03d4755349ec1dc1e9: Status 404 returned error can't find the container with id b7c849e36a84fbe9584c3865dfcbb36da6c86f9e85892b03d4755349ec1dc1e9 Mar 17 01:17:34 crc kubenswrapper[4755]: I0317 01:17:34.231247 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" event={"ID":"351689ec-5f29-4144-ab28-25abac57ccac","Type":"ContainerStarted","Data":"a81003f72ae7f759f41cc81c28d34910bd7f448fcbf5460182f59a4579dcc6b6"} Mar 17 01:17:34 crc kubenswrapper[4755]: I0317 01:17:34.232078 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" event={"ID":"351689ec-5f29-4144-ab28-25abac57ccac","Type":"ContainerStarted","Data":"b7c849e36a84fbe9584c3865dfcbb36da6c86f9e85892b03d4755349ec1dc1e9"} Mar 17 01:17:34 crc kubenswrapper[4755]: I0317 01:17:34.269909 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" podStartSLOduration=1.725378531 podStartE2EDuration="2.269880973s" podCreationTimestamp="2026-03-17 01:17:32 +0000 UTC" firstStartedPulling="2026-03-17 01:17:33.225214829 +0000 UTC m=+3327.984667122" lastFinishedPulling="2026-03-17 01:17:33.769717281 +0000 UTC m=+3328.529169564" observedRunningTime="2026-03-17 01:17:34.251521477 +0000 UTC m=+3329.010973790" watchObservedRunningTime="2026-03-17 01:17:34.269880973 +0000 UTC m=+3329.029333296" Mar 17 01:17:40 crc kubenswrapper[4755]: I0317 01:17:40.248793 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:17:40 crc kubenswrapper[4755]: E0317 01:17:40.249945 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:17:55 crc kubenswrapper[4755]: I0317 01:17:55.247830 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:17:55 crc kubenswrapper[4755]: E0317 01:17:55.248713 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:18:00 crc kubenswrapper[4755]: I0317 01:18:00.167602 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561838-89zzc"] Mar 17 01:18:00 crc kubenswrapper[4755]: I0317 01:18:00.170762 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561838-89zzc" Mar 17 01:18:00 crc kubenswrapper[4755]: I0317 01:18:00.175811 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:18:00 crc kubenswrapper[4755]: I0317 01:18:00.176007 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:18:00 crc kubenswrapper[4755]: I0317 01:18:00.176162 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:18:00 crc kubenswrapper[4755]: I0317 01:18:00.181953 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561838-89zzc"] Mar 17 01:18:00 crc kubenswrapper[4755]: I0317 01:18:00.238237 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4cps\" (UniqueName: \"kubernetes.io/projected/a64208ce-d13d-4744-827f-c5ab7b9ffc6e-kube-api-access-x4cps\") pod \"auto-csr-approver-29561838-89zzc\" (UID: \"a64208ce-d13d-4744-827f-c5ab7b9ffc6e\") " pod="openshift-infra/auto-csr-approver-29561838-89zzc" Mar 17 01:18:00 crc kubenswrapper[4755]: I0317 01:18:00.339204 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4cps\" (UniqueName: \"kubernetes.io/projected/a64208ce-d13d-4744-827f-c5ab7b9ffc6e-kube-api-access-x4cps\") 
pod \"auto-csr-approver-29561838-89zzc\" (UID: \"a64208ce-d13d-4744-827f-c5ab7b9ffc6e\") " pod="openshift-infra/auto-csr-approver-29561838-89zzc" Mar 17 01:18:00 crc kubenswrapper[4755]: I0317 01:18:00.361385 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4cps\" (UniqueName: \"kubernetes.io/projected/a64208ce-d13d-4744-827f-c5ab7b9ffc6e-kube-api-access-x4cps\") pod \"auto-csr-approver-29561838-89zzc\" (UID: \"a64208ce-d13d-4744-827f-c5ab7b9ffc6e\") " pod="openshift-infra/auto-csr-approver-29561838-89zzc" Mar 17 01:18:00 crc kubenswrapper[4755]: I0317 01:18:00.510502 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561838-89zzc" Mar 17 01:18:00 crc kubenswrapper[4755]: I0317 01:18:00.991720 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561838-89zzc"] Mar 17 01:18:01 crc kubenswrapper[4755]: W0317 01:18:01.008430 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda64208ce_d13d_4744_827f_c5ab7b9ffc6e.slice/crio-bda12b8cfefc33050e0cbd9a41c0dcb3dc8135a76a6de27c1068231c44591ffa WatchSource:0}: Error finding container bda12b8cfefc33050e0cbd9a41c0dcb3dc8135a76a6de27c1068231c44591ffa: Status 404 returned error can't find the container with id bda12b8cfefc33050e0cbd9a41c0dcb3dc8135a76a6de27c1068231c44591ffa Mar 17 01:18:01 crc kubenswrapper[4755]: I0317 01:18:01.561752 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561838-89zzc" event={"ID":"a64208ce-d13d-4744-827f-c5ab7b9ffc6e","Type":"ContainerStarted","Data":"bda12b8cfefc33050e0cbd9a41c0dcb3dc8135a76a6de27c1068231c44591ffa"} Mar 17 01:18:02 crc kubenswrapper[4755]: I0317 01:18:02.572156 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561838-89zzc" 
event={"ID":"a64208ce-d13d-4744-827f-c5ab7b9ffc6e","Type":"ContainerStarted","Data":"8b63c777b499d558baa5feae5f355acb96d1435d3267b7c21d51c8d16fb4c1bc"} Mar 17 01:18:02 crc kubenswrapper[4755]: I0317 01:18:02.594975 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561838-89zzc" podStartSLOduration=1.6279741159999999 podStartE2EDuration="2.59495766s" podCreationTimestamp="2026-03-17 01:18:00 +0000 UTC" firstStartedPulling="2026-03-17 01:18:01.020324397 +0000 UTC m=+3355.779776680" lastFinishedPulling="2026-03-17 01:18:01.987307901 +0000 UTC m=+3356.746760224" observedRunningTime="2026-03-17 01:18:02.58646392 +0000 UTC m=+3357.345916213" watchObservedRunningTime="2026-03-17 01:18:02.59495766 +0000 UTC m=+3357.354409943" Mar 17 01:18:03 crc kubenswrapper[4755]: I0317 01:18:03.586096 4755 generic.go:334] "Generic (PLEG): container finished" podID="a64208ce-d13d-4744-827f-c5ab7b9ffc6e" containerID="8b63c777b499d558baa5feae5f355acb96d1435d3267b7c21d51c8d16fb4c1bc" exitCode=0 Mar 17 01:18:03 crc kubenswrapper[4755]: I0317 01:18:03.586157 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561838-89zzc" event={"ID":"a64208ce-d13d-4744-827f-c5ab7b9ffc6e","Type":"ContainerDied","Data":"8b63c777b499d558baa5feae5f355acb96d1435d3267b7c21d51c8d16fb4c1bc"} Mar 17 01:18:05 crc kubenswrapper[4755]: I0317 01:18:05.052617 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561838-89zzc" Mar 17 01:18:05 crc kubenswrapper[4755]: I0317 01:18:05.149378 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4cps\" (UniqueName: \"kubernetes.io/projected/a64208ce-d13d-4744-827f-c5ab7b9ffc6e-kube-api-access-x4cps\") pod \"a64208ce-d13d-4744-827f-c5ab7b9ffc6e\" (UID: \"a64208ce-d13d-4744-827f-c5ab7b9ffc6e\") " Mar 17 01:18:05 crc kubenswrapper[4755]: I0317 01:18:05.159182 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a64208ce-d13d-4744-827f-c5ab7b9ffc6e-kube-api-access-x4cps" (OuterVolumeSpecName: "kube-api-access-x4cps") pod "a64208ce-d13d-4744-827f-c5ab7b9ffc6e" (UID: "a64208ce-d13d-4744-827f-c5ab7b9ffc6e"). InnerVolumeSpecName "kube-api-access-x4cps". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:18:05 crc kubenswrapper[4755]: I0317 01:18:05.253973 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4cps\" (UniqueName: \"kubernetes.io/projected/a64208ce-d13d-4744-827f-c5ab7b9ffc6e-kube-api-access-x4cps\") on node \"crc\" DevicePath \"\"" Mar 17 01:18:05 crc kubenswrapper[4755]: I0317 01:18:05.612240 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561838-89zzc" event={"ID":"a64208ce-d13d-4744-827f-c5ab7b9ffc6e","Type":"ContainerDied","Data":"bda12b8cfefc33050e0cbd9a41c0dcb3dc8135a76a6de27c1068231c44591ffa"} Mar 17 01:18:05 crc kubenswrapper[4755]: I0317 01:18:05.612379 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561838-89zzc" Mar 17 01:18:05 crc kubenswrapper[4755]: I0317 01:18:05.612410 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bda12b8cfefc33050e0cbd9a41c0dcb3dc8135a76a6de27c1068231c44591ffa" Mar 17 01:18:05 crc kubenswrapper[4755]: I0317 01:18:05.693910 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561832-vd84x"] Mar 17 01:18:05 crc kubenswrapper[4755]: I0317 01:18:05.705791 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561832-vd84x"] Mar 17 01:18:06 crc kubenswrapper[4755]: I0317 01:18:06.269957 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e1dcd54-6ffe-43bb-8676-fc0b90687777" path="/var/lib/kubelet/pods/5e1dcd54-6ffe-43bb-8676-fc0b90687777/volumes" Mar 17 01:18:09 crc kubenswrapper[4755]: I0317 01:18:09.250076 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:18:09 crc kubenswrapper[4755]: E0317 01:18:09.251093 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:18:23 crc kubenswrapper[4755]: I0317 01:18:23.248756 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:18:23 crc kubenswrapper[4755]: E0317 01:18:23.249769 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:18:28 crc kubenswrapper[4755]: I0317 01:18:28.892898 4755 generic.go:334] "Generic (PLEG): container finished" podID="351689ec-5f29-4144-ab28-25abac57ccac" containerID="a81003f72ae7f759f41cc81c28d34910bd7f448fcbf5460182f59a4579dcc6b6" exitCode=0 Mar 17 01:18:28 crc kubenswrapper[4755]: I0317 01:18:28.893025 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" event={"ID":"351689ec-5f29-4144-ab28-25abac57ccac","Type":"ContainerDied","Data":"a81003f72ae7f759f41cc81c28d34910bd7f448fcbf5460182f59a4579dcc6b6"} Mar 17 01:18:30 crc kubenswrapper[4755]: I0317 01:18:30.527455 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" Mar 17 01:18:30 crc kubenswrapper[4755]: I0317 01:18:30.682102 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-ssh-key-openstack-edpm-ipam\") pod \"351689ec-5f29-4144-ab28-25abac57ccac\" (UID: \"351689ec-5f29-4144-ab28-25abac57ccac\") " Mar 17 01:18:30 crc kubenswrapper[4755]: I0317 01:18:30.682231 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-ceph\") pod \"351689ec-5f29-4144-ab28-25abac57ccac\" (UID: \"351689ec-5f29-4144-ab28-25abac57ccac\") " Mar 17 01:18:30 crc kubenswrapper[4755]: I0317 01:18:30.682324 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwgbc\" (UniqueName: 
\"kubernetes.io/projected/351689ec-5f29-4144-ab28-25abac57ccac-kube-api-access-mwgbc\") pod \"351689ec-5f29-4144-ab28-25abac57ccac\" (UID: \"351689ec-5f29-4144-ab28-25abac57ccac\") " Mar 17 01:18:30 crc kubenswrapper[4755]: I0317 01:18:30.682354 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-inventory\") pod \"351689ec-5f29-4144-ab28-25abac57ccac\" (UID: \"351689ec-5f29-4144-ab28-25abac57ccac\") " Mar 17 01:18:30 crc kubenswrapper[4755]: I0317 01:18:30.690423 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/351689ec-5f29-4144-ab28-25abac57ccac-kube-api-access-mwgbc" (OuterVolumeSpecName: "kube-api-access-mwgbc") pod "351689ec-5f29-4144-ab28-25abac57ccac" (UID: "351689ec-5f29-4144-ab28-25abac57ccac"). InnerVolumeSpecName "kube-api-access-mwgbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:18:30 crc kubenswrapper[4755]: I0317 01:18:30.692652 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-ceph" (OuterVolumeSpecName: "ceph") pod "351689ec-5f29-4144-ab28-25abac57ccac" (UID: "351689ec-5f29-4144-ab28-25abac57ccac"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:18:30 crc kubenswrapper[4755]: I0317 01:18:30.726003 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "351689ec-5f29-4144-ab28-25abac57ccac" (UID: "351689ec-5f29-4144-ab28-25abac57ccac"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:18:30 crc kubenswrapper[4755]: I0317 01:18:30.746745 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-inventory" (OuterVolumeSpecName: "inventory") pod "351689ec-5f29-4144-ab28-25abac57ccac" (UID: "351689ec-5f29-4144-ab28-25abac57ccac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:18:30 crc kubenswrapper[4755]: I0317 01:18:30.786799 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:18:30 crc kubenswrapper[4755]: I0317 01:18:30.786867 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwgbc\" (UniqueName: \"kubernetes.io/projected/351689ec-5f29-4144-ab28-25abac57ccac-kube-api-access-mwgbc\") on node \"crc\" DevicePath \"\"" Mar 17 01:18:30 crc kubenswrapper[4755]: I0317 01:18:30.786895 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:18:30 crc kubenswrapper[4755]: I0317 01:18:30.786917 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/351689ec-5f29-4144-ab28-25abac57ccac-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:18:30 crc kubenswrapper[4755]: I0317 01:18:30.925847 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" event={"ID":"351689ec-5f29-4144-ab28-25abac57ccac","Type":"ContainerDied","Data":"b7c849e36a84fbe9584c3865dfcbb36da6c86f9e85892b03d4755349ec1dc1e9"} Mar 17 01:18:30 crc kubenswrapper[4755]: I0317 01:18:30.925927 4755 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="b7c849e36a84fbe9584c3865dfcbb36da6c86f9e85892b03d4755349ec1dc1e9" Mar 17 01:18:30 crc kubenswrapper[4755]: I0317 01:18:30.925937 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.058224 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dbdqx"] Mar 17 01:18:31 crc kubenswrapper[4755]: E0317 01:18:31.058763 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64208ce-d13d-4744-827f-c5ab7b9ffc6e" containerName="oc" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.058783 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64208ce-d13d-4744-827f-c5ab7b9ffc6e" containerName="oc" Mar 17 01:18:31 crc kubenswrapper[4755]: E0317 01:18:31.058807 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="351689ec-5f29-4144-ab28-25abac57ccac" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.058816 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="351689ec-5f29-4144-ab28-25abac57ccac" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.059057 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a64208ce-d13d-4744-827f-c5ab7b9ffc6e" containerName="oc" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.059092 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="351689ec-5f29-4144-ab28-25abac57ccac" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.059942 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.063031 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.063420 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.063751 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.065404 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.065830 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.082545 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dbdqx"] Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.198244 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-ceph\") pod \"ssh-known-hosts-edpm-deployment-dbdqx\" (UID: \"96c8e866-f764-4b94-b980-7b007ba5411c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.198683 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dbdqx\" (UID: \"96c8e866-f764-4b94-b980-7b007ba5411c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 
01:18:31.198714 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dbdqx\" (UID: \"96c8e866-f764-4b94-b980-7b007ba5411c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.198773 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksxht\" (UniqueName: \"kubernetes.io/projected/96c8e866-f764-4b94-b980-7b007ba5411c-kube-api-access-ksxht\") pod \"ssh-known-hosts-edpm-deployment-dbdqx\" (UID: \"96c8e866-f764-4b94-b980-7b007ba5411c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.302507 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-ceph\") pod \"ssh-known-hosts-edpm-deployment-dbdqx\" (UID: \"96c8e866-f764-4b94-b980-7b007ba5411c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.302846 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dbdqx\" (UID: \"96c8e866-f764-4b94-b980-7b007ba5411c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.302898 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dbdqx\" (UID: \"96c8e866-f764-4b94-b980-7b007ba5411c\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.302995 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksxht\" (UniqueName: \"kubernetes.io/projected/96c8e866-f764-4b94-b980-7b007ba5411c-kube-api-access-ksxht\") pod \"ssh-known-hosts-edpm-deployment-dbdqx\" (UID: \"96c8e866-f764-4b94-b980-7b007ba5411c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.310576 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-ceph\") pod \"ssh-known-hosts-edpm-deployment-dbdqx\" (UID: \"96c8e866-f764-4b94-b980-7b007ba5411c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.310858 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dbdqx\" (UID: \"96c8e866-f764-4b94-b980-7b007ba5411c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.314208 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dbdqx\" (UID: \"96c8e866-f764-4b94-b980-7b007ba5411c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.328740 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksxht\" (UniqueName: \"kubernetes.io/projected/96c8e866-f764-4b94-b980-7b007ba5411c-kube-api-access-ksxht\") pod \"ssh-known-hosts-edpm-deployment-dbdqx\" (UID: 
\"96c8e866-f764-4b94-b980-7b007ba5411c\") " pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.397867 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" Mar 17 01:18:31 crc kubenswrapper[4755]: I0317 01:18:31.997008 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dbdqx"] Mar 17 01:18:32 crc kubenswrapper[4755]: I0317 01:18:32.006086 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:18:32 crc kubenswrapper[4755]: I0317 01:18:32.961634 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" event={"ID":"96c8e866-f764-4b94-b980-7b007ba5411c","Type":"ContainerStarted","Data":"fd6e9de82abe45b5cd882ff1cb599620e80ac49ac26d47f9dc330b818ce38164"} Mar 17 01:18:32 crc kubenswrapper[4755]: I0317 01:18:32.962005 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" event={"ID":"96c8e866-f764-4b94-b980-7b007ba5411c","Type":"ContainerStarted","Data":"6484159d8adddd1784265bb1c57cff80045923e6cc4b963da86d40b015d25626"} Mar 17 01:18:33 crc kubenswrapper[4755]: I0317 01:18:33.015965 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" podStartSLOduration=1.488542563 podStartE2EDuration="2.015944461s" podCreationTimestamp="2026-03-17 01:18:31 +0000 UTC" firstStartedPulling="2026-03-17 01:18:32.005895034 +0000 UTC m=+3386.765347317" lastFinishedPulling="2026-03-17 01:18:32.533296932 +0000 UTC m=+3387.292749215" observedRunningTime="2026-03-17 01:18:32.988557521 +0000 UTC m=+3387.748009844" watchObservedRunningTime="2026-03-17 01:18:33.015944461 +0000 UTC m=+3387.775396744" Mar 17 01:18:36 crc kubenswrapper[4755]: I0317 01:18:36.262368 4755 scope.go:117] 
"RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:18:37 crc kubenswrapper[4755]: I0317 01:18:37.025915 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"003f35b0d8f532865a4a2bc8ebf824d038eb9d701bcbc727462822acff8c247c"} Mar 17 01:18:44 crc kubenswrapper[4755]: I0317 01:18:44.126993 4755 generic.go:334] "Generic (PLEG): container finished" podID="96c8e866-f764-4b94-b980-7b007ba5411c" containerID="fd6e9de82abe45b5cd882ff1cb599620e80ac49ac26d47f9dc330b818ce38164" exitCode=0 Mar 17 01:18:44 crc kubenswrapper[4755]: I0317 01:18:44.127136 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" event={"ID":"96c8e866-f764-4b94-b980-7b007ba5411c","Type":"ContainerDied","Data":"fd6e9de82abe45b5cd882ff1cb599620e80ac49ac26d47f9dc330b818ce38164"} Mar 17 01:18:45 crc kubenswrapper[4755]: I0317 01:18:45.772006 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" Mar 17 01:18:45 crc kubenswrapper[4755]: I0317 01:18:45.879354 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksxht\" (UniqueName: \"kubernetes.io/projected/96c8e866-f764-4b94-b980-7b007ba5411c-kube-api-access-ksxht\") pod \"96c8e866-f764-4b94-b980-7b007ba5411c\" (UID: \"96c8e866-f764-4b94-b980-7b007ba5411c\") " Mar 17 01:18:45 crc kubenswrapper[4755]: I0317 01:18:45.879659 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-ssh-key-openstack-edpm-ipam\") pod \"96c8e866-f764-4b94-b980-7b007ba5411c\" (UID: \"96c8e866-f764-4b94-b980-7b007ba5411c\") " Mar 17 01:18:45 crc kubenswrapper[4755]: I0317 01:18:45.879912 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-ceph\") pod \"96c8e866-f764-4b94-b980-7b007ba5411c\" (UID: \"96c8e866-f764-4b94-b980-7b007ba5411c\") " Mar 17 01:18:45 crc kubenswrapper[4755]: I0317 01:18:45.879982 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-inventory-0\") pod \"96c8e866-f764-4b94-b980-7b007ba5411c\" (UID: \"96c8e866-f764-4b94-b980-7b007ba5411c\") " Mar 17 01:18:45 crc kubenswrapper[4755]: I0317 01:18:45.889967 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-ceph" (OuterVolumeSpecName: "ceph") pod "96c8e866-f764-4b94-b980-7b007ba5411c" (UID: "96c8e866-f764-4b94-b980-7b007ba5411c"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:18:45 crc kubenswrapper[4755]: I0317 01:18:45.890048 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96c8e866-f764-4b94-b980-7b007ba5411c-kube-api-access-ksxht" (OuterVolumeSpecName: "kube-api-access-ksxht") pod "96c8e866-f764-4b94-b980-7b007ba5411c" (UID: "96c8e866-f764-4b94-b980-7b007ba5411c"). InnerVolumeSpecName "kube-api-access-ksxht". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:18:45 crc kubenswrapper[4755]: I0317 01:18:45.919753 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "96c8e866-f764-4b94-b980-7b007ba5411c" (UID: "96c8e866-f764-4b94-b980-7b007ba5411c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:18:45 crc kubenswrapper[4755]: I0317 01:18:45.928682 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "96c8e866-f764-4b94-b980-7b007ba5411c" (UID: "96c8e866-f764-4b94-b980-7b007ba5411c"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:18:45 crc kubenswrapper[4755]: I0317 01:18:45.983760 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:18:45 crc kubenswrapper[4755]: I0317 01:18:45.983863 4755 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:18:45 crc kubenswrapper[4755]: I0317 01:18:45.983884 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksxht\" (UniqueName: \"kubernetes.io/projected/96c8e866-f764-4b94-b980-7b007ba5411c-kube-api-access-ksxht\") on node \"crc\" DevicePath \"\"" Mar 17 01:18:45 crc kubenswrapper[4755]: I0317 01:18:45.983940 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/96c8e866-f764-4b94-b980-7b007ba5411c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.160053 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" event={"ID":"96c8e866-f764-4b94-b980-7b007ba5411c","Type":"ContainerDied","Data":"6484159d8adddd1784265bb1c57cff80045923e6cc4b963da86d40b015d25626"} Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.160597 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6484159d8adddd1784265bb1c57cff80045923e6cc4b963da86d40b015d25626" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.160132 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dbdqx" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.287946 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg"] Mar 17 01:18:46 crc kubenswrapper[4755]: E0317 01:18:46.288548 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c8e866-f764-4b94-b980-7b007ba5411c" containerName="ssh-known-hosts-edpm-deployment" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.288576 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c8e866-f764-4b94-b980-7b007ba5411c" containerName="ssh-known-hosts-edpm-deployment" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.288927 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="96c8e866-f764-4b94-b980-7b007ba5411c" containerName="ssh-known-hosts-edpm-deployment" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.289715 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg"] Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.289802 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.295347 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.295429 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.295495 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.295953 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.296545 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvptp\" (UniqueName: \"kubernetes.io/projected/d3df52cf-6c5b-4e10-b055-d00d52e09156-kube-api-access-tvptp\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dkvkg\" (UID: \"d3df52cf-6c5b-4e10-b055-d00d52e09156\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.296610 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dkvkg\" (UID: \"d3df52cf-6c5b-4e10-b055-d00d52e09156\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.296641 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-dkvkg\" (UID: \"d3df52cf-6c5b-4e10-b055-d00d52e09156\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.297078 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.299411 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dkvkg\" (UID: \"d3df52cf-6c5b-4e10-b055-d00d52e09156\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.401362 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dkvkg\" (UID: \"d3df52cf-6c5b-4e10-b055-d00d52e09156\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.401470 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvptp\" (UniqueName: \"kubernetes.io/projected/d3df52cf-6c5b-4e10-b055-d00d52e09156-kube-api-access-tvptp\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dkvkg\" (UID: \"d3df52cf-6c5b-4e10-b055-d00d52e09156\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.401514 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dkvkg\" (UID: \"d3df52cf-6c5b-4e10-b055-d00d52e09156\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.401545 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dkvkg\" (UID: \"d3df52cf-6c5b-4e10-b055-d00d52e09156\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.406216 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dkvkg\" (UID: \"d3df52cf-6c5b-4e10-b055-d00d52e09156\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.408946 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dkvkg\" (UID: \"d3df52cf-6c5b-4e10-b055-d00d52e09156\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.415542 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dkvkg\" (UID: \"d3df52cf-6c5b-4e10-b055-d00d52e09156\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.422481 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvptp\" (UniqueName: 
\"kubernetes.io/projected/d3df52cf-6c5b-4e10-b055-d00d52e09156-kube-api-access-tvptp\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dkvkg\" (UID: \"d3df52cf-6c5b-4e10-b055-d00d52e09156\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.624678 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.718485 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wgpws"] Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.721946 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wgpws" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.755565 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgpws"] Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.821851 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c8d15b8-425d-420e-917d-29e9eda77b9a-utilities\") pod \"redhat-marketplace-wgpws\" (UID: \"2c8d15b8-425d-420e-917d-29e9eda77b9a\") " pod="openshift-marketplace/redhat-marketplace-wgpws" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.821961 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vqv8\" (UniqueName: \"kubernetes.io/projected/2c8d15b8-425d-420e-917d-29e9eda77b9a-kube-api-access-2vqv8\") pod \"redhat-marketplace-wgpws\" (UID: \"2c8d15b8-425d-420e-917d-29e9eda77b9a\") " pod="openshift-marketplace/redhat-marketplace-wgpws" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.822010 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c8d15b8-425d-420e-917d-29e9eda77b9a-catalog-content\") pod \"redhat-marketplace-wgpws\" (UID: \"2c8d15b8-425d-420e-917d-29e9eda77b9a\") " pod="openshift-marketplace/redhat-marketplace-wgpws" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.922963 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c8d15b8-425d-420e-917d-29e9eda77b9a-utilities\") pod \"redhat-marketplace-wgpws\" (UID: \"2c8d15b8-425d-420e-917d-29e9eda77b9a\") " pod="openshift-marketplace/redhat-marketplace-wgpws" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.923057 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vqv8\" (UniqueName: \"kubernetes.io/projected/2c8d15b8-425d-420e-917d-29e9eda77b9a-kube-api-access-2vqv8\") pod \"redhat-marketplace-wgpws\" (UID: \"2c8d15b8-425d-420e-917d-29e9eda77b9a\") " pod="openshift-marketplace/redhat-marketplace-wgpws" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.923093 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c8d15b8-425d-420e-917d-29e9eda77b9a-catalog-content\") pod \"redhat-marketplace-wgpws\" (UID: \"2c8d15b8-425d-420e-917d-29e9eda77b9a\") " pod="openshift-marketplace/redhat-marketplace-wgpws" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.923570 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c8d15b8-425d-420e-917d-29e9eda77b9a-utilities\") pod \"redhat-marketplace-wgpws\" (UID: \"2c8d15b8-425d-420e-917d-29e9eda77b9a\") " pod="openshift-marketplace/redhat-marketplace-wgpws" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.923613 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2c8d15b8-425d-420e-917d-29e9eda77b9a-catalog-content\") pod \"redhat-marketplace-wgpws\" (UID: \"2c8d15b8-425d-420e-917d-29e9eda77b9a\") " pod="openshift-marketplace/redhat-marketplace-wgpws" Mar 17 01:18:46 crc kubenswrapper[4755]: I0317 01:18:46.948104 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vqv8\" (UniqueName: \"kubernetes.io/projected/2c8d15b8-425d-420e-917d-29e9eda77b9a-kube-api-access-2vqv8\") pod \"redhat-marketplace-wgpws\" (UID: \"2c8d15b8-425d-420e-917d-29e9eda77b9a\") " pod="openshift-marketplace/redhat-marketplace-wgpws" Mar 17 01:18:47 crc kubenswrapper[4755]: I0317 01:18:47.096049 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wgpws" Mar 17 01:18:47 crc kubenswrapper[4755]: I0317 01:18:47.280962 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg"] Mar 17 01:18:47 crc kubenswrapper[4755]: I0317 01:18:47.693900 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgpws"] Mar 17 01:18:47 crc kubenswrapper[4755]: W0317 01:18:47.703912 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c8d15b8_425d_420e_917d_29e9eda77b9a.slice/crio-53e73526110a9bee2f4b76d93246dc474e01769905b122037e77abf6af2d3e1d WatchSource:0}: Error finding container 53e73526110a9bee2f4b76d93246dc474e01769905b122037e77abf6af2d3e1d: Status 404 returned error can't find the container with id 53e73526110a9bee2f4b76d93246dc474e01769905b122037e77abf6af2d3e1d Mar 17 01:18:48 crc kubenswrapper[4755]: I0317 01:18:48.203881 4755 generic.go:334] "Generic (PLEG): container finished" podID="2c8d15b8-425d-420e-917d-29e9eda77b9a" containerID="b8422c8e9ce36d53f72e8820eac29d2835152df874c8a64e202b40ec08a33f2b" exitCode=0 Mar 17 01:18:48 crc 
kubenswrapper[4755]: I0317 01:18:48.203950 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgpws" event={"ID":"2c8d15b8-425d-420e-917d-29e9eda77b9a","Type":"ContainerDied","Data":"b8422c8e9ce36d53f72e8820eac29d2835152df874c8a64e202b40ec08a33f2b"} Mar 17 01:18:48 crc kubenswrapper[4755]: I0317 01:18:48.204228 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgpws" event={"ID":"2c8d15b8-425d-420e-917d-29e9eda77b9a","Type":"ContainerStarted","Data":"53e73526110a9bee2f4b76d93246dc474e01769905b122037e77abf6af2d3e1d"} Mar 17 01:18:48 crc kubenswrapper[4755]: I0317 01:18:48.207031 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" event={"ID":"d3df52cf-6c5b-4e10-b055-d00d52e09156","Type":"ContainerStarted","Data":"68706a08ae5318d03b54144d11e5a248c61791ed23281c11bcee1dbb6788e050"} Mar 17 01:18:48 crc kubenswrapper[4755]: I0317 01:18:48.207089 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" event={"ID":"d3df52cf-6c5b-4e10-b055-d00d52e09156","Type":"ContainerStarted","Data":"c6a2947d57b16c5c50310556706c45b83eb4c2cc4989ccca2b0b7cc8cc136164"} Mar 17 01:18:48 crc kubenswrapper[4755]: I0317 01:18:48.244618 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" podStartSLOduration=1.847927796 podStartE2EDuration="2.24459917s" podCreationTimestamp="2026-03-17 01:18:46 +0000 UTC" firstStartedPulling="2026-03-17 01:18:47.287064061 +0000 UTC m=+3402.046516344" lastFinishedPulling="2026-03-17 01:18:47.683735425 +0000 UTC m=+3402.443187718" observedRunningTime="2026-03-17 01:18:48.239509462 +0000 UTC m=+3402.998961745" watchObservedRunningTime="2026-03-17 01:18:48.24459917 +0000 UTC m=+3403.004051453" Mar 17 01:18:49 crc kubenswrapper[4755]: I0317 01:18:49.225031 
4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgpws" event={"ID":"2c8d15b8-425d-420e-917d-29e9eda77b9a","Type":"ContainerStarted","Data":"2bd01a48a839664e923fbb5353b2fcb57fe2b786e0f6f058b24befc93d3c6941"} Mar 17 01:18:49 crc kubenswrapper[4755]: I0317 01:18:49.307066 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4tzp9"] Mar 17 01:18:49 crc kubenswrapper[4755]: I0317 01:18:49.312629 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4tzp9" Mar 17 01:18:49 crc kubenswrapper[4755]: I0317 01:18:49.326109 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4tzp9"] Mar 17 01:18:49 crc kubenswrapper[4755]: I0317 01:18:49.383389 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62cdf227-16c3-485b-a6b5-481ff7e8a329-catalog-content\") pod \"certified-operators-4tzp9\" (UID: \"62cdf227-16c3-485b-a6b5-481ff7e8a329\") " pod="openshift-marketplace/certified-operators-4tzp9" Mar 17 01:18:49 crc kubenswrapper[4755]: I0317 01:18:49.383699 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62cdf227-16c3-485b-a6b5-481ff7e8a329-utilities\") pod \"certified-operators-4tzp9\" (UID: \"62cdf227-16c3-485b-a6b5-481ff7e8a329\") " pod="openshift-marketplace/certified-operators-4tzp9" Mar 17 01:18:49 crc kubenswrapper[4755]: I0317 01:18:49.384057 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpkjt\" (UniqueName: \"kubernetes.io/projected/62cdf227-16c3-485b-a6b5-481ff7e8a329-kube-api-access-lpkjt\") pod \"certified-operators-4tzp9\" (UID: \"62cdf227-16c3-485b-a6b5-481ff7e8a329\") " 
pod="openshift-marketplace/certified-operators-4tzp9" Mar 17 01:18:49 crc kubenswrapper[4755]: I0317 01:18:49.485647 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpkjt\" (UniqueName: \"kubernetes.io/projected/62cdf227-16c3-485b-a6b5-481ff7e8a329-kube-api-access-lpkjt\") pod \"certified-operators-4tzp9\" (UID: \"62cdf227-16c3-485b-a6b5-481ff7e8a329\") " pod="openshift-marketplace/certified-operators-4tzp9" Mar 17 01:18:49 crc kubenswrapper[4755]: I0317 01:18:49.485749 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62cdf227-16c3-485b-a6b5-481ff7e8a329-catalog-content\") pod \"certified-operators-4tzp9\" (UID: \"62cdf227-16c3-485b-a6b5-481ff7e8a329\") " pod="openshift-marketplace/certified-operators-4tzp9" Mar 17 01:18:49 crc kubenswrapper[4755]: I0317 01:18:49.485823 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62cdf227-16c3-485b-a6b5-481ff7e8a329-utilities\") pod \"certified-operators-4tzp9\" (UID: \"62cdf227-16c3-485b-a6b5-481ff7e8a329\") " pod="openshift-marketplace/certified-operators-4tzp9" Mar 17 01:18:49 crc kubenswrapper[4755]: I0317 01:18:49.486351 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62cdf227-16c3-485b-a6b5-481ff7e8a329-catalog-content\") pod \"certified-operators-4tzp9\" (UID: \"62cdf227-16c3-485b-a6b5-481ff7e8a329\") " pod="openshift-marketplace/certified-operators-4tzp9" Mar 17 01:18:49 crc kubenswrapper[4755]: I0317 01:18:49.486381 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62cdf227-16c3-485b-a6b5-481ff7e8a329-utilities\") pod \"certified-operators-4tzp9\" (UID: \"62cdf227-16c3-485b-a6b5-481ff7e8a329\") " 
pod="openshift-marketplace/certified-operators-4tzp9" Mar 17 01:18:49 crc kubenswrapper[4755]: I0317 01:18:49.531849 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpkjt\" (UniqueName: \"kubernetes.io/projected/62cdf227-16c3-485b-a6b5-481ff7e8a329-kube-api-access-lpkjt\") pod \"certified-operators-4tzp9\" (UID: \"62cdf227-16c3-485b-a6b5-481ff7e8a329\") " pod="openshift-marketplace/certified-operators-4tzp9" Mar 17 01:18:49 crc kubenswrapper[4755]: I0317 01:18:49.631006 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4tzp9" Mar 17 01:18:50 crc kubenswrapper[4755]: I0317 01:18:50.130571 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4tzp9"] Mar 17 01:18:50 crc kubenswrapper[4755]: I0317 01:18:50.236009 4755 generic.go:334] "Generic (PLEG): container finished" podID="2c8d15b8-425d-420e-917d-29e9eda77b9a" containerID="2bd01a48a839664e923fbb5353b2fcb57fe2b786e0f6f058b24befc93d3c6941" exitCode=0 Mar 17 01:18:50 crc kubenswrapper[4755]: I0317 01:18:50.236120 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgpws" event={"ID":"2c8d15b8-425d-420e-917d-29e9eda77b9a","Type":"ContainerDied","Data":"2bd01a48a839664e923fbb5353b2fcb57fe2b786e0f6f058b24befc93d3c6941"} Mar 17 01:18:50 crc kubenswrapper[4755]: I0317 01:18:50.237537 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tzp9" event={"ID":"62cdf227-16c3-485b-a6b5-481ff7e8a329","Type":"ContainerStarted","Data":"64f4d88fa2c1d2048db320dc0cd6ef17f57bffee5177b91383a568d6c7ea757a"} Mar 17 01:18:51 crc kubenswrapper[4755]: I0317 01:18:51.255112 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgpws" 
event={"ID":"2c8d15b8-425d-420e-917d-29e9eda77b9a","Type":"ContainerStarted","Data":"43922e618b1e4268e9034d254357d05d48a6f372c59fb5f2251b3de15061b840"} Mar 17 01:18:51 crc kubenswrapper[4755]: I0317 01:18:51.260130 4755 generic.go:334] "Generic (PLEG): container finished" podID="62cdf227-16c3-485b-a6b5-481ff7e8a329" containerID="eb29b834129abb2300360b0708e327e5df4bdf836dff4d3e37f5655ae95a97fa" exitCode=0 Mar 17 01:18:51 crc kubenswrapper[4755]: I0317 01:18:51.260181 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tzp9" event={"ID":"62cdf227-16c3-485b-a6b5-481ff7e8a329","Type":"ContainerDied","Data":"eb29b834129abb2300360b0708e327e5df4bdf836dff4d3e37f5655ae95a97fa"} Mar 17 01:18:51 crc kubenswrapper[4755]: I0317 01:18:51.286250 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wgpws" podStartSLOduration=2.8810407270000002 podStartE2EDuration="5.286229124s" podCreationTimestamp="2026-03-17 01:18:46 +0000 UTC" firstStartedPulling="2026-03-17 01:18:48.206313824 +0000 UTC m=+3402.965766107" lastFinishedPulling="2026-03-17 01:18:50.611502221 +0000 UTC m=+3405.370954504" observedRunningTime="2026-03-17 01:18:51.278637068 +0000 UTC m=+3406.038089391" watchObservedRunningTime="2026-03-17 01:18:51.286229124 +0000 UTC m=+3406.045681417" Mar 17 01:18:53 crc kubenswrapper[4755]: I0317 01:18:53.288271 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tzp9" event={"ID":"62cdf227-16c3-485b-a6b5-481ff7e8a329","Type":"ContainerStarted","Data":"6a01aff0ecbaa21f8f9bc0a047340f815fa08fa92267d0fcdff8cd5f74549f38"} Mar 17 01:18:55 crc kubenswrapper[4755]: I0317 01:18:55.313666 4755 generic.go:334] "Generic (PLEG): container finished" podID="62cdf227-16c3-485b-a6b5-481ff7e8a329" containerID="6a01aff0ecbaa21f8f9bc0a047340f815fa08fa92267d0fcdff8cd5f74549f38" exitCode=0 Mar 17 01:18:55 crc kubenswrapper[4755]: I0317 
01:18:55.313731 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tzp9" event={"ID":"62cdf227-16c3-485b-a6b5-481ff7e8a329","Type":"ContainerDied","Data":"6a01aff0ecbaa21f8f9bc0a047340f815fa08fa92267d0fcdff8cd5f74549f38"} Mar 17 01:18:56 crc kubenswrapper[4755]: I0317 01:18:56.332173 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tzp9" event={"ID":"62cdf227-16c3-485b-a6b5-481ff7e8a329","Type":"ContainerStarted","Data":"300bc847f24e5e584b0637b32286c8f9f61c4e9ff3b02099e134503be571febd"} Mar 17 01:18:56 crc kubenswrapper[4755]: I0317 01:18:56.371107 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4tzp9" podStartSLOduration=2.882110275 podStartE2EDuration="7.37107605s" podCreationTimestamp="2026-03-17 01:18:49 +0000 UTC" firstStartedPulling="2026-03-17 01:18:51.265640627 +0000 UTC m=+3406.025092950" lastFinishedPulling="2026-03-17 01:18:55.754606402 +0000 UTC m=+3410.514058725" observedRunningTime="2026-03-17 01:18:56.361178481 +0000 UTC m=+3411.120630774" watchObservedRunningTime="2026-03-17 01:18:56.37107605 +0000 UTC m=+3411.130528353" Mar 17 01:18:57 crc kubenswrapper[4755]: I0317 01:18:57.097201 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wgpws" Mar 17 01:18:57 crc kubenswrapper[4755]: I0317 01:18:57.097603 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wgpws" Mar 17 01:18:57 crc kubenswrapper[4755]: I0317 01:18:57.190238 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wgpws" Mar 17 01:18:57 crc kubenswrapper[4755]: I0317 01:18:57.346622 4755 generic.go:334] "Generic (PLEG): container finished" podID="d3df52cf-6c5b-4e10-b055-d00d52e09156" 
containerID="68706a08ae5318d03b54144d11e5a248c61791ed23281c11bcee1dbb6788e050" exitCode=0 Mar 17 01:18:57 crc kubenswrapper[4755]: I0317 01:18:57.346763 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" event={"ID":"d3df52cf-6c5b-4e10-b055-d00d52e09156","Type":"ContainerDied","Data":"68706a08ae5318d03b54144d11e5a248c61791ed23281c11bcee1dbb6788e050"} Mar 17 01:18:57 crc kubenswrapper[4755]: I0317 01:18:57.442275 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wgpws" Mar 17 01:18:58 crc kubenswrapper[4755]: I0317 01:18:58.844969 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.022050 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvptp\" (UniqueName: \"kubernetes.io/projected/d3df52cf-6c5b-4e10-b055-d00d52e09156-kube-api-access-tvptp\") pod \"d3df52cf-6c5b-4e10-b055-d00d52e09156\" (UID: \"d3df52cf-6c5b-4e10-b055-d00d52e09156\") " Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.022263 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-ceph\") pod \"d3df52cf-6c5b-4e10-b055-d00d52e09156\" (UID: \"d3df52cf-6c5b-4e10-b055-d00d52e09156\") " Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.023304 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-ssh-key-openstack-edpm-ipam\") pod \"d3df52cf-6c5b-4e10-b055-d00d52e09156\" (UID: \"d3df52cf-6c5b-4e10-b055-d00d52e09156\") " Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.023337 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-inventory\") pod \"d3df52cf-6c5b-4e10-b055-d00d52e09156\" (UID: \"d3df52cf-6c5b-4e10-b055-d00d52e09156\") " Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.030697 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-ceph" (OuterVolumeSpecName: "ceph") pod "d3df52cf-6c5b-4e10-b055-d00d52e09156" (UID: "d3df52cf-6c5b-4e10-b055-d00d52e09156"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.030956 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3df52cf-6c5b-4e10-b055-d00d52e09156-kube-api-access-tvptp" (OuterVolumeSpecName: "kube-api-access-tvptp") pod "d3df52cf-6c5b-4e10-b055-d00d52e09156" (UID: "d3df52cf-6c5b-4e10-b055-d00d52e09156"). InnerVolumeSpecName "kube-api-access-tvptp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.053598 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-inventory" (OuterVolumeSpecName: "inventory") pod "d3df52cf-6c5b-4e10-b055-d00d52e09156" (UID: "d3df52cf-6c5b-4e10-b055-d00d52e09156"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.059454 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d3df52cf-6c5b-4e10-b055-d00d52e09156" (UID: "d3df52cf-6c5b-4e10-b055-d00d52e09156"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.126121 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvptp\" (UniqueName: \"kubernetes.io/projected/d3df52cf-6c5b-4e10-b055-d00d52e09156-kube-api-access-tvptp\") on node \"crc\" DevicePath \"\"" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.126153 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.126163 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.126172 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3df52cf-6c5b-4e10-b055-d00d52e09156-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.292998 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgpws"] Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.376398 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.376418 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dkvkg" event={"ID":"d3df52cf-6c5b-4e10-b055-d00d52e09156","Type":"ContainerDied","Data":"c6a2947d57b16c5c50310556706c45b83eb4c2cc4989ccca2b0b7cc8cc136164"} Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.376518 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6a2947d57b16c5c50310556706c45b83eb4c2cc4989ccca2b0b7cc8cc136164" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.376678 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wgpws" podUID="2c8d15b8-425d-420e-917d-29e9eda77b9a" containerName="registry-server" containerID="cri-o://43922e618b1e4268e9034d254357d05d48a6f372c59fb5f2251b3de15061b840" gracePeriod=2 Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.499259 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w"] Mar 17 01:18:59 crc kubenswrapper[4755]: E0317 01:18:59.499878 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3df52cf-6c5b-4e10-b055-d00d52e09156" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.499902 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3df52cf-6c5b-4e10-b055-d00d52e09156" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.500370 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3df52cf-6c5b-4e10-b055-d00d52e09156" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.501498 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.506072 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.506363 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.506525 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.506986 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.507128 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.531936 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w"] Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.632075 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4tzp9" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.633217 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4tzp9" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.644469 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w\" (UID: \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" Mar 17 01:18:59 crc 
kubenswrapper[4755]: I0317 01:18:59.645296 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w\" (UID: \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.645711 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w\" (UID: \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.645916 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56cvm\" (UniqueName: \"kubernetes.io/projected/522fd7b5-ad67-4bb9-815e-239ab63e78c9-kube-api-access-56cvm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w\" (UID: \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.747898 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w\" (UID: \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.748215 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56cvm\" 
(UniqueName: \"kubernetes.io/projected/522fd7b5-ad67-4bb9-815e-239ab63e78c9-kube-api-access-56cvm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w\" (UID: \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.748279 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w\" (UID: \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.748338 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w\" (UID: \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.752030 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w\" (UID: \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.752082 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w\" (UID: \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" Mar 17 01:18:59 crc 
kubenswrapper[4755]: I0317 01:18:59.752468 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w\" (UID: \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.764512 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56cvm\" (UniqueName: \"kubernetes.io/projected/522fd7b5-ad67-4bb9-815e-239ab63e78c9-kube-api-access-56cvm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w\" (UID: \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" Mar 17 01:18:59 crc kubenswrapper[4755]: I0317 01:18:59.862878 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:18:59.999696 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wgpws" Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.089598 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c8d15b8-425d-420e-917d-29e9eda77b9a-catalog-content\") pod \"2c8d15b8-425d-420e-917d-29e9eda77b9a\" (UID: \"2c8d15b8-425d-420e-917d-29e9eda77b9a\") " Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.089773 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c8d15b8-425d-420e-917d-29e9eda77b9a-utilities\") pod \"2c8d15b8-425d-420e-917d-29e9eda77b9a\" (UID: \"2c8d15b8-425d-420e-917d-29e9eda77b9a\") " Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.089852 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vqv8\" (UniqueName: \"kubernetes.io/projected/2c8d15b8-425d-420e-917d-29e9eda77b9a-kube-api-access-2vqv8\") pod \"2c8d15b8-425d-420e-917d-29e9eda77b9a\" (UID: \"2c8d15b8-425d-420e-917d-29e9eda77b9a\") " Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.090683 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c8d15b8-425d-420e-917d-29e9eda77b9a-utilities" (OuterVolumeSpecName: "utilities") pod "2c8d15b8-425d-420e-917d-29e9eda77b9a" (UID: "2c8d15b8-425d-420e-917d-29e9eda77b9a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.095780 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c8d15b8-425d-420e-917d-29e9eda77b9a-kube-api-access-2vqv8" (OuterVolumeSpecName: "kube-api-access-2vqv8") pod "2c8d15b8-425d-420e-917d-29e9eda77b9a" (UID: "2c8d15b8-425d-420e-917d-29e9eda77b9a"). InnerVolumeSpecName "kube-api-access-2vqv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.120415 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c8d15b8-425d-420e-917d-29e9eda77b9a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c8d15b8-425d-420e-917d-29e9eda77b9a" (UID: "2c8d15b8-425d-420e-917d-29e9eda77b9a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.191988 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c8d15b8-425d-420e-917d-29e9eda77b9a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.192024 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c8d15b8-425d-420e-917d-29e9eda77b9a-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.192036 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vqv8\" (UniqueName: \"kubernetes.io/projected/2c8d15b8-425d-420e-917d-29e9eda77b9a-kube-api-access-2vqv8\") on node \"crc\" DevicePath \"\"" Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.391260 4755 generic.go:334] "Generic (PLEG): container finished" podID="2c8d15b8-425d-420e-917d-29e9eda77b9a" containerID="43922e618b1e4268e9034d254357d05d48a6f372c59fb5f2251b3de15061b840" exitCode=0 Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.391331 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgpws" event={"ID":"2c8d15b8-425d-420e-917d-29e9eda77b9a","Type":"ContainerDied","Data":"43922e618b1e4268e9034d254357d05d48a6f372c59fb5f2251b3de15061b840"} Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.391390 4755 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wgpws" Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.391418 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgpws" event={"ID":"2c8d15b8-425d-420e-917d-29e9eda77b9a","Type":"ContainerDied","Data":"53e73526110a9bee2f4b76d93246dc474e01769905b122037e77abf6af2d3e1d"} Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.391487 4755 scope.go:117] "RemoveContainer" containerID="43922e618b1e4268e9034d254357d05d48a6f372c59fb5f2251b3de15061b840" Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.436347 4755 scope.go:117] "RemoveContainer" containerID="2bd01a48a839664e923fbb5353b2fcb57fe2b786e0f6f058b24befc93d3c6941" Mar 17 01:19:00 crc kubenswrapper[4755]: W0317 01:19:00.443631 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod522fd7b5_ad67_4bb9_815e_239ab63e78c9.slice/crio-bf13f86073128f4f51011d160a4d6c6dcc33c64f6a2003f22933d049b74a4798 WatchSource:0}: Error finding container bf13f86073128f4f51011d160a4d6c6dcc33c64f6a2003f22933d049b74a4798: Status 404 returned error can't find the container with id bf13f86073128f4f51011d160a4d6c6dcc33c64f6a2003f22933d049b74a4798 Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.456531 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w"] Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.468001 4755 scope.go:117] "RemoveContainer" containerID="b8422c8e9ce36d53f72e8820eac29d2835152df874c8a64e202b40ec08a33f2b" Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.474583 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgpws"] Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.485491 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-wgpws"] Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.488558 4755 scope.go:117] "RemoveContainer" containerID="43922e618b1e4268e9034d254357d05d48a6f372c59fb5f2251b3de15061b840" Mar 17 01:19:00 crc kubenswrapper[4755]: E0317 01:19:00.489074 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43922e618b1e4268e9034d254357d05d48a6f372c59fb5f2251b3de15061b840\": container with ID starting with 43922e618b1e4268e9034d254357d05d48a6f372c59fb5f2251b3de15061b840 not found: ID does not exist" containerID="43922e618b1e4268e9034d254357d05d48a6f372c59fb5f2251b3de15061b840" Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.489117 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43922e618b1e4268e9034d254357d05d48a6f372c59fb5f2251b3de15061b840"} err="failed to get container status \"43922e618b1e4268e9034d254357d05d48a6f372c59fb5f2251b3de15061b840\": rpc error: code = NotFound desc = could not find container \"43922e618b1e4268e9034d254357d05d48a6f372c59fb5f2251b3de15061b840\": container with ID starting with 43922e618b1e4268e9034d254357d05d48a6f372c59fb5f2251b3de15061b840 not found: ID does not exist" Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.489146 4755 scope.go:117] "RemoveContainer" containerID="2bd01a48a839664e923fbb5353b2fcb57fe2b786e0f6f058b24befc93d3c6941" Mar 17 01:19:00 crc kubenswrapper[4755]: E0317 01:19:00.489573 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bd01a48a839664e923fbb5353b2fcb57fe2b786e0f6f058b24befc93d3c6941\": container with ID starting with 2bd01a48a839664e923fbb5353b2fcb57fe2b786e0f6f058b24befc93d3c6941 not found: ID does not exist" containerID="2bd01a48a839664e923fbb5353b2fcb57fe2b786e0f6f058b24befc93d3c6941" Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.489608 
4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bd01a48a839664e923fbb5353b2fcb57fe2b786e0f6f058b24befc93d3c6941"} err="failed to get container status \"2bd01a48a839664e923fbb5353b2fcb57fe2b786e0f6f058b24befc93d3c6941\": rpc error: code = NotFound desc = could not find container \"2bd01a48a839664e923fbb5353b2fcb57fe2b786e0f6f058b24befc93d3c6941\": container with ID starting with 2bd01a48a839664e923fbb5353b2fcb57fe2b786e0f6f058b24befc93d3c6941 not found: ID does not exist" Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.489651 4755 scope.go:117] "RemoveContainer" containerID="b8422c8e9ce36d53f72e8820eac29d2835152df874c8a64e202b40ec08a33f2b" Mar 17 01:19:00 crc kubenswrapper[4755]: E0317 01:19:00.489988 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8422c8e9ce36d53f72e8820eac29d2835152df874c8a64e202b40ec08a33f2b\": container with ID starting with b8422c8e9ce36d53f72e8820eac29d2835152df874c8a64e202b40ec08a33f2b not found: ID does not exist" containerID="b8422c8e9ce36d53f72e8820eac29d2835152df874c8a64e202b40ec08a33f2b" Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.490007 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8422c8e9ce36d53f72e8820eac29d2835152df874c8a64e202b40ec08a33f2b"} err="failed to get container status \"b8422c8e9ce36d53f72e8820eac29d2835152df874c8a64e202b40ec08a33f2b\": rpc error: code = NotFound desc = could not find container \"b8422c8e9ce36d53f72e8820eac29d2835152df874c8a64e202b40ec08a33f2b\": container with ID starting with b8422c8e9ce36d53f72e8820eac29d2835152df874c8a64e202b40ec08a33f2b not found: ID does not exist" Mar 17 01:19:00 crc kubenswrapper[4755]: I0317 01:19:00.679046 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-4tzp9" podUID="62cdf227-16c3-485b-a6b5-481ff7e8a329" 
containerName="registry-server" probeResult="failure" output=< Mar 17 01:19:00 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 01:19:00 crc kubenswrapper[4755]: > Mar 17 01:19:01 crc kubenswrapper[4755]: I0317 01:19:01.401784 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" event={"ID":"522fd7b5-ad67-4bb9-815e-239ab63e78c9","Type":"ContainerStarted","Data":"773b566f0dee772d8e47b0cb81b04fe3cbb4155fddb50cbd01803086c276218b"} Mar 17 01:19:01 crc kubenswrapper[4755]: I0317 01:19:01.402378 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" event={"ID":"522fd7b5-ad67-4bb9-815e-239ab63e78c9","Type":"ContainerStarted","Data":"bf13f86073128f4f51011d160a4d6c6dcc33c64f6a2003f22933d049b74a4798"} Mar 17 01:19:01 crc kubenswrapper[4755]: I0317 01:19:01.430761 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" podStartSLOduration=2.01453268 podStartE2EDuration="2.430740154s" podCreationTimestamp="2026-03-17 01:18:59 +0000 UTC" firstStartedPulling="2026-03-17 01:19:00.449507425 +0000 UTC m=+3415.208959708" lastFinishedPulling="2026-03-17 01:19:00.865714889 +0000 UTC m=+3415.625167182" observedRunningTime="2026-03-17 01:19:01.41870526 +0000 UTC m=+3416.178157553" watchObservedRunningTime="2026-03-17 01:19:01.430740154 +0000 UTC m=+3416.190192437" Mar 17 01:19:02 crc kubenswrapper[4755]: I0317 01:19:02.263495 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c8d15b8-425d-420e-917d-29e9eda77b9a" path="/var/lib/kubelet/pods/2c8d15b8-425d-420e-917d-29e9eda77b9a/volumes" Mar 17 01:19:02 crc kubenswrapper[4755]: I0317 01:19:02.729669 4755 scope.go:117] "RemoveContainer" containerID="7ff0428730410957083750a6674ff851d2d4e0e7fb5eaee1188b2154838b0dc8" Mar 17 01:19:09 crc kubenswrapper[4755]: I0317 
01:19:09.727525 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4tzp9" Mar 17 01:19:09 crc kubenswrapper[4755]: I0317 01:19:09.820980 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4tzp9" Mar 17 01:19:09 crc kubenswrapper[4755]: I0317 01:19:09.980774 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4tzp9"] Mar 17 01:19:11 crc kubenswrapper[4755]: I0317 01:19:11.528936 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4tzp9" podUID="62cdf227-16c3-485b-a6b5-481ff7e8a329" containerName="registry-server" containerID="cri-o://300bc847f24e5e584b0637b32286c8f9f61c4e9ff3b02099e134503be571febd" gracePeriod=2 Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.069490 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4tzp9" Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.192297 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpkjt\" (UniqueName: \"kubernetes.io/projected/62cdf227-16c3-485b-a6b5-481ff7e8a329-kube-api-access-lpkjt\") pod \"62cdf227-16c3-485b-a6b5-481ff7e8a329\" (UID: \"62cdf227-16c3-485b-a6b5-481ff7e8a329\") " Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.192543 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62cdf227-16c3-485b-a6b5-481ff7e8a329-catalog-content\") pod \"62cdf227-16c3-485b-a6b5-481ff7e8a329\" (UID: \"62cdf227-16c3-485b-a6b5-481ff7e8a329\") " Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.192598 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/62cdf227-16c3-485b-a6b5-481ff7e8a329-utilities\") pod \"62cdf227-16c3-485b-a6b5-481ff7e8a329\" (UID: \"62cdf227-16c3-485b-a6b5-481ff7e8a329\") " Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.193705 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62cdf227-16c3-485b-a6b5-481ff7e8a329-utilities" (OuterVolumeSpecName: "utilities") pod "62cdf227-16c3-485b-a6b5-481ff7e8a329" (UID: "62cdf227-16c3-485b-a6b5-481ff7e8a329"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.201985 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62cdf227-16c3-485b-a6b5-481ff7e8a329-kube-api-access-lpkjt" (OuterVolumeSpecName: "kube-api-access-lpkjt") pod "62cdf227-16c3-485b-a6b5-481ff7e8a329" (UID: "62cdf227-16c3-485b-a6b5-481ff7e8a329"). InnerVolumeSpecName "kube-api-access-lpkjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.253805 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62cdf227-16c3-485b-a6b5-481ff7e8a329-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62cdf227-16c3-485b-a6b5-481ff7e8a329" (UID: "62cdf227-16c3-485b-a6b5-481ff7e8a329"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.295095 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62cdf227-16c3-485b-a6b5-481ff7e8a329-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.295123 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62cdf227-16c3-485b-a6b5-481ff7e8a329-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.295134 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpkjt\" (UniqueName: \"kubernetes.io/projected/62cdf227-16c3-485b-a6b5-481ff7e8a329-kube-api-access-lpkjt\") on node \"crc\" DevicePath \"\"" Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.540905 4755 generic.go:334] "Generic (PLEG): container finished" podID="62cdf227-16c3-485b-a6b5-481ff7e8a329" containerID="300bc847f24e5e584b0637b32286c8f9f61c4e9ff3b02099e134503be571febd" exitCode=0 Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.540969 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4tzp9" Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.540967 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tzp9" event={"ID":"62cdf227-16c3-485b-a6b5-481ff7e8a329","Type":"ContainerDied","Data":"300bc847f24e5e584b0637b32286c8f9f61c4e9ff3b02099e134503be571febd"} Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.541149 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tzp9" event={"ID":"62cdf227-16c3-485b-a6b5-481ff7e8a329","Type":"ContainerDied","Data":"64f4d88fa2c1d2048db320dc0cd6ef17f57bffee5177b91383a568d6c7ea757a"} Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.541205 4755 scope.go:117] "RemoveContainer" containerID="300bc847f24e5e584b0637b32286c8f9f61c4e9ff3b02099e134503be571febd" Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.587082 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4tzp9"] Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.593512 4755 scope.go:117] "RemoveContainer" containerID="6a01aff0ecbaa21f8f9bc0a047340f815fa08fa92267d0fcdff8cd5f74549f38" Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.606340 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4tzp9"] Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.624216 4755 scope.go:117] "RemoveContainer" containerID="eb29b834129abb2300360b0708e327e5df4bdf836dff4d3e37f5655ae95a97fa" Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.688578 4755 scope.go:117] "RemoveContainer" containerID="300bc847f24e5e584b0637b32286c8f9f61c4e9ff3b02099e134503be571febd" Mar 17 01:19:12 crc kubenswrapper[4755]: E0317 01:19:12.689638 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"300bc847f24e5e584b0637b32286c8f9f61c4e9ff3b02099e134503be571febd\": container with ID starting with 300bc847f24e5e584b0637b32286c8f9f61c4e9ff3b02099e134503be571febd not found: ID does not exist" containerID="300bc847f24e5e584b0637b32286c8f9f61c4e9ff3b02099e134503be571febd" Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.689716 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"300bc847f24e5e584b0637b32286c8f9f61c4e9ff3b02099e134503be571febd"} err="failed to get container status \"300bc847f24e5e584b0637b32286c8f9f61c4e9ff3b02099e134503be571febd\": rpc error: code = NotFound desc = could not find container \"300bc847f24e5e584b0637b32286c8f9f61c4e9ff3b02099e134503be571febd\": container with ID starting with 300bc847f24e5e584b0637b32286c8f9f61c4e9ff3b02099e134503be571febd not found: ID does not exist" Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.689761 4755 scope.go:117] "RemoveContainer" containerID="6a01aff0ecbaa21f8f9bc0a047340f815fa08fa92267d0fcdff8cd5f74549f38" Mar 17 01:19:12 crc kubenswrapper[4755]: E0317 01:19:12.690318 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a01aff0ecbaa21f8f9bc0a047340f815fa08fa92267d0fcdff8cd5f74549f38\": container with ID starting with 6a01aff0ecbaa21f8f9bc0a047340f815fa08fa92267d0fcdff8cd5f74549f38 not found: ID does not exist" containerID="6a01aff0ecbaa21f8f9bc0a047340f815fa08fa92267d0fcdff8cd5f74549f38" Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.690368 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a01aff0ecbaa21f8f9bc0a047340f815fa08fa92267d0fcdff8cd5f74549f38"} err="failed to get container status \"6a01aff0ecbaa21f8f9bc0a047340f815fa08fa92267d0fcdff8cd5f74549f38\": rpc error: code = NotFound desc = could not find container \"6a01aff0ecbaa21f8f9bc0a047340f815fa08fa92267d0fcdff8cd5f74549f38\": container with ID 
starting with 6a01aff0ecbaa21f8f9bc0a047340f815fa08fa92267d0fcdff8cd5f74549f38 not found: ID does not exist" Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.690422 4755 scope.go:117] "RemoveContainer" containerID="eb29b834129abb2300360b0708e327e5df4bdf836dff4d3e37f5655ae95a97fa" Mar 17 01:19:12 crc kubenswrapper[4755]: E0317 01:19:12.690910 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb29b834129abb2300360b0708e327e5df4bdf836dff4d3e37f5655ae95a97fa\": container with ID starting with eb29b834129abb2300360b0708e327e5df4bdf836dff4d3e37f5655ae95a97fa not found: ID does not exist" containerID="eb29b834129abb2300360b0708e327e5df4bdf836dff4d3e37f5655ae95a97fa" Mar 17 01:19:12 crc kubenswrapper[4755]: I0317 01:19:12.690953 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb29b834129abb2300360b0708e327e5df4bdf836dff4d3e37f5655ae95a97fa"} err="failed to get container status \"eb29b834129abb2300360b0708e327e5df4bdf836dff4d3e37f5655ae95a97fa\": rpc error: code = NotFound desc = could not find container \"eb29b834129abb2300360b0708e327e5df4bdf836dff4d3e37f5655ae95a97fa\": container with ID starting with eb29b834129abb2300360b0708e327e5df4bdf836dff4d3e37f5655ae95a97fa not found: ID does not exist" Mar 17 01:19:13 crc kubenswrapper[4755]: I0317 01:19:13.555185 4755 generic.go:334] "Generic (PLEG): container finished" podID="522fd7b5-ad67-4bb9-815e-239ab63e78c9" containerID="773b566f0dee772d8e47b0cb81b04fe3cbb4155fddb50cbd01803086c276218b" exitCode=0 Mar 17 01:19:13 crc kubenswrapper[4755]: I0317 01:19:13.555316 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" event={"ID":"522fd7b5-ad67-4bb9-815e-239ab63e78c9","Type":"ContainerDied","Data":"773b566f0dee772d8e47b0cb81b04fe3cbb4155fddb50cbd01803086c276218b"} Mar 17 01:19:14 crc kubenswrapper[4755]: I0317 01:19:14.264047 
4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62cdf227-16c3-485b-a6b5-481ff7e8a329" path="/var/lib/kubelet/pods/62cdf227-16c3-485b-a6b5-481ff7e8a329/volumes" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.161696 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.267391 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56cvm\" (UniqueName: \"kubernetes.io/projected/522fd7b5-ad67-4bb9-815e-239ab63e78c9-kube-api-access-56cvm\") pod \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\" (UID: \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\") " Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.267623 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-ssh-key-openstack-edpm-ipam\") pod \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\" (UID: \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\") " Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.267665 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-ceph\") pod \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\" (UID: \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\") " Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.267714 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-inventory\") pod \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\" (UID: \"522fd7b5-ad67-4bb9-815e-239ab63e78c9\") " Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.273849 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-ceph" (OuterVolumeSpecName: "ceph") pod "522fd7b5-ad67-4bb9-815e-239ab63e78c9" (UID: "522fd7b5-ad67-4bb9-815e-239ab63e78c9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.277769 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/522fd7b5-ad67-4bb9-815e-239ab63e78c9-kube-api-access-56cvm" (OuterVolumeSpecName: "kube-api-access-56cvm") pod "522fd7b5-ad67-4bb9-815e-239ab63e78c9" (UID: "522fd7b5-ad67-4bb9-815e-239ab63e78c9"). InnerVolumeSpecName "kube-api-access-56cvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.303922 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "522fd7b5-ad67-4bb9-815e-239ab63e78c9" (UID: "522fd7b5-ad67-4bb9-815e-239ab63e78c9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.310711 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-inventory" (OuterVolumeSpecName: "inventory") pod "522fd7b5-ad67-4bb9-815e-239ab63e78c9" (UID: "522fd7b5-ad67-4bb9-815e-239ab63e78c9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.372067 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56cvm\" (UniqueName: \"kubernetes.io/projected/522fd7b5-ad67-4bb9-815e-239ab63e78c9-kube-api-access-56cvm\") on node \"crc\" DevicePath \"\"" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.372096 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.372106 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.372115 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/522fd7b5-ad67-4bb9-815e-239ab63e78c9-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.588166 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" event={"ID":"522fd7b5-ad67-4bb9-815e-239ab63e78c9","Type":"ContainerDied","Data":"bf13f86073128f4f51011d160a4d6c6dcc33c64f6a2003f22933d049b74a4798"} Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.588210 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf13f86073128f4f51011d160a4d6c6dcc33c64f6a2003f22933d049b74a4798" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.588256 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.809290 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz"] Mar 17 01:19:15 crc kubenswrapper[4755]: E0317 01:19:15.809770 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8d15b8-425d-420e-917d-29e9eda77b9a" containerName="extract-utilities" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.809795 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8d15b8-425d-420e-917d-29e9eda77b9a" containerName="extract-utilities" Mar 17 01:19:15 crc kubenswrapper[4755]: E0317 01:19:15.809830 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62cdf227-16c3-485b-a6b5-481ff7e8a329" containerName="registry-server" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.809839 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="62cdf227-16c3-485b-a6b5-481ff7e8a329" containerName="registry-server" Mar 17 01:19:15 crc kubenswrapper[4755]: E0317 01:19:15.809852 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8d15b8-425d-420e-917d-29e9eda77b9a" containerName="extract-content" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.809860 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8d15b8-425d-420e-917d-29e9eda77b9a" containerName="extract-content" Mar 17 01:19:15 crc kubenswrapper[4755]: E0317 01:19:15.809876 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62cdf227-16c3-485b-a6b5-481ff7e8a329" containerName="extract-content" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.809882 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="62cdf227-16c3-485b-a6b5-481ff7e8a329" containerName="extract-content" Mar 17 01:19:15 crc kubenswrapper[4755]: E0317 01:19:15.809895 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="62cdf227-16c3-485b-a6b5-481ff7e8a329" containerName="extract-utilities" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.809901 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="62cdf227-16c3-485b-a6b5-481ff7e8a329" containerName="extract-utilities" Mar 17 01:19:15 crc kubenswrapper[4755]: E0317 01:19:15.809914 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8d15b8-425d-420e-917d-29e9eda77b9a" containerName="registry-server" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.809921 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8d15b8-425d-420e-917d-29e9eda77b9a" containerName="registry-server" Mar 17 01:19:15 crc kubenswrapper[4755]: E0317 01:19:15.809959 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="522fd7b5-ad67-4bb9-815e-239ab63e78c9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.809969 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="522fd7b5-ad67-4bb9-815e-239ab63e78c9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.810179 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="522fd7b5-ad67-4bb9-815e-239ab63e78c9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.810204 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8d15b8-425d-420e-917d-29e9eda77b9a" containerName="registry-server" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.810216 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="62cdf227-16c3-485b-a6b5-481ff7e8a329" containerName="registry-server" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.811089 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.813599 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.814798 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.815149 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.815408 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.816313 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.816665 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.816766 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.816864 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.817048 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.817188 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 
01:19:15.834038 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz"] Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.984339 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.984555 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.984614 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.984640 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: 
\"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.984759 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.984783 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.984851 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.984879 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.985045 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.985128 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.985385 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.985542 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.985675 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.985731 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4scwt\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-kube-api-access-4scwt\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.985811 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.985875 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:15 crc kubenswrapper[4755]: I0317 01:19:15.985918 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.089240 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.090005 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.090145 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: 
\"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.090275 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.090361 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.090422 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.090620 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc 
kubenswrapper[4755]: I0317 01:19:16.090700 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.090805 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.090879 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.090952 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.090990 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.091141 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.091240 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.091488 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc 
kubenswrapper[4755]: I0317 01:19:16.091569 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4scwt\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-kube-api-access-4scwt\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.091688 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.102074 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.102592 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.104305 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.105068 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.105355 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.105711 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.106389 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.106879 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.107225 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.108268 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.108803 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: 
\"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.109782 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.110024 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.111795 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.113029 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.116847 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4scwt\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-kube-api-access-4scwt\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.125314 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.133746 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:19:16 crc kubenswrapper[4755]: I0317 01:19:16.868728 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz"] Mar 17 01:19:17 crc kubenswrapper[4755]: I0317 01:19:17.636134 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" event={"ID":"768f1228-6ea3-4601-a0e4-93911d1d4fa1","Type":"ContainerStarted","Data":"745a43f2038fd7f3cfa34d86327d98e2dbf528262d241ca4052980491d5384b5"} Mar 17 01:19:17 crc kubenswrapper[4755]: I0317 01:19:17.636893 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" event={"ID":"768f1228-6ea3-4601-a0e4-93911d1d4fa1","Type":"ContainerStarted","Data":"0f979987ef5813967c46f7c4e67b103f6a73e368ecaed5165bcadb8af01d87e1"} Mar 17 01:19:17 crc kubenswrapper[4755]: I0317 01:19:17.673674 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" podStartSLOduration=2.251087565 podStartE2EDuration="2.673643929s" podCreationTimestamp="2026-03-17 01:19:15 +0000 UTC" firstStartedPulling="2026-03-17 01:19:16.873179353 +0000 UTC m=+3431.632631656" lastFinishedPulling="2026-03-17 01:19:17.295735727 +0000 UTC m=+3432.055188020" observedRunningTime="2026-03-17 01:19:17.664987157 +0000 UTC m=+3432.424439540" watchObservedRunningTime="2026-03-17 01:19:17.673643929 +0000 UTC m=+3432.433096262" Mar 17 01:19:27 crc kubenswrapper[4755]: I0317 01:19:27.416709 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-5b9b5bb667-6pk7q" podUID="cfa93106-8e0c-4e7d-93cf-33d06c85d883" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 17 01:20:00 crc kubenswrapper[4755]: I0317 01:20:00.138998 
4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561840-ljcjf"] Mar 17 01:20:00 crc kubenswrapper[4755]: I0317 01:20:00.141098 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561840-ljcjf" Mar 17 01:20:00 crc kubenswrapper[4755]: I0317 01:20:00.142850 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:20:00 crc kubenswrapper[4755]: I0317 01:20:00.143117 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:20:00 crc kubenswrapper[4755]: I0317 01:20:00.143140 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:20:00 crc kubenswrapper[4755]: I0317 01:20:00.147553 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561840-ljcjf"] Mar 17 01:20:00 crc kubenswrapper[4755]: I0317 01:20:00.188629 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5xm4\" (UniqueName: \"kubernetes.io/projected/48f546fc-7c8e-42f5-b540-ac8597ae0e4e-kube-api-access-l5xm4\") pod \"auto-csr-approver-29561840-ljcjf\" (UID: \"48f546fc-7c8e-42f5-b540-ac8597ae0e4e\") " pod="openshift-infra/auto-csr-approver-29561840-ljcjf" Mar 17 01:20:00 crc kubenswrapper[4755]: I0317 01:20:00.290453 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5xm4\" (UniqueName: \"kubernetes.io/projected/48f546fc-7c8e-42f5-b540-ac8597ae0e4e-kube-api-access-l5xm4\") pod \"auto-csr-approver-29561840-ljcjf\" (UID: \"48f546fc-7c8e-42f5-b540-ac8597ae0e4e\") " pod="openshift-infra/auto-csr-approver-29561840-ljcjf" Mar 17 01:20:00 crc kubenswrapper[4755]: I0317 01:20:00.319237 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-l5xm4\" (UniqueName: \"kubernetes.io/projected/48f546fc-7c8e-42f5-b540-ac8597ae0e4e-kube-api-access-l5xm4\") pod \"auto-csr-approver-29561840-ljcjf\" (UID: \"48f546fc-7c8e-42f5-b540-ac8597ae0e4e\") " pod="openshift-infra/auto-csr-approver-29561840-ljcjf" Mar 17 01:20:00 crc kubenswrapper[4755]: I0317 01:20:00.461134 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561840-ljcjf" Mar 17 01:20:01 crc kubenswrapper[4755]: I0317 01:20:01.056023 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561840-ljcjf"] Mar 17 01:20:01 crc kubenswrapper[4755]: I0317 01:20:01.142941 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561840-ljcjf" event={"ID":"48f546fc-7c8e-42f5-b540-ac8597ae0e4e","Type":"ContainerStarted","Data":"3e3205cff2849d27285bc8921410210261ff06d4cc0cde06f0bfbd35d76f7722"} Mar 17 01:20:03 crc kubenswrapper[4755]: I0317 01:20:03.179200 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561840-ljcjf" event={"ID":"48f546fc-7c8e-42f5-b540-ac8597ae0e4e","Type":"ContainerStarted","Data":"afce1ae55580b885a4da8cb06b97ad257e835f964788876c444812bbb5bf958d"} Mar 17 01:20:03 crc kubenswrapper[4755]: I0317 01:20:03.213359 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561840-ljcjf" podStartSLOduration=1.526502551 podStartE2EDuration="3.213320868s" podCreationTimestamp="2026-03-17 01:20:00 +0000 UTC" firstStartedPulling="2026-03-17 01:20:01.053325264 +0000 UTC m=+3475.812777587" lastFinishedPulling="2026-03-17 01:20:02.740143621 +0000 UTC m=+3477.499595904" observedRunningTime="2026-03-17 01:20:03.203786604 +0000 UTC m=+3477.963238927" watchObservedRunningTime="2026-03-17 01:20:03.213320868 +0000 UTC m=+3477.972773161" Mar 17 01:20:04 crc kubenswrapper[4755]: I0317 01:20:04.192139 4755 
generic.go:334] "Generic (PLEG): container finished" podID="48f546fc-7c8e-42f5-b540-ac8597ae0e4e" containerID="afce1ae55580b885a4da8cb06b97ad257e835f964788876c444812bbb5bf958d" exitCode=0 Mar 17 01:20:04 crc kubenswrapper[4755]: I0317 01:20:04.192187 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561840-ljcjf" event={"ID":"48f546fc-7c8e-42f5-b540-ac8597ae0e4e","Type":"ContainerDied","Data":"afce1ae55580b885a4da8cb06b97ad257e835f964788876c444812bbb5bf958d"} Mar 17 01:20:05 crc kubenswrapper[4755]: I0317 01:20:05.654076 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561840-ljcjf" Mar 17 01:20:05 crc kubenswrapper[4755]: I0317 01:20:05.816647 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5xm4\" (UniqueName: \"kubernetes.io/projected/48f546fc-7c8e-42f5-b540-ac8597ae0e4e-kube-api-access-l5xm4\") pod \"48f546fc-7c8e-42f5-b540-ac8597ae0e4e\" (UID: \"48f546fc-7c8e-42f5-b540-ac8597ae0e4e\") " Mar 17 01:20:05 crc kubenswrapper[4755]: I0317 01:20:05.826668 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f546fc-7c8e-42f5-b540-ac8597ae0e4e-kube-api-access-l5xm4" (OuterVolumeSpecName: "kube-api-access-l5xm4") pod "48f546fc-7c8e-42f5-b540-ac8597ae0e4e" (UID: "48f546fc-7c8e-42f5-b540-ac8597ae0e4e"). InnerVolumeSpecName "kube-api-access-l5xm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:20:05 crc kubenswrapper[4755]: I0317 01:20:05.921340 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5xm4\" (UniqueName: \"kubernetes.io/projected/48f546fc-7c8e-42f5-b540-ac8597ae0e4e-kube-api-access-l5xm4\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:06 crc kubenswrapper[4755]: I0317 01:20:06.217195 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561840-ljcjf" event={"ID":"48f546fc-7c8e-42f5-b540-ac8597ae0e4e","Type":"ContainerDied","Data":"3e3205cff2849d27285bc8921410210261ff06d4cc0cde06f0bfbd35d76f7722"} Mar 17 01:20:06 crc kubenswrapper[4755]: I0317 01:20:06.217299 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e3205cff2849d27285bc8921410210261ff06d4cc0cde06f0bfbd35d76f7722" Mar 17 01:20:06 crc kubenswrapper[4755]: I0317 01:20:06.217396 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561840-ljcjf" Mar 17 01:20:06 crc kubenswrapper[4755]: I0317 01:20:06.307371 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561834-cxjnb"] Mar 17 01:20:06 crc kubenswrapper[4755]: I0317 01:20:06.317378 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561834-cxjnb"] Mar 17 01:20:08 crc kubenswrapper[4755]: I0317 01:20:08.266900 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b33bf75b-a4c4-4546-91d2-c9cc30d2b369" path="/var/lib/kubelet/pods/b33bf75b-a4c4-4546-91d2-c9cc30d2b369/volumes" Mar 17 01:20:16 crc kubenswrapper[4755]: I0317 01:20:16.342598 4755 generic.go:334] "Generic (PLEG): container finished" podID="768f1228-6ea3-4601-a0e4-93911d1d4fa1" containerID="745a43f2038fd7f3cfa34d86327d98e2dbf528262d241ca4052980491d5384b5" exitCode=0 Mar 17 01:20:16 crc kubenswrapper[4755]: I0317 01:20:16.342656 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" event={"ID":"768f1228-6ea3-4601-a0e4-93911d1d4fa1","Type":"ContainerDied","Data":"745a43f2038fd7f3cfa34d86327d98e2dbf528262d241ca4052980491d5384b5"} Mar 17 01:20:17 crc kubenswrapper[4755]: I0317 01:20:17.870655 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.010279 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-nova-combined-ca-bundle\") pod \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.010356 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ssh-key-openstack-edpm-ipam\") pod \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.010404 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-telemetry-combined-ca-bundle\") pod \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.010535 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ovn-combined-ca-bundle\") pod \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " Mar 17 01:20:18 crc 
kubenswrapper[4755]: I0317 01:20:18.010592 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-neutron-metadata-combined-ca-bundle\") pod \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.010639 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.010692 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-libvirt-combined-ca-bundle\") pod \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.010727 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-inventory\") pod \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.010767 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-bootstrap-combined-ca-bundle\") pod \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.010829 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-4scwt\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-kube-api-access-4scwt\") pod \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.010872 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-repo-setup-combined-ca-bundle\") pod \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.010956 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.010995 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.011057 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 
01:20:18.011099 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-telemetry-power-monitoring-combined-ca-bundle\") pod \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.011147 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-ovn-default-certs-0\") pod \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.011261 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ceph\") pod \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\" (UID: \"768f1228-6ea3-4601-a0e4-93911d1d4fa1\") " Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.017631 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "768f1228-6ea3-4601-a0e4-93911d1d4fa1" (UID: "768f1228-6ea3-4601-a0e4-93911d1d4fa1"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.018180 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "768f1228-6ea3-4601-a0e4-93911d1d4fa1" (UID: "768f1228-6ea3-4601-a0e4-93911d1d4fa1"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.018339 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "768f1228-6ea3-4601-a0e4-93911d1d4fa1" (UID: "768f1228-6ea3-4601-a0e4-93911d1d4fa1"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.019515 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "768f1228-6ea3-4601-a0e4-93911d1d4fa1" (UID: "768f1228-6ea3-4601-a0e4-93911d1d4fa1"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.019566 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "768f1228-6ea3-4601-a0e4-93911d1d4fa1" (UID: "768f1228-6ea3-4601-a0e4-93911d1d4fa1"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.020506 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ceph" (OuterVolumeSpecName: "ceph") pod "768f1228-6ea3-4601-a0e4-93911d1d4fa1" (UID: "768f1228-6ea3-4601-a0e4-93911d1d4fa1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.021875 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "768f1228-6ea3-4601-a0e4-93911d1d4fa1" (UID: "768f1228-6ea3-4601-a0e4-93911d1d4fa1"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.021952 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "768f1228-6ea3-4601-a0e4-93911d1d4fa1" (UID: "768f1228-6ea3-4601-a0e4-93911d1d4fa1"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.022585 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "768f1228-6ea3-4601-a0e4-93911d1d4fa1" (UID: "768f1228-6ea3-4601-a0e4-93911d1d4fa1"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.022635 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-kube-api-access-4scwt" (OuterVolumeSpecName: "kube-api-access-4scwt") pod "768f1228-6ea3-4601-a0e4-93911d1d4fa1" (UID: "768f1228-6ea3-4601-a0e4-93911d1d4fa1"). InnerVolumeSpecName "kube-api-access-4scwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.023909 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "768f1228-6ea3-4601-a0e4-93911d1d4fa1" (UID: "768f1228-6ea3-4601-a0e4-93911d1d4fa1"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.024714 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "768f1228-6ea3-4601-a0e4-93911d1d4fa1" (UID: "768f1228-6ea3-4601-a0e4-93911d1d4fa1"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.025481 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "768f1228-6ea3-4601-a0e4-93911d1d4fa1" (UID: "768f1228-6ea3-4601-a0e4-93911d1d4fa1"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.025705 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "768f1228-6ea3-4601-a0e4-93911d1d4fa1" (UID: "768f1228-6ea3-4601-a0e4-93911d1d4fa1"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.026746 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "768f1228-6ea3-4601-a0e4-93911d1d4fa1" (UID: "768f1228-6ea3-4601-a0e4-93911d1d4fa1"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.048397 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "768f1228-6ea3-4601-a0e4-93911d1d4fa1" (UID: "768f1228-6ea3-4601-a0e4-93911d1d4fa1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.062433 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-inventory" (OuterVolumeSpecName: "inventory") pod "768f1228-6ea3-4601-a0e4-93911d1d4fa1" (UID: "768f1228-6ea3-4601-a0e4-93911d1d4fa1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.114768 4755 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.115055 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.115074 4755 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.115089 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.115102 4755 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.115116 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4scwt\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-kube-api-access-4scwt\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.115129 4755 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.115143 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.115158 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.115176 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.115191 4755 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.115206 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/768f1228-6ea3-4601-a0e4-93911d1d4fa1-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.115218 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.115257 4755 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.115272 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.115284 4755 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.115298 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768f1228-6ea3-4601-a0e4-93911d1d4fa1-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.364814 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" event={"ID":"768f1228-6ea3-4601-a0e4-93911d1d4fa1","Type":"ContainerDied","Data":"0f979987ef5813967c46f7c4e67b103f6a73e368ecaed5165bcadb8af01d87e1"} Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.364867 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.364874 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f979987ef5813967c46f7c4e67b103f6a73e368ecaed5165bcadb8af01d87e1" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.474071 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b"] Mar 17 01:20:18 crc kubenswrapper[4755]: E0317 01:20:18.474618 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f546fc-7c8e-42f5-b540-ac8597ae0e4e" containerName="oc" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.474639 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f546fc-7c8e-42f5-b540-ac8597ae0e4e" containerName="oc" Mar 17 01:20:18 crc kubenswrapper[4755]: E0317 01:20:18.474657 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768f1228-6ea3-4601-a0e4-93911d1d4fa1" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.474668 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="768f1228-6ea3-4601-a0e4-93911d1d4fa1" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.474965 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f546fc-7c8e-42f5-b540-ac8597ae0e4e" containerName="oc" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.474998 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="768f1228-6ea3-4601-a0e4-93911d1d4fa1" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.475954 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.483213 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.483698 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.483989 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.483995 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.484136 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.509738 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b"] Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.625369 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b\" (UID: \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.625428 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b\" (UID: 
\"c924cf0b-5d1b-4d21-8123-106c71d3b94b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.625520 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b\" (UID: \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.625585 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hh4b\" (UniqueName: \"kubernetes.io/projected/c924cf0b-5d1b-4d21-8123-106c71d3b94b-kube-api-access-4hh4b\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b\" (UID: \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.727579 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b\" (UID: \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.727628 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b\" (UID: \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.727675 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b\" (UID: \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.727720 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hh4b\" (UniqueName: \"kubernetes.io/projected/c924cf0b-5d1b-4d21-8123-106c71d3b94b-kube-api-access-4hh4b\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b\" (UID: \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.731784 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b\" (UID: \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.732084 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b\" (UID: \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.739188 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b\" (UID: 
\"c924cf0b-5d1b-4d21-8123-106c71d3b94b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.744355 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hh4b\" (UniqueName: \"kubernetes.io/projected/c924cf0b-5d1b-4d21-8123-106c71d3b94b-kube-api-access-4hh4b\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b\" (UID: \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" Mar 17 01:20:18 crc kubenswrapper[4755]: I0317 01:20:18.819012 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" Mar 17 01:20:19 crc kubenswrapper[4755]: I0317 01:20:19.413577 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b"] Mar 17 01:20:20 crc kubenswrapper[4755]: I0317 01:20:20.392928 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" event={"ID":"c924cf0b-5d1b-4d21-8123-106c71d3b94b","Type":"ContainerStarted","Data":"8b5856e771abdd8f6320ef34f4d22c8f5d3de758551718bf93fc394f2041a562"} Mar 17 01:20:25 crc kubenswrapper[4755]: I0317 01:20:25.450956 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" event={"ID":"c924cf0b-5d1b-4d21-8123-106c71d3b94b","Type":"ContainerStarted","Data":"d6f788f8756c4328304e9c3d47cd358118f063054206b07c59eed922c742dddd"} Mar 17 01:20:25 crc kubenswrapper[4755]: I0317 01:20:25.486365 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" podStartSLOduration=1.974236103 podStartE2EDuration="7.486340711s" podCreationTimestamp="2026-03-17 01:20:18 +0000 UTC" firstStartedPulling="2026-03-17 
01:20:19.417516539 +0000 UTC m=+3494.176968862" lastFinishedPulling="2026-03-17 01:20:24.929621147 +0000 UTC m=+3499.689073470" observedRunningTime="2026-03-17 01:20:25.480294665 +0000 UTC m=+3500.239746978" watchObservedRunningTime="2026-03-17 01:20:25.486340711 +0000 UTC m=+3500.245793034" Mar 17 01:20:31 crc kubenswrapper[4755]: I0317 01:20:31.539105 4755 generic.go:334] "Generic (PLEG): container finished" podID="c924cf0b-5d1b-4d21-8123-106c71d3b94b" containerID="d6f788f8756c4328304e9c3d47cd358118f063054206b07c59eed922c742dddd" exitCode=0 Mar 17 01:20:31 crc kubenswrapper[4755]: I0317 01:20:31.539204 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" event={"ID":"c924cf0b-5d1b-4d21-8123-106c71d3b94b","Type":"ContainerDied","Data":"d6f788f8756c4328304e9c3d47cd358118f063054206b07c59eed922c742dddd"} Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.129349 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.180643 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-inventory\") pod \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\" (UID: \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\") " Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.180768 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hh4b\" (UniqueName: \"kubernetes.io/projected/c924cf0b-5d1b-4d21-8123-106c71d3b94b-kube-api-access-4hh4b\") pod \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\" (UID: \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\") " Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.180980 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-ssh-key-openstack-edpm-ipam\") pod \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\" (UID: \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\") " Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.181028 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-ceph\") pod \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\" (UID: \"c924cf0b-5d1b-4d21-8123-106c71d3b94b\") " Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.187661 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c924cf0b-5d1b-4d21-8123-106c71d3b94b-kube-api-access-4hh4b" (OuterVolumeSpecName: "kube-api-access-4hh4b") pod "c924cf0b-5d1b-4d21-8123-106c71d3b94b" (UID: "c924cf0b-5d1b-4d21-8123-106c71d3b94b"). InnerVolumeSpecName "kube-api-access-4hh4b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.189670 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-ceph" (OuterVolumeSpecName: "ceph") pod "c924cf0b-5d1b-4d21-8123-106c71d3b94b" (UID: "c924cf0b-5d1b-4d21-8123-106c71d3b94b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.223450 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c924cf0b-5d1b-4d21-8123-106c71d3b94b" (UID: "c924cf0b-5d1b-4d21-8123-106c71d3b94b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.233175 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-inventory" (OuterVolumeSpecName: "inventory") pod "c924cf0b-5d1b-4d21-8123-106c71d3b94b" (UID: "c924cf0b-5d1b-4d21-8123-106c71d3b94b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.284651 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.284705 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.284727 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c924cf0b-5d1b-4d21-8123-106c71d3b94b-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.284748 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hh4b\" (UniqueName: \"kubernetes.io/projected/c924cf0b-5d1b-4d21-8123-106c71d3b94b-kube-api-access-4hh4b\") on node \"crc\" DevicePath \"\"" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.566786 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" event={"ID":"c924cf0b-5d1b-4d21-8123-106c71d3b94b","Type":"ContainerDied","Data":"8b5856e771abdd8f6320ef34f4d22c8f5d3de758551718bf93fc394f2041a562"} Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.566827 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b5856e771abdd8f6320ef34f4d22c8f5d3de758551718bf93fc394f2041a562" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.566879 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.698118 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt"] Mar 17 01:20:33 crc kubenswrapper[4755]: E0317 01:20:33.698835 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c924cf0b-5d1b-4d21-8123-106c71d3b94b" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.698857 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c924cf0b-5d1b-4d21-8123-106c71d3b94b" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.699152 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c924cf0b-5d1b-4d21-8123-106c71d3b94b" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.699947 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.704005 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.704120 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.704244 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.704368 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.704566 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.705822 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.716370 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt"] Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.795616 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.795900 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7l5j\" (UniqueName: 
\"kubernetes.io/projected/58680610-638b-4561-90d2-c13f1074a35b-kube-api-access-n7l5j\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.795999 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/58680610-638b-4561-90d2-c13f1074a35b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.796172 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.796375 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.796495 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.898889 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.899027 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.899072 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7l5j\" (UniqueName: \"kubernetes.io/projected/58680610-638b-4561-90d2-c13f1074a35b-kube-api-access-n7l5j\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.899144 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/58680610-638b-4561-90d2-c13f1074a35b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.899182 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.899412 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.900777 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/58680610-638b-4561-90d2-c13f1074a35b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.904089 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.905701 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 
01:20:33.907201 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.911199 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:33 crc kubenswrapper[4755]: I0317 01:20:33.924614 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7l5j\" (UniqueName: \"kubernetes.io/projected/58680610-638b-4561-90d2-c13f1074a35b-kube-api-access-n7l5j\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fx2nt\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:34 crc kubenswrapper[4755]: I0317 01:20:34.024768 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:20:34 crc kubenswrapper[4755]: I0317 01:20:34.666809 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt"] Mar 17 01:20:35 crc kubenswrapper[4755]: I0317 01:20:35.595862 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" event={"ID":"58680610-638b-4561-90d2-c13f1074a35b","Type":"ContainerStarted","Data":"71354f7b44c97d341e5b4c5ee9e06642c199da0553b8555259bfbc9833d41c37"} Mar 17 01:20:35 crc kubenswrapper[4755]: I0317 01:20:35.596350 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" event={"ID":"58680610-638b-4561-90d2-c13f1074a35b","Type":"ContainerStarted","Data":"bb1884074510ea6bd0f75f0472ed0feda03776fe0dab5095b3c2907c7ec2672b"} Mar 17 01:20:35 crc kubenswrapper[4755]: I0317 01:20:35.622073 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" podStartSLOduration=2.164625826 podStartE2EDuration="2.62205417s" podCreationTimestamp="2026-03-17 01:20:33 +0000 UTC" firstStartedPulling="2026-03-17 01:20:34.685187287 +0000 UTC m=+3509.444639600" lastFinishedPulling="2026-03-17 01:20:35.142615621 +0000 UTC m=+3509.902067944" observedRunningTime="2026-03-17 01:20:35.616087487 +0000 UTC m=+3510.375539780" watchObservedRunningTime="2026-03-17 01:20:35.62205417 +0000 UTC m=+3510.381506463" Mar 17 01:20:58 crc kubenswrapper[4755]: I0317 01:20:58.665779 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:20:58 crc kubenswrapper[4755]: I0317 01:20:58.666503 4755 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:21:02 crc kubenswrapper[4755]: I0317 01:21:02.921753 4755 scope.go:117] "RemoveContainer" containerID="3b7a76c12a1d6425881cf17712aa28a841af43918dc7b246c41bdb82c4dae977" Mar 17 01:21:28 crc kubenswrapper[4755]: I0317 01:21:28.665315 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:21:28 crc kubenswrapper[4755]: I0317 01:21:28.667748 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:21:54 crc kubenswrapper[4755]: I0317 01:21:54.554306 4755 generic.go:334] "Generic (PLEG): container finished" podID="58680610-638b-4561-90d2-c13f1074a35b" containerID="71354f7b44c97d341e5b4c5ee9e06642c199da0553b8555259bfbc9833d41c37" exitCode=0 Mar 17 01:21:54 crc kubenswrapper[4755]: I0317 01:21:54.554495 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" event={"ID":"58680610-638b-4561-90d2-c13f1074a35b","Type":"ContainerDied","Data":"71354f7b44c97d341e5b4c5ee9e06642c199da0553b8555259bfbc9833d41c37"} Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.245116 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.383236 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ssh-key-openstack-edpm-ipam\") pod \"58680610-638b-4561-90d2-c13f1074a35b\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.383306 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7l5j\" (UniqueName: \"kubernetes.io/projected/58680610-638b-4561-90d2-c13f1074a35b-kube-api-access-n7l5j\") pod \"58680610-638b-4561-90d2-c13f1074a35b\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.383366 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/58680610-638b-4561-90d2-c13f1074a35b-ovncontroller-config-0\") pod \"58680610-638b-4561-90d2-c13f1074a35b\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.383399 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ovn-combined-ca-bundle\") pod \"58680610-638b-4561-90d2-c13f1074a35b\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.383485 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ceph\") pod \"58680610-638b-4561-90d2-c13f1074a35b\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.383570 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-inventory\") pod \"58680610-638b-4561-90d2-c13f1074a35b\" (UID: \"58680610-638b-4561-90d2-c13f1074a35b\") " Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.389660 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58680610-638b-4561-90d2-c13f1074a35b-kube-api-access-n7l5j" (OuterVolumeSpecName: "kube-api-access-n7l5j") pod "58680610-638b-4561-90d2-c13f1074a35b" (UID: "58680610-638b-4561-90d2-c13f1074a35b"). InnerVolumeSpecName "kube-api-access-n7l5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.390514 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ceph" (OuterVolumeSpecName: "ceph") pod "58680610-638b-4561-90d2-c13f1074a35b" (UID: "58680610-638b-4561-90d2-c13f1074a35b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.391224 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "58680610-638b-4561-90d2-c13f1074a35b" (UID: "58680610-638b-4561-90d2-c13f1074a35b"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.416859 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "58680610-638b-4561-90d2-c13f1074a35b" (UID: "58680610-638b-4561-90d2-c13f1074a35b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.421002 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58680610-638b-4561-90d2-c13f1074a35b-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "58680610-638b-4561-90d2-c13f1074a35b" (UID: "58680610-638b-4561-90d2-c13f1074a35b"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.421021 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-inventory" (OuterVolumeSpecName: "inventory") pod "58680610-638b-4561-90d2-c13f1074a35b" (UID: "58680610-638b-4561-90d2-c13f1074a35b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.485636 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.485687 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7l5j\" (UniqueName: \"kubernetes.io/projected/58680610-638b-4561-90d2-c13f1074a35b-kube-api-access-n7l5j\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.485696 4755 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/58680610-638b-4561-90d2-c13f1074a35b-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.485706 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.485715 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.485724 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58680610-638b-4561-90d2-c13f1074a35b-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.617872 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" event={"ID":"58680610-638b-4561-90d2-c13f1074a35b","Type":"ContainerDied","Data":"bb1884074510ea6bd0f75f0472ed0feda03776fe0dab5095b3c2907c7ec2672b"} Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.617942 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb1884074510ea6bd0f75f0472ed0feda03776fe0dab5095b3c2907c7ec2672b" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.617967 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fx2nt" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.691028 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28"] Mar 17 01:21:56 crc kubenswrapper[4755]: E0317 01:21:56.691752 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58680610-638b-4561-90d2-c13f1074a35b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.691784 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="58680610-638b-4561-90d2-c13f1074a35b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.692233 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="58680610-638b-4561-90d2-c13f1074a35b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.693493 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.699049 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.699078 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.699347 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.699392 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.699530 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.699778 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.699943 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.712346 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28"] Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.797940 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.798347 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcgjk\" (UniqueName: \"kubernetes.io/projected/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-kube-api-access-fcgjk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.798535 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.798684 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.798868 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.799044 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.799295 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.900958 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.901365 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 
01:21:56.901583 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcgjk\" (UniqueName: \"kubernetes.io/projected/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-kube-api-access-fcgjk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.901802 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.901980 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.902102 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.902272 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.907118 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.908594 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.909156 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.911188 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.911734 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.917146 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:56 crc kubenswrapper[4755]: I0317 01:21:56.919138 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcgjk\" (UniqueName: \"kubernetes.io/projected/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-kube-api-access-fcgjk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:57 crc kubenswrapper[4755]: I0317 01:21:57.028602 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:21:57 crc kubenswrapper[4755]: I0317 01:21:57.661092 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28"] Mar 17 01:21:58 crc kubenswrapper[4755]: I0317 01:21:58.642474 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" event={"ID":"ed749590-0c9f-4ed1-876f-d6e28f1e98d2","Type":"ContainerStarted","Data":"110cfb0b3b898d45f6501bde26a9ea0378b625bc3f18eeaa6ed6e62ead14d2f8"} Mar 17 01:21:58 crc kubenswrapper[4755]: I0317 01:21:58.642546 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" event={"ID":"ed749590-0c9f-4ed1-876f-d6e28f1e98d2","Type":"ContainerStarted","Data":"9277d0fb9427751dd68327586ed53bd56605d9af9ae69febb0d6fac8ad796c57"} Mar 17 01:21:58 crc kubenswrapper[4755]: I0317 01:21:58.665306 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:21:58 crc kubenswrapper[4755]: I0317 01:21:58.665388 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:21:58 crc kubenswrapper[4755]: I0317 01:21:58.665468 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 01:21:58 crc kubenswrapper[4755]: I0317 
01:21:58.666529 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"003f35b0d8f532865a4a2bc8ebf824d038eb9d701bcbc727462822acff8c247c"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:21:58 crc kubenswrapper[4755]: I0317 01:21:58.666624 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://003f35b0d8f532865a4a2bc8ebf824d038eb9d701bcbc727462822acff8c247c" gracePeriod=600 Mar 17 01:21:58 crc kubenswrapper[4755]: I0317 01:21:58.668947 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" podStartSLOduration=2.245042493 podStartE2EDuration="2.668924771s" podCreationTimestamp="2026-03-17 01:21:56 +0000 UTC" firstStartedPulling="2026-03-17 01:21:57.657964304 +0000 UTC m=+3592.417416587" lastFinishedPulling="2026-03-17 01:21:58.081846562 +0000 UTC m=+3592.841298865" observedRunningTime="2026-03-17 01:21:58.658786652 +0000 UTC m=+3593.418238935" watchObservedRunningTime="2026-03-17 01:21:58.668924771 +0000 UTC m=+3593.428377074" Mar 17 01:21:59 crc kubenswrapper[4755]: I0317 01:21:59.675466 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="003f35b0d8f532865a4a2bc8ebf824d038eb9d701bcbc727462822acff8c247c" exitCode=0 Mar 17 01:21:59 crc kubenswrapper[4755]: I0317 01:21:59.675481 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" 
event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"003f35b0d8f532865a4a2bc8ebf824d038eb9d701bcbc727462822acff8c247c"} Mar 17 01:21:59 crc kubenswrapper[4755]: I0317 01:21:59.676125 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3"} Mar 17 01:21:59 crc kubenswrapper[4755]: I0317 01:21:59.676147 4755 scope.go:117] "RemoveContainer" containerID="7709432042ea6973621180c55fbf470aecf874b7e3491ee06a731b5e4b842f4a" Mar 17 01:22:00 crc kubenswrapper[4755]: I0317 01:22:00.137279 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561842-fbz8p"] Mar 17 01:22:00 crc kubenswrapper[4755]: I0317 01:22:00.139645 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561842-fbz8p" Mar 17 01:22:00 crc kubenswrapper[4755]: I0317 01:22:00.143599 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:22:00 crc kubenswrapper[4755]: I0317 01:22:00.143789 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:22:00 crc kubenswrapper[4755]: I0317 01:22:00.143947 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:22:00 crc kubenswrapper[4755]: I0317 01:22:00.162781 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561842-fbz8p"] Mar 17 01:22:00 crc kubenswrapper[4755]: I0317 01:22:00.276716 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99nfj\" (UniqueName: 
\"kubernetes.io/projected/70715e34-4a49-41ba-a9ef-64be7b21ebbe-kube-api-access-99nfj\") pod \"auto-csr-approver-29561842-fbz8p\" (UID: \"70715e34-4a49-41ba-a9ef-64be7b21ebbe\") " pod="openshift-infra/auto-csr-approver-29561842-fbz8p" Mar 17 01:22:00 crc kubenswrapper[4755]: I0317 01:22:00.379627 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99nfj\" (UniqueName: \"kubernetes.io/projected/70715e34-4a49-41ba-a9ef-64be7b21ebbe-kube-api-access-99nfj\") pod \"auto-csr-approver-29561842-fbz8p\" (UID: \"70715e34-4a49-41ba-a9ef-64be7b21ebbe\") " pod="openshift-infra/auto-csr-approver-29561842-fbz8p" Mar 17 01:22:00 crc kubenswrapper[4755]: I0317 01:22:00.402500 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99nfj\" (UniqueName: \"kubernetes.io/projected/70715e34-4a49-41ba-a9ef-64be7b21ebbe-kube-api-access-99nfj\") pod \"auto-csr-approver-29561842-fbz8p\" (UID: \"70715e34-4a49-41ba-a9ef-64be7b21ebbe\") " pod="openshift-infra/auto-csr-approver-29561842-fbz8p" Mar 17 01:22:00 crc kubenswrapper[4755]: I0317 01:22:00.468271 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561842-fbz8p" Mar 17 01:22:00 crc kubenswrapper[4755]: W0317 01:22:00.901685 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70715e34_4a49_41ba_a9ef_64be7b21ebbe.slice/crio-ceb238f45476fec4ffc67f7f24f8ef38b7c8659f6eeb06e31469f9ac82f1cef2 WatchSource:0}: Error finding container ceb238f45476fec4ffc67f7f24f8ef38b7c8659f6eeb06e31469f9ac82f1cef2: Status 404 returned error can't find the container with id ceb238f45476fec4ffc67f7f24f8ef38b7c8659f6eeb06e31469f9ac82f1cef2 Mar 17 01:22:00 crc kubenswrapper[4755]: I0317 01:22:00.903021 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561842-fbz8p"] Mar 17 01:22:01 crc kubenswrapper[4755]: I0317 01:22:01.720963 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561842-fbz8p" event={"ID":"70715e34-4a49-41ba-a9ef-64be7b21ebbe","Type":"ContainerStarted","Data":"ceb238f45476fec4ffc67f7f24f8ef38b7c8659f6eeb06e31469f9ac82f1cef2"} Mar 17 01:22:02 crc kubenswrapper[4755]: I0317 01:22:02.733124 4755 generic.go:334] "Generic (PLEG): container finished" podID="70715e34-4a49-41ba-a9ef-64be7b21ebbe" containerID="34a76bf9817658e616d792325755fc68e5e53544b0580124761732a3e705a19e" exitCode=0 Mar 17 01:22:02 crc kubenswrapper[4755]: I0317 01:22:02.733269 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561842-fbz8p" event={"ID":"70715e34-4a49-41ba-a9ef-64be7b21ebbe","Type":"ContainerDied","Data":"34a76bf9817658e616d792325755fc68e5e53544b0580124761732a3e705a19e"} Mar 17 01:22:04 crc kubenswrapper[4755]: I0317 01:22:04.204551 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561842-fbz8p" Mar 17 01:22:04 crc kubenswrapper[4755]: I0317 01:22:04.277274 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99nfj\" (UniqueName: \"kubernetes.io/projected/70715e34-4a49-41ba-a9ef-64be7b21ebbe-kube-api-access-99nfj\") pod \"70715e34-4a49-41ba-a9ef-64be7b21ebbe\" (UID: \"70715e34-4a49-41ba-a9ef-64be7b21ebbe\") " Mar 17 01:22:04 crc kubenswrapper[4755]: I0317 01:22:04.283024 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70715e34-4a49-41ba-a9ef-64be7b21ebbe-kube-api-access-99nfj" (OuterVolumeSpecName: "kube-api-access-99nfj") pod "70715e34-4a49-41ba-a9ef-64be7b21ebbe" (UID: "70715e34-4a49-41ba-a9ef-64be7b21ebbe"). InnerVolumeSpecName "kube-api-access-99nfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:22:04 crc kubenswrapper[4755]: I0317 01:22:04.380177 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99nfj\" (UniqueName: \"kubernetes.io/projected/70715e34-4a49-41ba-a9ef-64be7b21ebbe-kube-api-access-99nfj\") on node \"crc\" DevicePath \"\"" Mar 17 01:22:04 crc kubenswrapper[4755]: I0317 01:22:04.761868 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561842-fbz8p" event={"ID":"70715e34-4a49-41ba-a9ef-64be7b21ebbe","Type":"ContainerDied","Data":"ceb238f45476fec4ffc67f7f24f8ef38b7c8659f6eeb06e31469f9ac82f1cef2"} Mar 17 01:22:04 crc kubenswrapper[4755]: I0317 01:22:04.761942 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ceb238f45476fec4ffc67f7f24f8ef38b7c8659f6eeb06e31469f9ac82f1cef2" Mar 17 01:22:04 crc kubenswrapper[4755]: I0317 01:22:04.761941 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561842-fbz8p" Mar 17 01:22:05 crc kubenswrapper[4755]: I0317 01:22:05.299708 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561836-gkv2s"] Mar 17 01:22:05 crc kubenswrapper[4755]: I0317 01:22:05.309023 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561836-gkv2s"] Mar 17 01:22:06 crc kubenswrapper[4755]: I0317 01:22:06.262647 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90071ff4-8824-4645-9881-3ff6157ac1f9" path="/var/lib/kubelet/pods/90071ff4-8824-4645-9881-3ff6157ac1f9/volumes" Mar 17 01:23:03 crc kubenswrapper[4755]: I0317 01:23:03.116648 4755 scope.go:117] "RemoveContainer" containerID="050eb673b3cf4efa16043552507b74c0e98c31c0da71f841aa5b4a234928f29b" Mar 17 01:23:12 crc kubenswrapper[4755]: I0317 01:23:12.628272 4755 generic.go:334] "Generic (PLEG): container finished" podID="ed749590-0c9f-4ed1-876f-d6e28f1e98d2" containerID="110cfb0b3b898d45f6501bde26a9ea0378b625bc3f18eeaa6ed6e62ead14d2f8" exitCode=0 Mar 17 01:23:12 crc kubenswrapper[4755]: I0317 01:23:12.628392 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" event={"ID":"ed749590-0c9f-4ed1-876f-d6e28f1e98d2","Type":"ContainerDied","Data":"110cfb0b3b898d45f6501bde26a9ea0378b625bc3f18eeaa6ed6e62ead14d2f8"} Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.202246 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.261654 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-ssh-key-openstack-edpm-ipam\") pod \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.261890 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcgjk\" (UniqueName: \"kubernetes.io/projected/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-kube-api-access-fcgjk\") pod \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.262197 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-ceph\") pod \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.262285 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-neutron-metadata-combined-ca-bundle\") pod \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.262369 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " Mar 17 01:23:14 crc 
kubenswrapper[4755]: I0317 01:23:14.262460 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-inventory\") pod \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.262593 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-nova-metadata-neutron-config-0\") pod \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\" (UID: \"ed749590-0c9f-4ed1-876f-d6e28f1e98d2\") " Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.271549 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-ceph" (OuterVolumeSpecName: "ceph") pod "ed749590-0c9f-4ed1-876f-d6e28f1e98d2" (UID: "ed749590-0c9f-4ed1-876f-d6e28f1e98d2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.274423 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-kube-api-access-fcgjk" (OuterVolumeSpecName: "kube-api-access-fcgjk") pod "ed749590-0c9f-4ed1-876f-d6e28f1e98d2" (UID: "ed749590-0c9f-4ed1-876f-d6e28f1e98d2"). InnerVolumeSpecName "kube-api-access-fcgjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.289327 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ed749590-0c9f-4ed1-876f-d6e28f1e98d2" (UID: "ed749590-0c9f-4ed1-876f-d6e28f1e98d2"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.295464 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ed749590-0c9f-4ed1-876f-d6e28f1e98d2" (UID: "ed749590-0c9f-4ed1-876f-d6e28f1e98d2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.306776 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "ed749590-0c9f-4ed1-876f-d6e28f1e98d2" (UID: "ed749590-0c9f-4ed1-876f-d6e28f1e98d2"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.308791 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-inventory" (OuterVolumeSpecName: "inventory") pod "ed749590-0c9f-4ed1-876f-d6e28f1e98d2" (UID: "ed749590-0c9f-4ed1-876f-d6e28f1e98d2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.333547 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "ed749590-0c9f-4ed1-876f-d6e28f1e98d2" (UID: "ed749590-0c9f-4ed1-876f-d6e28f1e98d2"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.368309 4755 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.368346 4755 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.368358 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.368368 4755 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.368377 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.368387 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcgjk\" (UniqueName: \"kubernetes.io/projected/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-kube-api-access-fcgjk\") on node \"crc\" DevicePath \"\"" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.368396 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/ed749590-0c9f-4ed1-876f-d6e28f1e98d2-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.651313 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" event={"ID":"ed749590-0c9f-4ed1-876f-d6e28f1e98d2","Type":"ContainerDied","Data":"9277d0fb9427751dd68327586ed53bd56605d9af9ae69febb0d6fac8ad796c57"} Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.651351 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9277d0fb9427751dd68327586ed53bd56605d9af9ae69febb0d6fac8ad796c57" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.651424 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.837391 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85"] Mar 17 01:23:14 crc kubenswrapper[4755]: E0317 01:23:14.837934 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed749590-0c9f-4ed1-876f-d6e28f1e98d2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.837956 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed749590-0c9f-4ed1-876f-d6e28f1e98d2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 17 01:23:14 crc kubenswrapper[4755]: E0317 01:23:14.837997 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70715e34-4a49-41ba-a9ef-64be7b21ebbe" containerName="oc" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.838005 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="70715e34-4a49-41ba-a9ef-64be7b21ebbe" containerName="oc" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.838237 4755 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ed749590-0c9f-4ed1-876f-d6e28f1e98d2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.838270 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="70715e34-4a49-41ba-a9ef-64be7b21ebbe" containerName="oc" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.839123 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.844003 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.844522 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.845546 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.845660 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.846045 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.848899 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.852814 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85"] Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.883552 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.883755 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.884121 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.884203 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.884244 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.884305 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hck9p\" (UniqueName: \"kubernetes.io/projected/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-kube-api-access-hck9p\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.987233 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.987658 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.987771 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.987813 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.987879 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hck9p\" (UniqueName: \"kubernetes.io/projected/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-kube-api-access-hck9p\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.988031 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.991628 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.993009 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.993090 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.993619 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:14 crc kubenswrapper[4755]: I0317 01:23:14.995012 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:15 crc kubenswrapper[4755]: I0317 01:23:15.012300 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hck9p\" (UniqueName: \"kubernetes.io/projected/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-kube-api-access-hck9p\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rjx85\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:15 crc kubenswrapper[4755]: I0317 01:23:15.163115 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:23:16 crc kubenswrapper[4755]: W0317 01:23:16.494878 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1b1ccf6_0ac6_4724_a948_b2a858b7cdf8.slice/crio-be385d2fe69a4767e7e843d4237612442776cfd1255a50f544137fa532028206 WatchSource:0}: Error finding container be385d2fe69a4767e7e843d4237612442776cfd1255a50f544137fa532028206: Status 404 returned error can't find the container with id be385d2fe69a4767e7e843d4237612442776cfd1255a50f544137fa532028206 Mar 17 01:23:16 crc kubenswrapper[4755]: I0317 01:23:16.495738 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85"] Mar 17 01:23:16 crc kubenswrapper[4755]: I0317 01:23:16.674280 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" event={"ID":"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8","Type":"ContainerStarted","Data":"be385d2fe69a4767e7e843d4237612442776cfd1255a50f544137fa532028206"} Mar 17 01:23:17 crc kubenswrapper[4755]: I0317 01:23:17.685905 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" event={"ID":"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8","Type":"ContainerStarted","Data":"aaba2532d5b33239cfe8165b512b6236d26091b1de719ad246fea280215aa93f"} Mar 17 01:23:17 crc kubenswrapper[4755]: I0317 01:23:17.722936 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" podStartSLOduration=3.032427444 podStartE2EDuration="3.722911779s" podCreationTimestamp="2026-03-17 01:23:14 +0000 UTC" firstStartedPulling="2026-03-17 01:23:16.498747384 +0000 UTC m=+3671.258199667" lastFinishedPulling="2026-03-17 01:23:17.189231719 +0000 UTC m=+3671.948684002" 
observedRunningTime="2026-03-17 01:23:17.706176047 +0000 UTC m=+3672.465628400" watchObservedRunningTime="2026-03-17 01:23:17.722911779 +0000 UTC m=+3672.482364062" Mar 17 01:24:00 crc kubenswrapper[4755]: I0317 01:24:00.165621 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561844-ztz95"] Mar 17 01:24:00 crc kubenswrapper[4755]: I0317 01:24:00.169067 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561844-ztz95" Mar 17 01:24:00 crc kubenswrapper[4755]: I0317 01:24:00.174313 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:24:00 crc kubenswrapper[4755]: I0317 01:24:00.174658 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:24:00 crc kubenswrapper[4755]: I0317 01:24:00.174930 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:24:00 crc kubenswrapper[4755]: I0317 01:24:00.181472 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561844-ztz95"] Mar 17 01:24:00 crc kubenswrapper[4755]: I0317 01:24:00.316129 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2ldq\" (UniqueName: \"kubernetes.io/projected/0d89fd23-639a-4219-b543-fbde76ca3626-kube-api-access-m2ldq\") pod \"auto-csr-approver-29561844-ztz95\" (UID: \"0d89fd23-639a-4219-b543-fbde76ca3626\") " pod="openshift-infra/auto-csr-approver-29561844-ztz95" Mar 17 01:24:00 crc kubenswrapper[4755]: I0317 01:24:00.419005 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2ldq\" (UniqueName: \"kubernetes.io/projected/0d89fd23-639a-4219-b543-fbde76ca3626-kube-api-access-m2ldq\") pod \"auto-csr-approver-29561844-ztz95\" (UID: 
\"0d89fd23-639a-4219-b543-fbde76ca3626\") " pod="openshift-infra/auto-csr-approver-29561844-ztz95" Mar 17 01:24:00 crc kubenswrapper[4755]: I0317 01:24:00.454024 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2ldq\" (UniqueName: \"kubernetes.io/projected/0d89fd23-639a-4219-b543-fbde76ca3626-kube-api-access-m2ldq\") pod \"auto-csr-approver-29561844-ztz95\" (UID: \"0d89fd23-639a-4219-b543-fbde76ca3626\") " pod="openshift-infra/auto-csr-approver-29561844-ztz95" Mar 17 01:24:00 crc kubenswrapper[4755]: I0317 01:24:00.522426 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561844-ztz95" Mar 17 01:24:00 crc kubenswrapper[4755]: I0317 01:24:00.869877 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561844-ztz95"] Mar 17 01:24:00 crc kubenswrapper[4755]: W0317 01:24:00.871939 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d89fd23_639a_4219_b543_fbde76ca3626.slice/crio-87a53eca10d3be6f925db356a632e5806aa93a3ebe042ce05956dedf13f90446 WatchSource:0}: Error finding container 87a53eca10d3be6f925db356a632e5806aa93a3ebe042ce05956dedf13f90446: Status 404 returned error can't find the container with id 87a53eca10d3be6f925db356a632e5806aa93a3ebe042ce05956dedf13f90446 Mar 17 01:24:00 crc kubenswrapper[4755]: I0317 01:24:00.873855 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:24:01 crc kubenswrapper[4755]: I0317 01:24:01.244501 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561844-ztz95" event={"ID":"0d89fd23-639a-4219-b543-fbde76ca3626","Type":"ContainerStarted","Data":"87a53eca10d3be6f925db356a632e5806aa93a3ebe042ce05956dedf13f90446"} Mar 17 01:24:03 crc kubenswrapper[4755]: I0317 01:24:03.273198 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561844-ztz95" event={"ID":"0d89fd23-639a-4219-b543-fbde76ca3626","Type":"ContainerDied","Data":"64d8c797f1fad0db5e2b0fb903bf4587206c5333056a46f0d356964d6edbc825"} Mar 17 01:24:03 crc kubenswrapper[4755]: I0317 01:24:03.273043 4755 generic.go:334] "Generic (PLEG): container finished" podID="0d89fd23-639a-4219-b543-fbde76ca3626" containerID="64d8c797f1fad0db5e2b0fb903bf4587206c5333056a46f0d356964d6edbc825" exitCode=0 Mar 17 01:24:04 crc kubenswrapper[4755]: I0317 01:24:04.711035 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561844-ztz95" Mar 17 01:24:04 crc kubenswrapper[4755]: I0317 01:24:04.825677 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2ldq\" (UniqueName: \"kubernetes.io/projected/0d89fd23-639a-4219-b543-fbde76ca3626-kube-api-access-m2ldq\") pod \"0d89fd23-639a-4219-b543-fbde76ca3626\" (UID: \"0d89fd23-639a-4219-b543-fbde76ca3626\") " Mar 17 01:24:04 crc kubenswrapper[4755]: I0317 01:24:04.833580 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d89fd23-639a-4219-b543-fbde76ca3626-kube-api-access-m2ldq" (OuterVolumeSpecName: "kube-api-access-m2ldq") pod "0d89fd23-639a-4219-b543-fbde76ca3626" (UID: "0d89fd23-639a-4219-b543-fbde76ca3626"). InnerVolumeSpecName "kube-api-access-m2ldq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:24:04 crc kubenswrapper[4755]: I0317 01:24:04.927899 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2ldq\" (UniqueName: \"kubernetes.io/projected/0d89fd23-639a-4219-b543-fbde76ca3626-kube-api-access-m2ldq\") on node \"crc\" DevicePath \"\"" Mar 17 01:24:05 crc kubenswrapper[4755]: I0317 01:24:05.304954 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561844-ztz95" event={"ID":"0d89fd23-639a-4219-b543-fbde76ca3626","Type":"ContainerDied","Data":"87a53eca10d3be6f925db356a632e5806aa93a3ebe042ce05956dedf13f90446"} Mar 17 01:24:05 crc kubenswrapper[4755]: I0317 01:24:05.304997 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87a53eca10d3be6f925db356a632e5806aa93a3ebe042ce05956dedf13f90446" Mar 17 01:24:05 crc kubenswrapper[4755]: I0317 01:24:05.305051 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561844-ztz95" Mar 17 01:24:05 crc kubenswrapper[4755]: I0317 01:24:05.809318 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561838-89zzc"] Mar 17 01:24:05 crc kubenswrapper[4755]: I0317 01:24:05.821950 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561838-89zzc"] Mar 17 01:24:06 crc kubenswrapper[4755]: I0317 01:24:06.273128 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a64208ce-d13d-4744-827f-c5ab7b9ffc6e" path="/var/lib/kubelet/pods/a64208ce-d13d-4744-827f-c5ab7b9ffc6e/volumes" Mar 17 01:24:26 crc kubenswrapper[4755]: I0317 01:24:26.198286 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ttgdp"] Mar 17 01:24:26 crc kubenswrapper[4755]: E0317 01:24:26.199303 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0d89fd23-639a-4219-b543-fbde76ca3626" containerName="oc" Mar 17 01:24:26 crc kubenswrapper[4755]: I0317 01:24:26.199317 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d89fd23-639a-4219-b543-fbde76ca3626" containerName="oc" Mar 17 01:24:26 crc kubenswrapper[4755]: I0317 01:24:26.199536 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d89fd23-639a-4219-b543-fbde76ca3626" containerName="oc" Mar 17 01:24:26 crc kubenswrapper[4755]: I0317 01:24:26.202764 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttgdp" Mar 17 01:24:26 crc kubenswrapper[4755]: I0317 01:24:26.219690 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ttgdp"] Mar 17 01:24:26 crc kubenswrapper[4755]: I0317 01:24:26.361423 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fdbef0c-9c8c-4d3d-afdc-487210e86652-utilities\") pod \"redhat-operators-ttgdp\" (UID: \"3fdbef0c-9c8c-4d3d-afdc-487210e86652\") " pod="openshift-marketplace/redhat-operators-ttgdp" Mar 17 01:24:26 crc kubenswrapper[4755]: I0317 01:24:26.362103 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ksx6\" (UniqueName: \"kubernetes.io/projected/3fdbef0c-9c8c-4d3d-afdc-487210e86652-kube-api-access-4ksx6\") pod \"redhat-operators-ttgdp\" (UID: \"3fdbef0c-9c8c-4d3d-afdc-487210e86652\") " pod="openshift-marketplace/redhat-operators-ttgdp" Mar 17 01:24:26 crc kubenswrapper[4755]: I0317 01:24:26.362403 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fdbef0c-9c8c-4d3d-afdc-487210e86652-catalog-content\") pod \"redhat-operators-ttgdp\" (UID: \"3fdbef0c-9c8c-4d3d-afdc-487210e86652\") " 
pod="openshift-marketplace/redhat-operators-ttgdp" Mar 17 01:24:26 crc kubenswrapper[4755]: I0317 01:24:26.466805 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fdbef0c-9c8c-4d3d-afdc-487210e86652-catalog-content\") pod \"redhat-operators-ttgdp\" (UID: \"3fdbef0c-9c8c-4d3d-afdc-487210e86652\") " pod="openshift-marketplace/redhat-operators-ttgdp" Mar 17 01:24:26 crc kubenswrapper[4755]: I0317 01:24:26.466905 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fdbef0c-9c8c-4d3d-afdc-487210e86652-utilities\") pod \"redhat-operators-ttgdp\" (UID: \"3fdbef0c-9c8c-4d3d-afdc-487210e86652\") " pod="openshift-marketplace/redhat-operators-ttgdp" Mar 17 01:24:26 crc kubenswrapper[4755]: I0317 01:24:26.467119 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ksx6\" (UniqueName: \"kubernetes.io/projected/3fdbef0c-9c8c-4d3d-afdc-487210e86652-kube-api-access-4ksx6\") pod \"redhat-operators-ttgdp\" (UID: \"3fdbef0c-9c8c-4d3d-afdc-487210e86652\") " pod="openshift-marketplace/redhat-operators-ttgdp" Mar 17 01:24:26 crc kubenswrapper[4755]: I0317 01:24:26.467531 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fdbef0c-9c8c-4d3d-afdc-487210e86652-catalog-content\") pod \"redhat-operators-ttgdp\" (UID: \"3fdbef0c-9c8c-4d3d-afdc-487210e86652\") " pod="openshift-marketplace/redhat-operators-ttgdp" Mar 17 01:24:26 crc kubenswrapper[4755]: I0317 01:24:26.467552 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fdbef0c-9c8c-4d3d-afdc-487210e86652-utilities\") pod \"redhat-operators-ttgdp\" (UID: \"3fdbef0c-9c8c-4d3d-afdc-487210e86652\") " pod="openshift-marketplace/redhat-operators-ttgdp" Mar 17 01:24:26 crc 
kubenswrapper[4755]: I0317 01:24:26.486137 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ksx6\" (UniqueName: \"kubernetes.io/projected/3fdbef0c-9c8c-4d3d-afdc-487210e86652-kube-api-access-4ksx6\") pod \"redhat-operators-ttgdp\" (UID: \"3fdbef0c-9c8c-4d3d-afdc-487210e86652\") " pod="openshift-marketplace/redhat-operators-ttgdp" Mar 17 01:24:26 crc kubenswrapper[4755]: I0317 01:24:26.574500 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttgdp" Mar 17 01:24:27 crc kubenswrapper[4755]: I0317 01:24:27.035008 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ttgdp"] Mar 17 01:24:27 crc kubenswrapper[4755]: I0317 01:24:27.827201 4755 generic.go:334] "Generic (PLEG): container finished" podID="3fdbef0c-9c8c-4d3d-afdc-487210e86652" containerID="61967103d63dee712aceb9bb99dd5a5b65dd751e822a668001fc13fb9a75b2c1" exitCode=0 Mar 17 01:24:27 crc kubenswrapper[4755]: I0317 01:24:27.827255 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttgdp" event={"ID":"3fdbef0c-9c8c-4d3d-afdc-487210e86652","Type":"ContainerDied","Data":"61967103d63dee712aceb9bb99dd5a5b65dd751e822a668001fc13fb9a75b2c1"} Mar 17 01:24:27 crc kubenswrapper[4755]: I0317 01:24:27.827479 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttgdp" event={"ID":"3fdbef0c-9c8c-4d3d-afdc-487210e86652","Type":"ContainerStarted","Data":"c69207a6d8fd6744218a0648c9a1a61d699d1619d8b6f5dcb443b6ec60beb9bb"} Mar 17 01:24:28 crc kubenswrapper[4755]: I0317 01:24:28.665694 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:24:28 
crc kubenswrapper[4755]: I0317 01:24:28.666402 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:24:29 crc kubenswrapper[4755]: I0317 01:24:29.856814 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttgdp" event={"ID":"3fdbef0c-9c8c-4d3d-afdc-487210e86652","Type":"ContainerStarted","Data":"13b6ed2f1b4587cafc686882dc292dbafaf0db836d28db34c3074be86e3a403f"} Mar 17 01:24:31 crc kubenswrapper[4755]: I0317 01:24:31.887419 4755 generic.go:334] "Generic (PLEG): container finished" podID="3fdbef0c-9c8c-4d3d-afdc-487210e86652" containerID="13b6ed2f1b4587cafc686882dc292dbafaf0db836d28db34c3074be86e3a403f" exitCode=0 Mar 17 01:24:31 crc kubenswrapper[4755]: I0317 01:24:31.887502 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttgdp" event={"ID":"3fdbef0c-9c8c-4d3d-afdc-487210e86652","Type":"ContainerDied","Data":"13b6ed2f1b4587cafc686882dc292dbafaf0db836d28db34c3074be86e3a403f"} Mar 17 01:24:32 crc kubenswrapper[4755]: I0317 01:24:32.904189 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttgdp" event={"ID":"3fdbef0c-9c8c-4d3d-afdc-487210e86652","Type":"ContainerStarted","Data":"5a1b18321e5c4625e78867df78f09a56bf972c9c790ec2d9238fc012321fef97"} Mar 17 01:24:32 crc kubenswrapper[4755]: I0317 01:24:32.940589 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ttgdp" podStartSLOduration=2.435566189 podStartE2EDuration="6.94056614s" podCreationTimestamp="2026-03-17 01:24:26 +0000 UTC" firstStartedPulling="2026-03-17 01:24:27.831228381 +0000 UTC m=+3742.590680664" 
lastFinishedPulling="2026-03-17 01:24:32.336228292 +0000 UTC m=+3747.095680615" observedRunningTime="2026-03-17 01:24:32.938876494 +0000 UTC m=+3747.698328777" watchObservedRunningTime="2026-03-17 01:24:32.94056614 +0000 UTC m=+3747.700018423" Mar 17 01:24:36 crc kubenswrapper[4755]: I0317 01:24:36.575548 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ttgdp" Mar 17 01:24:36 crc kubenswrapper[4755]: I0317 01:24:36.576082 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ttgdp" Mar 17 01:24:37 crc kubenswrapper[4755]: I0317 01:24:37.647553 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ttgdp" podUID="3fdbef0c-9c8c-4d3d-afdc-487210e86652" containerName="registry-server" probeResult="failure" output=< Mar 17 01:24:37 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 01:24:37 crc kubenswrapper[4755]: > Mar 17 01:24:38 crc kubenswrapper[4755]: I0317 01:24:38.089095 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9djsz"] Mar 17 01:24:38 crc kubenswrapper[4755]: I0317 01:24:38.091473 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9djsz" Mar 17 01:24:38 crc kubenswrapper[4755]: I0317 01:24:38.119714 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9djsz"] Mar 17 01:24:38 crc kubenswrapper[4755]: I0317 01:24:38.277332 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28bb51fa-c810-4f4b-bcf9-310334fc2c84-catalog-content\") pod \"community-operators-9djsz\" (UID: \"28bb51fa-c810-4f4b-bcf9-310334fc2c84\") " pod="openshift-marketplace/community-operators-9djsz" Mar 17 01:24:38 crc kubenswrapper[4755]: I0317 01:24:38.277632 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxltd\" (UniqueName: \"kubernetes.io/projected/28bb51fa-c810-4f4b-bcf9-310334fc2c84-kube-api-access-vxltd\") pod \"community-operators-9djsz\" (UID: \"28bb51fa-c810-4f4b-bcf9-310334fc2c84\") " pod="openshift-marketplace/community-operators-9djsz" Mar 17 01:24:38 crc kubenswrapper[4755]: I0317 01:24:38.277741 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28bb51fa-c810-4f4b-bcf9-310334fc2c84-utilities\") pod \"community-operators-9djsz\" (UID: \"28bb51fa-c810-4f4b-bcf9-310334fc2c84\") " pod="openshift-marketplace/community-operators-9djsz" Mar 17 01:24:38 crc kubenswrapper[4755]: I0317 01:24:38.379358 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxltd\" (UniqueName: \"kubernetes.io/projected/28bb51fa-c810-4f4b-bcf9-310334fc2c84-kube-api-access-vxltd\") pod \"community-operators-9djsz\" (UID: \"28bb51fa-c810-4f4b-bcf9-310334fc2c84\") " pod="openshift-marketplace/community-operators-9djsz" Mar 17 01:24:38 crc kubenswrapper[4755]: I0317 01:24:38.380639 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28bb51fa-c810-4f4b-bcf9-310334fc2c84-utilities\") pod \"community-operators-9djsz\" (UID: \"28bb51fa-c810-4f4b-bcf9-310334fc2c84\") " pod="openshift-marketplace/community-operators-9djsz" Mar 17 01:24:38 crc kubenswrapper[4755]: I0317 01:24:38.381367 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28bb51fa-c810-4f4b-bcf9-310334fc2c84-catalog-content\") pod \"community-operators-9djsz\" (UID: \"28bb51fa-c810-4f4b-bcf9-310334fc2c84\") " pod="openshift-marketplace/community-operators-9djsz" Mar 17 01:24:38 crc kubenswrapper[4755]: I0317 01:24:38.381173 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28bb51fa-c810-4f4b-bcf9-310334fc2c84-utilities\") pod \"community-operators-9djsz\" (UID: \"28bb51fa-c810-4f4b-bcf9-310334fc2c84\") " pod="openshift-marketplace/community-operators-9djsz" Mar 17 01:24:38 crc kubenswrapper[4755]: I0317 01:24:38.381686 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28bb51fa-c810-4f4b-bcf9-310334fc2c84-catalog-content\") pod \"community-operators-9djsz\" (UID: \"28bb51fa-c810-4f4b-bcf9-310334fc2c84\") " pod="openshift-marketplace/community-operators-9djsz" Mar 17 01:24:38 crc kubenswrapper[4755]: I0317 01:24:38.404800 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxltd\" (UniqueName: \"kubernetes.io/projected/28bb51fa-c810-4f4b-bcf9-310334fc2c84-kube-api-access-vxltd\") pod \"community-operators-9djsz\" (UID: \"28bb51fa-c810-4f4b-bcf9-310334fc2c84\") " pod="openshift-marketplace/community-operators-9djsz" Mar 17 01:24:38 crc kubenswrapper[4755]: I0317 01:24:38.426458 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9djsz" Mar 17 01:24:39 crc kubenswrapper[4755]: I0317 01:24:39.091941 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9djsz"] Mar 17 01:24:39 crc kubenswrapper[4755]: I0317 01:24:39.990372 4755 generic.go:334] "Generic (PLEG): container finished" podID="28bb51fa-c810-4f4b-bcf9-310334fc2c84" containerID="98e2142842c96cbeba71b1c697a1f98f8c9f9f32b718906de84de0ebd4c7eea5" exitCode=0 Mar 17 01:24:39 crc kubenswrapper[4755]: I0317 01:24:39.990432 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9djsz" event={"ID":"28bb51fa-c810-4f4b-bcf9-310334fc2c84","Type":"ContainerDied","Data":"98e2142842c96cbeba71b1c697a1f98f8c9f9f32b718906de84de0ebd4c7eea5"} Mar 17 01:24:39 crc kubenswrapper[4755]: I0317 01:24:39.990688 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9djsz" event={"ID":"28bb51fa-c810-4f4b-bcf9-310334fc2c84","Type":"ContainerStarted","Data":"45527dd72f55b0d5f2cea1e8f7b66e96f71ec4743bc451f371caec7acbd819ca"} Mar 17 01:24:41 crc kubenswrapper[4755]: I0317 01:24:41.005344 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9djsz" event={"ID":"28bb51fa-c810-4f4b-bcf9-310334fc2c84","Type":"ContainerStarted","Data":"57e43204f4c71e9a37804324303b6c84576be4573c8c5f9354a595878c1d2f74"} Mar 17 01:24:43 crc kubenswrapper[4755]: I0317 01:24:43.032557 4755 generic.go:334] "Generic (PLEG): container finished" podID="28bb51fa-c810-4f4b-bcf9-310334fc2c84" containerID="57e43204f4c71e9a37804324303b6c84576be4573c8c5f9354a595878c1d2f74" exitCode=0 Mar 17 01:24:43 crc kubenswrapper[4755]: I0317 01:24:43.032643 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9djsz" 
event={"ID":"28bb51fa-c810-4f4b-bcf9-310334fc2c84","Type":"ContainerDied","Data":"57e43204f4c71e9a37804324303b6c84576be4573c8c5f9354a595878c1d2f74"} Mar 17 01:24:44 crc kubenswrapper[4755]: I0317 01:24:44.044688 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9djsz" event={"ID":"28bb51fa-c810-4f4b-bcf9-310334fc2c84","Type":"ContainerStarted","Data":"33b712b172e74da3a5aaa0b702b36842e1cfc34efdab7d4518b1061f9a34dd83"} Mar 17 01:24:46 crc kubenswrapper[4755]: I0317 01:24:46.618097 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ttgdp" Mar 17 01:24:46 crc kubenswrapper[4755]: I0317 01:24:46.636799 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9djsz" podStartSLOduration=5.195630728 podStartE2EDuration="8.636781312s" podCreationTimestamp="2026-03-17 01:24:38 +0000 UTC" firstStartedPulling="2026-03-17 01:24:39.992011813 +0000 UTC m=+3754.751464096" lastFinishedPulling="2026-03-17 01:24:43.433162387 +0000 UTC m=+3758.192614680" observedRunningTime="2026-03-17 01:24:44.092988192 +0000 UTC m=+3758.852440485" watchObservedRunningTime="2026-03-17 01:24:46.636781312 +0000 UTC m=+3761.396233595" Mar 17 01:24:46 crc kubenswrapper[4755]: I0317 01:24:46.671784 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ttgdp" Mar 17 01:24:46 crc kubenswrapper[4755]: I0317 01:24:46.874208 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ttgdp"] Mar 17 01:24:48 crc kubenswrapper[4755]: I0317 01:24:48.086291 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ttgdp" podUID="3fdbef0c-9c8c-4d3d-afdc-487210e86652" containerName="registry-server" 
containerID="cri-o://5a1b18321e5c4625e78867df78f09a56bf972c9c790ec2d9238fc012321fef97" gracePeriod=2 Mar 17 01:24:48 crc kubenswrapper[4755]: I0317 01:24:48.426827 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9djsz" Mar 17 01:24:48 crc kubenswrapper[4755]: I0317 01:24:48.427164 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9djsz" Mar 17 01:24:48 crc kubenswrapper[4755]: I0317 01:24:48.507913 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9djsz" Mar 17 01:24:48 crc kubenswrapper[4755]: I0317 01:24:48.579292 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttgdp" Mar 17 01:24:48 crc kubenswrapper[4755]: I0317 01:24:48.637802 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fdbef0c-9c8c-4d3d-afdc-487210e86652-catalog-content\") pod \"3fdbef0c-9c8c-4d3d-afdc-487210e86652\" (UID: \"3fdbef0c-9c8c-4d3d-afdc-487210e86652\") " Mar 17 01:24:48 crc kubenswrapper[4755]: I0317 01:24:48.637843 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ksx6\" (UniqueName: \"kubernetes.io/projected/3fdbef0c-9c8c-4d3d-afdc-487210e86652-kube-api-access-4ksx6\") pod \"3fdbef0c-9c8c-4d3d-afdc-487210e86652\" (UID: \"3fdbef0c-9c8c-4d3d-afdc-487210e86652\") " Mar 17 01:24:48 crc kubenswrapper[4755]: I0317 01:24:48.637882 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fdbef0c-9c8c-4d3d-afdc-487210e86652-utilities\") pod \"3fdbef0c-9c8c-4d3d-afdc-487210e86652\" (UID: \"3fdbef0c-9c8c-4d3d-afdc-487210e86652\") " Mar 17 01:24:48 crc kubenswrapper[4755]: I0317 01:24:48.638802 
4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fdbef0c-9c8c-4d3d-afdc-487210e86652-utilities" (OuterVolumeSpecName: "utilities") pod "3fdbef0c-9c8c-4d3d-afdc-487210e86652" (UID: "3fdbef0c-9c8c-4d3d-afdc-487210e86652"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:24:48 crc kubenswrapper[4755]: I0317 01:24:48.646364 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fdbef0c-9c8c-4d3d-afdc-487210e86652-kube-api-access-4ksx6" (OuterVolumeSpecName: "kube-api-access-4ksx6") pod "3fdbef0c-9c8c-4d3d-afdc-487210e86652" (UID: "3fdbef0c-9c8c-4d3d-afdc-487210e86652"). InnerVolumeSpecName "kube-api-access-4ksx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:24:48 crc kubenswrapper[4755]: I0317 01:24:48.740011 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ksx6\" (UniqueName: \"kubernetes.io/projected/3fdbef0c-9c8c-4d3d-afdc-487210e86652-kube-api-access-4ksx6\") on node \"crc\" DevicePath \"\"" Mar 17 01:24:48 crc kubenswrapper[4755]: I0317 01:24:48.740049 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fdbef0c-9c8c-4d3d-afdc-487210e86652-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:24:48 crc kubenswrapper[4755]: I0317 01:24:48.798934 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fdbef0c-9c8c-4d3d-afdc-487210e86652-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fdbef0c-9c8c-4d3d-afdc-487210e86652" (UID: "3fdbef0c-9c8c-4d3d-afdc-487210e86652"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:24:48 crc kubenswrapper[4755]: I0317 01:24:48.842523 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fdbef0c-9c8c-4d3d-afdc-487210e86652-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:24:49 crc kubenswrapper[4755]: I0317 01:24:49.098229 4755 generic.go:334] "Generic (PLEG): container finished" podID="3fdbef0c-9c8c-4d3d-afdc-487210e86652" containerID="5a1b18321e5c4625e78867df78f09a56bf972c9c790ec2d9238fc012321fef97" exitCode=0 Mar 17 01:24:49 crc kubenswrapper[4755]: I0317 01:24:49.098341 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttgdp" event={"ID":"3fdbef0c-9c8c-4d3d-afdc-487210e86652","Type":"ContainerDied","Data":"5a1b18321e5c4625e78867df78f09a56bf972c9c790ec2d9238fc012321fef97"} Mar 17 01:24:49 crc kubenswrapper[4755]: I0317 01:24:49.098379 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ttgdp" Mar 17 01:24:49 crc kubenswrapper[4755]: I0317 01:24:49.098414 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttgdp" event={"ID":"3fdbef0c-9c8c-4d3d-afdc-487210e86652","Type":"ContainerDied","Data":"c69207a6d8fd6744218a0648c9a1a61d699d1619d8b6f5dcb443b6ec60beb9bb"} Mar 17 01:24:49 crc kubenswrapper[4755]: I0317 01:24:49.098456 4755 scope.go:117] "RemoveContainer" containerID="5a1b18321e5c4625e78867df78f09a56bf972c9c790ec2d9238fc012321fef97" Mar 17 01:24:49 crc kubenswrapper[4755]: I0317 01:24:49.141464 4755 scope.go:117] "RemoveContainer" containerID="13b6ed2f1b4587cafc686882dc292dbafaf0db836d28db34c3074be86e3a403f" Mar 17 01:24:49 crc kubenswrapper[4755]: I0317 01:24:49.147920 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ttgdp"] Mar 17 01:24:49 crc kubenswrapper[4755]: I0317 01:24:49.157668 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ttgdp"] Mar 17 01:24:49 crc kubenswrapper[4755]: I0317 01:24:49.185399 4755 scope.go:117] "RemoveContainer" containerID="61967103d63dee712aceb9bb99dd5a5b65dd751e822a668001fc13fb9a75b2c1" Mar 17 01:24:49 crc kubenswrapper[4755]: I0317 01:24:49.190794 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9djsz" Mar 17 01:24:49 crc kubenswrapper[4755]: I0317 01:24:49.242648 4755 scope.go:117] "RemoveContainer" containerID="5a1b18321e5c4625e78867df78f09a56bf972c9c790ec2d9238fc012321fef97" Mar 17 01:24:49 crc kubenswrapper[4755]: E0317 01:24:49.243223 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a1b18321e5c4625e78867df78f09a56bf972c9c790ec2d9238fc012321fef97\": container with ID starting with 5a1b18321e5c4625e78867df78f09a56bf972c9c790ec2d9238fc012321fef97 
not found: ID does not exist" containerID="5a1b18321e5c4625e78867df78f09a56bf972c9c790ec2d9238fc012321fef97" Mar 17 01:24:49 crc kubenswrapper[4755]: I0317 01:24:49.243274 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a1b18321e5c4625e78867df78f09a56bf972c9c790ec2d9238fc012321fef97"} err="failed to get container status \"5a1b18321e5c4625e78867df78f09a56bf972c9c790ec2d9238fc012321fef97\": rpc error: code = NotFound desc = could not find container \"5a1b18321e5c4625e78867df78f09a56bf972c9c790ec2d9238fc012321fef97\": container with ID starting with 5a1b18321e5c4625e78867df78f09a56bf972c9c790ec2d9238fc012321fef97 not found: ID does not exist" Mar 17 01:24:49 crc kubenswrapper[4755]: I0317 01:24:49.243308 4755 scope.go:117] "RemoveContainer" containerID="13b6ed2f1b4587cafc686882dc292dbafaf0db836d28db34c3074be86e3a403f" Mar 17 01:24:49 crc kubenswrapper[4755]: E0317 01:24:49.243674 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13b6ed2f1b4587cafc686882dc292dbafaf0db836d28db34c3074be86e3a403f\": container with ID starting with 13b6ed2f1b4587cafc686882dc292dbafaf0db836d28db34c3074be86e3a403f not found: ID does not exist" containerID="13b6ed2f1b4587cafc686882dc292dbafaf0db836d28db34c3074be86e3a403f" Mar 17 01:24:49 crc kubenswrapper[4755]: I0317 01:24:49.243734 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13b6ed2f1b4587cafc686882dc292dbafaf0db836d28db34c3074be86e3a403f"} err="failed to get container status \"13b6ed2f1b4587cafc686882dc292dbafaf0db836d28db34c3074be86e3a403f\": rpc error: code = NotFound desc = could not find container \"13b6ed2f1b4587cafc686882dc292dbafaf0db836d28db34c3074be86e3a403f\": container with ID starting with 13b6ed2f1b4587cafc686882dc292dbafaf0db836d28db34c3074be86e3a403f not found: ID does not exist" Mar 17 01:24:49 crc kubenswrapper[4755]: I0317 
01:24:49.243773 4755 scope.go:117] "RemoveContainer" containerID="61967103d63dee712aceb9bb99dd5a5b65dd751e822a668001fc13fb9a75b2c1" Mar 17 01:24:49 crc kubenswrapper[4755]: E0317 01:24:49.244049 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61967103d63dee712aceb9bb99dd5a5b65dd751e822a668001fc13fb9a75b2c1\": container with ID starting with 61967103d63dee712aceb9bb99dd5a5b65dd751e822a668001fc13fb9a75b2c1 not found: ID does not exist" containerID="61967103d63dee712aceb9bb99dd5a5b65dd751e822a668001fc13fb9a75b2c1" Mar 17 01:24:49 crc kubenswrapper[4755]: I0317 01:24:49.244076 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61967103d63dee712aceb9bb99dd5a5b65dd751e822a668001fc13fb9a75b2c1"} err="failed to get container status \"61967103d63dee712aceb9bb99dd5a5b65dd751e822a668001fc13fb9a75b2c1\": rpc error: code = NotFound desc = could not find container \"61967103d63dee712aceb9bb99dd5a5b65dd751e822a668001fc13fb9a75b2c1\": container with ID starting with 61967103d63dee712aceb9bb99dd5a5b65dd751e822a668001fc13fb9a75b2c1 not found: ID does not exist" Mar 17 01:24:49 crc kubenswrapper[4755]: I0317 01:24:49.859595 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9djsz"] Mar 17 01:24:50 crc kubenswrapper[4755]: I0317 01:24:50.271303 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fdbef0c-9c8c-4d3d-afdc-487210e86652" path="/var/lib/kubelet/pods/3fdbef0c-9c8c-4d3d-afdc-487210e86652/volumes" Mar 17 01:24:51 crc kubenswrapper[4755]: I0317 01:24:51.120006 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9djsz" podUID="28bb51fa-c810-4f4b-bcf9-310334fc2c84" containerName="registry-server" containerID="cri-o://33b712b172e74da3a5aaa0b702b36842e1cfc34efdab7d4518b1061f9a34dd83" gracePeriod=2 Mar 17 01:24:51 
crc kubenswrapper[4755]: I0317 01:24:51.627214 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9djsz" Mar 17 01:24:51 crc kubenswrapper[4755]: I0317 01:24:51.811065 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28bb51fa-c810-4f4b-bcf9-310334fc2c84-utilities\") pod \"28bb51fa-c810-4f4b-bcf9-310334fc2c84\" (UID: \"28bb51fa-c810-4f4b-bcf9-310334fc2c84\") " Mar 17 01:24:51 crc kubenswrapper[4755]: I0317 01:24:51.811314 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28bb51fa-c810-4f4b-bcf9-310334fc2c84-catalog-content\") pod \"28bb51fa-c810-4f4b-bcf9-310334fc2c84\" (UID: \"28bb51fa-c810-4f4b-bcf9-310334fc2c84\") " Mar 17 01:24:51 crc kubenswrapper[4755]: I0317 01:24:51.811404 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxltd\" (UniqueName: \"kubernetes.io/projected/28bb51fa-c810-4f4b-bcf9-310334fc2c84-kube-api-access-vxltd\") pod \"28bb51fa-c810-4f4b-bcf9-310334fc2c84\" (UID: \"28bb51fa-c810-4f4b-bcf9-310334fc2c84\") " Mar 17 01:24:51 crc kubenswrapper[4755]: I0317 01:24:51.812289 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28bb51fa-c810-4f4b-bcf9-310334fc2c84-utilities" (OuterVolumeSpecName: "utilities") pod "28bb51fa-c810-4f4b-bcf9-310334fc2c84" (UID: "28bb51fa-c810-4f4b-bcf9-310334fc2c84"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:24:51 crc kubenswrapper[4755]: I0317 01:24:51.821836 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28bb51fa-c810-4f4b-bcf9-310334fc2c84-kube-api-access-vxltd" (OuterVolumeSpecName: "kube-api-access-vxltd") pod "28bb51fa-c810-4f4b-bcf9-310334fc2c84" (UID: "28bb51fa-c810-4f4b-bcf9-310334fc2c84"). InnerVolumeSpecName "kube-api-access-vxltd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:24:51 crc kubenswrapper[4755]: I0317 01:24:51.874463 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28bb51fa-c810-4f4b-bcf9-310334fc2c84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28bb51fa-c810-4f4b-bcf9-310334fc2c84" (UID: "28bb51fa-c810-4f4b-bcf9-310334fc2c84"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:24:51 crc kubenswrapper[4755]: I0317 01:24:51.913590 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28bb51fa-c810-4f4b-bcf9-310334fc2c84-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:24:51 crc kubenswrapper[4755]: I0317 01:24:51.913619 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28bb51fa-c810-4f4b-bcf9-310334fc2c84-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:24:51 crc kubenswrapper[4755]: I0317 01:24:51.913629 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxltd\" (UniqueName: \"kubernetes.io/projected/28bb51fa-c810-4f4b-bcf9-310334fc2c84-kube-api-access-vxltd\") on node \"crc\" DevicePath \"\"" Mar 17 01:24:52 crc kubenswrapper[4755]: I0317 01:24:52.133987 4755 generic.go:334] "Generic (PLEG): container finished" podID="28bb51fa-c810-4f4b-bcf9-310334fc2c84" 
containerID="33b712b172e74da3a5aaa0b702b36842e1cfc34efdab7d4518b1061f9a34dd83" exitCode=0 Mar 17 01:24:52 crc kubenswrapper[4755]: I0317 01:24:52.134041 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9djsz" Mar 17 01:24:52 crc kubenswrapper[4755]: I0317 01:24:52.134056 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9djsz" event={"ID":"28bb51fa-c810-4f4b-bcf9-310334fc2c84","Type":"ContainerDied","Data":"33b712b172e74da3a5aaa0b702b36842e1cfc34efdab7d4518b1061f9a34dd83"} Mar 17 01:24:52 crc kubenswrapper[4755]: I0317 01:24:52.134110 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9djsz" event={"ID":"28bb51fa-c810-4f4b-bcf9-310334fc2c84","Type":"ContainerDied","Data":"45527dd72f55b0d5f2cea1e8f7b66e96f71ec4743bc451f371caec7acbd819ca"} Mar 17 01:24:52 crc kubenswrapper[4755]: I0317 01:24:52.134142 4755 scope.go:117] "RemoveContainer" containerID="33b712b172e74da3a5aaa0b702b36842e1cfc34efdab7d4518b1061f9a34dd83" Mar 17 01:24:52 crc kubenswrapper[4755]: I0317 01:24:52.158413 4755 scope.go:117] "RemoveContainer" containerID="57e43204f4c71e9a37804324303b6c84576be4573c8c5f9354a595878c1d2f74" Mar 17 01:24:52 crc kubenswrapper[4755]: I0317 01:24:52.187262 4755 scope.go:117] "RemoveContainer" containerID="98e2142842c96cbeba71b1c697a1f98f8c9f9f32b718906de84de0ebd4c7eea5" Mar 17 01:24:52 crc kubenswrapper[4755]: I0317 01:24:52.201364 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9djsz"] Mar 17 01:24:52 crc kubenswrapper[4755]: I0317 01:24:52.227490 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9djsz"] Mar 17 01:24:52 crc kubenswrapper[4755]: I0317 01:24:52.258274 4755 scope.go:117] "RemoveContainer" containerID="33b712b172e74da3a5aaa0b702b36842e1cfc34efdab7d4518b1061f9a34dd83" Mar 17 
01:24:52 crc kubenswrapper[4755]: E0317 01:24:52.259365 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b712b172e74da3a5aaa0b702b36842e1cfc34efdab7d4518b1061f9a34dd83\": container with ID starting with 33b712b172e74da3a5aaa0b702b36842e1cfc34efdab7d4518b1061f9a34dd83 not found: ID does not exist" containerID="33b712b172e74da3a5aaa0b702b36842e1cfc34efdab7d4518b1061f9a34dd83" Mar 17 01:24:52 crc kubenswrapper[4755]: I0317 01:24:52.259408 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b712b172e74da3a5aaa0b702b36842e1cfc34efdab7d4518b1061f9a34dd83"} err="failed to get container status \"33b712b172e74da3a5aaa0b702b36842e1cfc34efdab7d4518b1061f9a34dd83\": rpc error: code = NotFound desc = could not find container \"33b712b172e74da3a5aaa0b702b36842e1cfc34efdab7d4518b1061f9a34dd83\": container with ID starting with 33b712b172e74da3a5aaa0b702b36842e1cfc34efdab7d4518b1061f9a34dd83 not found: ID does not exist" Mar 17 01:24:52 crc kubenswrapper[4755]: I0317 01:24:52.259430 4755 scope.go:117] "RemoveContainer" containerID="57e43204f4c71e9a37804324303b6c84576be4573c8c5f9354a595878c1d2f74" Mar 17 01:24:52 crc kubenswrapper[4755]: E0317 01:24:52.261828 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57e43204f4c71e9a37804324303b6c84576be4573c8c5f9354a595878c1d2f74\": container with ID starting with 57e43204f4c71e9a37804324303b6c84576be4573c8c5f9354a595878c1d2f74 not found: ID does not exist" containerID="57e43204f4c71e9a37804324303b6c84576be4573c8c5f9354a595878c1d2f74" Mar 17 01:24:52 crc kubenswrapper[4755]: I0317 01:24:52.261883 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57e43204f4c71e9a37804324303b6c84576be4573c8c5f9354a595878c1d2f74"} err="failed to get container status 
\"57e43204f4c71e9a37804324303b6c84576be4573c8c5f9354a595878c1d2f74\": rpc error: code = NotFound desc = could not find container \"57e43204f4c71e9a37804324303b6c84576be4573c8c5f9354a595878c1d2f74\": container with ID starting with 57e43204f4c71e9a37804324303b6c84576be4573c8c5f9354a595878c1d2f74 not found: ID does not exist" Mar 17 01:24:52 crc kubenswrapper[4755]: I0317 01:24:52.261902 4755 scope.go:117] "RemoveContainer" containerID="98e2142842c96cbeba71b1c697a1f98f8c9f9f32b718906de84de0ebd4c7eea5" Mar 17 01:24:52 crc kubenswrapper[4755]: E0317 01:24:52.262197 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98e2142842c96cbeba71b1c697a1f98f8c9f9f32b718906de84de0ebd4c7eea5\": container with ID starting with 98e2142842c96cbeba71b1c697a1f98f8c9f9f32b718906de84de0ebd4c7eea5 not found: ID does not exist" containerID="98e2142842c96cbeba71b1c697a1f98f8c9f9f32b718906de84de0ebd4c7eea5" Mar 17 01:24:52 crc kubenswrapper[4755]: I0317 01:24:52.262226 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98e2142842c96cbeba71b1c697a1f98f8c9f9f32b718906de84de0ebd4c7eea5"} err="failed to get container status \"98e2142842c96cbeba71b1c697a1f98f8c9f9f32b718906de84de0ebd4c7eea5\": rpc error: code = NotFound desc = could not find container \"98e2142842c96cbeba71b1c697a1f98f8c9f9f32b718906de84de0ebd4c7eea5\": container with ID starting with 98e2142842c96cbeba71b1c697a1f98f8c9f9f32b718906de84de0ebd4c7eea5 not found: ID does not exist" Mar 17 01:24:52 crc kubenswrapper[4755]: I0317 01:24:52.268055 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28bb51fa-c810-4f4b-bcf9-310334fc2c84" path="/var/lib/kubelet/pods/28bb51fa-c810-4f4b-bcf9-310334fc2c84/volumes" Mar 17 01:24:58 crc kubenswrapper[4755]: I0317 01:24:58.665009 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:24:58 crc kubenswrapper[4755]: I0317 01:24:58.665747 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:25:03 crc kubenswrapper[4755]: I0317 01:25:03.264487 4755 scope.go:117] "RemoveContainer" containerID="8b63c777b499d558baa5feae5f355acb96d1435d3267b7c21d51c8d16fb4c1bc" Mar 17 01:25:28 crc kubenswrapper[4755]: I0317 01:25:28.665101 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:25:28 crc kubenswrapper[4755]: I0317 01:25:28.666020 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:25:28 crc kubenswrapper[4755]: I0317 01:25:28.666159 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 01:25:28 crc kubenswrapper[4755]: I0317 01:25:28.668362 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3"} 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:25:28 crc kubenswrapper[4755]: I0317 01:25:28.668589 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" gracePeriod=600 Mar 17 01:25:28 crc kubenswrapper[4755]: E0317 01:25:28.802899 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:25:29 crc kubenswrapper[4755]: I0317 01:25:29.611912 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" exitCode=0 Mar 17 01:25:29 crc kubenswrapper[4755]: I0317 01:25:29.611974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3"} Mar 17 01:25:29 crc kubenswrapper[4755]: I0317 01:25:29.612046 4755 scope.go:117] "RemoveContainer" containerID="003f35b0d8f532865a4a2bc8ebf824d038eb9d701bcbc727462822acff8c247c" Mar 17 01:25:29 crc kubenswrapper[4755]: I0317 01:25:29.613530 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 
17 01:25:29 crc kubenswrapper[4755]: E0317 01:25:29.614086 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:25:44 crc kubenswrapper[4755]: I0317 01:25:44.248257 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:25:44 crc kubenswrapper[4755]: E0317 01:25:44.249316 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:25:59 crc kubenswrapper[4755]: I0317 01:25:59.249222 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:25:59 crc kubenswrapper[4755]: E0317 01:25:59.250775 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.169351 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29561846-5rdbg"] Mar 17 01:26:00 crc kubenswrapper[4755]: E0317 01:26:00.170647 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fdbef0c-9c8c-4d3d-afdc-487210e86652" containerName="extract-utilities" Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.170683 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fdbef0c-9c8c-4d3d-afdc-487210e86652" containerName="extract-utilities" Mar 17 01:26:00 crc kubenswrapper[4755]: E0317 01:26:00.170705 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fdbef0c-9c8c-4d3d-afdc-487210e86652" containerName="extract-content" Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.170721 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fdbef0c-9c8c-4d3d-afdc-487210e86652" containerName="extract-content" Mar 17 01:26:00 crc kubenswrapper[4755]: E0317 01:26:00.170742 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fdbef0c-9c8c-4d3d-afdc-487210e86652" containerName="registry-server" Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.170760 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fdbef0c-9c8c-4d3d-afdc-487210e86652" containerName="registry-server" Mar 17 01:26:00 crc kubenswrapper[4755]: E0317 01:26:00.170784 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28bb51fa-c810-4f4b-bcf9-310334fc2c84" containerName="extract-content" Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.170801 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="28bb51fa-c810-4f4b-bcf9-310334fc2c84" containerName="extract-content" Mar 17 01:26:00 crc kubenswrapper[4755]: E0317 01:26:00.170941 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28bb51fa-c810-4f4b-bcf9-310334fc2c84" containerName="extract-utilities" Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.170959 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="28bb51fa-c810-4f4b-bcf9-310334fc2c84" containerName="extract-utilities" Mar 17 01:26:00 crc kubenswrapper[4755]: E0317 01:26:00.170995 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28bb51fa-c810-4f4b-bcf9-310334fc2c84" containerName="registry-server" Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.171010 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="28bb51fa-c810-4f4b-bcf9-310334fc2c84" containerName="registry-server" Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.171462 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fdbef0c-9c8c-4d3d-afdc-487210e86652" containerName="registry-server" Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.171530 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="28bb51fa-c810-4f4b-bcf9-310334fc2c84" containerName="registry-server" Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.173044 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561846-5rdbg" Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.177590 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.177969 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.178323 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.189257 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561846-5rdbg"] Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.222618 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgthm\" (UniqueName: 
\"kubernetes.io/projected/4c16f6d1-a361-4e74-9763-8da025eb71d4-kube-api-access-lgthm\") pod \"auto-csr-approver-29561846-5rdbg\" (UID: \"4c16f6d1-a361-4e74-9763-8da025eb71d4\") " pod="openshift-infra/auto-csr-approver-29561846-5rdbg" Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.325567 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgthm\" (UniqueName: \"kubernetes.io/projected/4c16f6d1-a361-4e74-9763-8da025eb71d4-kube-api-access-lgthm\") pod \"auto-csr-approver-29561846-5rdbg\" (UID: \"4c16f6d1-a361-4e74-9763-8da025eb71d4\") " pod="openshift-infra/auto-csr-approver-29561846-5rdbg" Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.351293 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgthm\" (UniqueName: \"kubernetes.io/projected/4c16f6d1-a361-4e74-9763-8da025eb71d4-kube-api-access-lgthm\") pod \"auto-csr-approver-29561846-5rdbg\" (UID: \"4c16f6d1-a361-4e74-9763-8da025eb71d4\") " pod="openshift-infra/auto-csr-approver-29561846-5rdbg" Mar 17 01:26:00 crc kubenswrapper[4755]: I0317 01:26:00.500241 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561846-5rdbg" Mar 17 01:26:01 crc kubenswrapper[4755]: W0317 01:26:01.056948 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c16f6d1_a361_4e74_9763_8da025eb71d4.slice/crio-081107b098e2ac1644f753bd549309aaafef43957a780e53dc6074bf3936d818 WatchSource:0}: Error finding container 081107b098e2ac1644f753bd549309aaafef43957a780e53dc6074bf3936d818: Status 404 returned error can't find the container with id 081107b098e2ac1644f753bd549309aaafef43957a780e53dc6074bf3936d818 Mar 17 01:26:01 crc kubenswrapper[4755]: I0317 01:26:01.056989 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561846-5rdbg"] Mar 17 01:26:02 crc kubenswrapper[4755]: I0317 01:26:02.000419 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561846-5rdbg" event={"ID":"4c16f6d1-a361-4e74-9763-8da025eb71d4","Type":"ContainerStarted","Data":"081107b098e2ac1644f753bd549309aaafef43957a780e53dc6074bf3936d818"} Mar 17 01:26:04 crc kubenswrapper[4755]: I0317 01:26:04.026484 4755 generic.go:334] "Generic (PLEG): container finished" podID="4c16f6d1-a361-4e74-9763-8da025eb71d4" containerID="9d2bd2449952f9e224b5c28eb90dcb121002b850f1158830362fe4abef8a768d" exitCode=0 Mar 17 01:26:04 crc kubenswrapper[4755]: I0317 01:26:04.026558 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561846-5rdbg" event={"ID":"4c16f6d1-a361-4e74-9763-8da025eb71d4","Type":"ContainerDied","Data":"9d2bd2449952f9e224b5c28eb90dcb121002b850f1158830362fe4abef8a768d"} Mar 17 01:26:05 crc kubenswrapper[4755]: I0317 01:26:05.468956 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561846-5rdbg" Mar 17 01:26:05 crc kubenswrapper[4755]: I0317 01:26:05.649031 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgthm\" (UniqueName: \"kubernetes.io/projected/4c16f6d1-a361-4e74-9763-8da025eb71d4-kube-api-access-lgthm\") pod \"4c16f6d1-a361-4e74-9763-8da025eb71d4\" (UID: \"4c16f6d1-a361-4e74-9763-8da025eb71d4\") " Mar 17 01:26:05 crc kubenswrapper[4755]: I0317 01:26:05.658339 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c16f6d1-a361-4e74-9763-8da025eb71d4-kube-api-access-lgthm" (OuterVolumeSpecName: "kube-api-access-lgthm") pod "4c16f6d1-a361-4e74-9763-8da025eb71d4" (UID: "4c16f6d1-a361-4e74-9763-8da025eb71d4"). InnerVolumeSpecName "kube-api-access-lgthm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:26:05 crc kubenswrapper[4755]: I0317 01:26:05.752884 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgthm\" (UniqueName: \"kubernetes.io/projected/4c16f6d1-a361-4e74-9763-8da025eb71d4-kube-api-access-lgthm\") on node \"crc\" DevicePath \"\"" Mar 17 01:26:06 crc kubenswrapper[4755]: I0317 01:26:06.062666 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561846-5rdbg" event={"ID":"4c16f6d1-a361-4e74-9763-8da025eb71d4","Type":"ContainerDied","Data":"081107b098e2ac1644f753bd549309aaafef43957a780e53dc6074bf3936d818"} Mar 17 01:26:06 crc kubenswrapper[4755]: I0317 01:26:06.062704 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="081107b098e2ac1644f753bd549309aaafef43957a780e53dc6074bf3936d818" Mar 17 01:26:06 crc kubenswrapper[4755]: I0317 01:26:06.062766 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561846-5rdbg" Mar 17 01:26:06 crc kubenswrapper[4755]: I0317 01:26:06.560397 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561840-ljcjf"] Mar 17 01:26:06 crc kubenswrapper[4755]: I0317 01:26:06.573287 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561840-ljcjf"] Mar 17 01:26:08 crc kubenswrapper[4755]: I0317 01:26:08.269329 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f546fc-7c8e-42f5-b540-ac8597ae0e4e" path="/var/lib/kubelet/pods/48f546fc-7c8e-42f5-b540-ac8597ae0e4e/volumes" Mar 17 01:26:13 crc kubenswrapper[4755]: I0317 01:26:13.247989 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:26:13 crc kubenswrapper[4755]: E0317 01:26:13.248840 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:26:15 crc kubenswrapper[4755]: I0317 01:26:15.180063 4755 generic.go:334] "Generic (PLEG): container finished" podID="e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8" containerID="aaba2532d5b33239cfe8165b512b6236d26091b1de719ad246fea280215aa93f" exitCode=0 Mar 17 01:26:15 crc kubenswrapper[4755]: I0317 01:26:15.180174 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" event={"ID":"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8","Type":"ContainerDied","Data":"aaba2532d5b33239cfe8165b512b6236d26091b1de719ad246fea280215aa93f"} Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 
01:26:16.745532 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 01:26:16.824316 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hck9p\" (UniqueName: \"kubernetes.io/projected/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-kube-api-access-hck9p\") pod \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 01:26:16.824401 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-libvirt-combined-ca-bundle\") pod \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 01:26:16.824633 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-libvirt-secret-0\") pod \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 01:26:16.824737 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-ssh-key-openstack-edpm-ipam\") pod \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 01:26:16.824783 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-inventory\") pod \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " Mar 17 01:26:16 crc 
kubenswrapper[4755]: I0317 01:26:16.824828 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-ceph\") pod \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\" (UID: \"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8\") " Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 01:26:16.831092 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-ceph" (OuterVolumeSpecName: "ceph") pod "e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8" (UID: "e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 01:26:16.831742 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-kube-api-access-hck9p" (OuterVolumeSpecName: "kube-api-access-hck9p") pod "e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8" (UID: "e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8"). InnerVolumeSpecName "kube-api-access-hck9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 01:26:16.832946 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8" (UID: "e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 01:26:16.862525 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8" (UID: "e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 01:26:16.866076 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-inventory" (OuterVolumeSpecName: "inventory") pod "e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8" (UID: "e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 01:26:16.872572 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8" (UID: "e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 01:26:16.926794 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 01:26:16.926831 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hck9p\" (UniqueName: \"kubernetes.io/projected/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-kube-api-access-hck9p\") on node \"crc\" DevicePath \"\"" Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 01:26:16.926850 4755 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 01:26:16.926861 4755 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 01:26:16.926873 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:26:16 crc kubenswrapper[4755]: I0317 01:26:16.926884 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.204530 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" 
event={"ID":"e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8","Type":"ContainerDied","Data":"be385d2fe69a4767e7e843d4237612442776cfd1255a50f544137fa532028206"} Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.204591 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be385d2fe69a4767e7e843d4237612442776cfd1255a50f544137fa532028206" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.204615 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rjx85" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.311155 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n"] Mar 17 01:26:17 crc kubenswrapper[4755]: E0317 01:26:17.311697 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c16f6d1-a361-4e74-9763-8da025eb71d4" containerName="oc" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.311716 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c16f6d1-a361-4e74-9763-8da025eb71d4" containerName="oc" Mar 17 01:26:17 crc kubenswrapper[4755]: E0317 01:26:17.311748 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.311757 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.312004 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c16f6d1-a361-4e74-9763-8da025eb71d4" containerName="oc" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.312027 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8" 
containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.312916 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.315345 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.315588 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.317424 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.317614 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.317622 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.317623 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.317915 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.318044 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.321807 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.349288 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n"] Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.442242 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.442312 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.442393 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.442514 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 
01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.442599 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.442653 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.442709 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzmn2\" (UniqueName: \"kubernetes.io/projected/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-kube-api-access-bzmn2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.442774 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.442812 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.442905 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.442970 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.443058 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.443170 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.545402 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.545520 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.545588 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzmn2\" (UniqueName: \"kubernetes.io/projected/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-kube-api-access-bzmn2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.545629 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.545672 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.545762 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.546286 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.546421 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-2\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.546571 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.546654 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.546696 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.546771 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.547364 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.547413 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.550265 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.551458 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.552245 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.552285 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.552851 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.553639 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.554368 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-migration-ssh-key-1\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.555065 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.556611 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.556779 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.559973 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.567949 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzmn2\" (UniqueName: \"kubernetes.io/projected/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-kube-api-access-bzmn2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:17 crc kubenswrapper[4755]: I0317 01:26:17.643024 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:26:18 crc kubenswrapper[4755]: I0317 01:26:18.275943 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n"] Mar 17 01:26:19 crc kubenswrapper[4755]: I0317 01:26:19.236585 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" event={"ID":"d0c84d8b-60dc-4e23-a4be-83b81c52f10f","Type":"ContainerStarted","Data":"7f82c5fda6dff242f42a6e8902cd2dc6c9e680bd763c9d2132312c3e738ae693"} Mar 17 01:26:19 crc kubenswrapper[4755]: I0317 01:26:19.237092 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" event={"ID":"d0c84d8b-60dc-4e23-a4be-83b81c52f10f","Type":"ContainerStarted","Data":"99ae71f2f86dbb92635a16354f01251504c6564545675afdb834d9fd8cb29730"} Mar 17 01:26:19 crc kubenswrapper[4755]: I0317 01:26:19.263767 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" podStartSLOduration=1.815634137 podStartE2EDuration="2.263741809s" podCreationTimestamp="2026-03-17 01:26:17 +0000 UTC" firstStartedPulling="2026-03-17 
01:26:18.295747109 +0000 UTC m=+3853.055199392" lastFinishedPulling="2026-03-17 01:26:18.743854771 +0000 UTC m=+3853.503307064" observedRunningTime="2026-03-17 01:26:19.257914152 +0000 UTC m=+3854.017366435" watchObservedRunningTime="2026-03-17 01:26:19.263741809 +0000 UTC m=+3854.023194122" Mar 17 01:26:27 crc kubenswrapper[4755]: I0317 01:26:27.250077 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:26:27 crc kubenswrapper[4755]: E0317 01:26:27.252681 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:26:38 crc kubenswrapper[4755]: I0317 01:26:38.248793 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:26:38 crc kubenswrapper[4755]: E0317 01:26:38.249568 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:26:49 crc kubenswrapper[4755]: I0317 01:26:49.248977 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:26:49 crc kubenswrapper[4755]: E0317 01:26:49.250278 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:27:03 crc kubenswrapper[4755]: I0317 01:27:03.249070 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:27:03 crc kubenswrapper[4755]: E0317 01:27:03.250269 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:27:03 crc kubenswrapper[4755]: I0317 01:27:03.452283 4755 scope.go:117] "RemoveContainer" containerID="afce1ae55580b885a4da8cb06b97ad257e835f964788876c444812bbb5bf958d" Mar 17 01:27:14 crc kubenswrapper[4755]: I0317 01:27:14.249228 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:27:14 crc kubenswrapper[4755]: E0317 01:27:14.250390 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:27:28 crc kubenswrapper[4755]: I0317 01:27:28.249571 4755 scope.go:117] "RemoveContainer" 
containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:27:28 crc kubenswrapper[4755]: E0317 01:27:28.250297 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:27:42 crc kubenswrapper[4755]: I0317 01:27:42.249994 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:27:42 crc kubenswrapper[4755]: E0317 01:27:42.250676 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:27:57 crc kubenswrapper[4755]: I0317 01:27:57.249157 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:27:57 crc kubenswrapper[4755]: E0317 01:27:57.250260 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:28:00 crc kubenswrapper[4755]: I0317 01:28:00.177578 4755 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561848-hcl5w"] Mar 17 01:28:00 crc kubenswrapper[4755]: I0317 01:28:00.180271 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561848-hcl5w" Mar 17 01:28:00 crc kubenswrapper[4755]: I0317 01:28:00.183370 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:28:00 crc kubenswrapper[4755]: I0317 01:28:00.183535 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:28:00 crc kubenswrapper[4755]: I0317 01:28:00.183924 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:28:00 crc kubenswrapper[4755]: I0317 01:28:00.215580 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561848-hcl5w"] Mar 17 01:28:00 crc kubenswrapper[4755]: I0317 01:28:00.341577 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2xpm\" (UniqueName: \"kubernetes.io/projected/472a8684-6cf5-4a85-935f-47b1a188bf5d-kube-api-access-g2xpm\") pod \"auto-csr-approver-29561848-hcl5w\" (UID: \"472a8684-6cf5-4a85-935f-47b1a188bf5d\") " pod="openshift-infra/auto-csr-approver-29561848-hcl5w" Mar 17 01:28:00 crc kubenswrapper[4755]: I0317 01:28:00.443862 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2xpm\" (UniqueName: \"kubernetes.io/projected/472a8684-6cf5-4a85-935f-47b1a188bf5d-kube-api-access-g2xpm\") pod \"auto-csr-approver-29561848-hcl5w\" (UID: \"472a8684-6cf5-4a85-935f-47b1a188bf5d\") " pod="openshift-infra/auto-csr-approver-29561848-hcl5w" Mar 17 01:28:00 crc kubenswrapper[4755]: I0317 01:28:00.466736 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-g2xpm\" (UniqueName: \"kubernetes.io/projected/472a8684-6cf5-4a85-935f-47b1a188bf5d-kube-api-access-g2xpm\") pod \"auto-csr-approver-29561848-hcl5w\" (UID: \"472a8684-6cf5-4a85-935f-47b1a188bf5d\") " pod="openshift-infra/auto-csr-approver-29561848-hcl5w" Mar 17 01:28:00 crc kubenswrapper[4755]: I0317 01:28:00.510001 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561848-hcl5w" Mar 17 01:28:01 crc kubenswrapper[4755]: I0317 01:28:01.031887 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561848-hcl5w"] Mar 17 01:28:01 crc kubenswrapper[4755]: I0317 01:28:01.403475 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561848-hcl5w" event={"ID":"472a8684-6cf5-4a85-935f-47b1a188bf5d","Type":"ContainerStarted","Data":"1a87d3a36014f73735bde2d7b6d5820c45018149f285f7e059f0c1cd1933475a"} Mar 17 01:28:03 crc kubenswrapper[4755]: I0317 01:28:03.434971 4755 generic.go:334] "Generic (PLEG): container finished" podID="472a8684-6cf5-4a85-935f-47b1a188bf5d" containerID="e4afcb9dd7aebbe4689d79aa79dd0357cc79ecd45500c9661ef925f2df750a4c" exitCode=0 Mar 17 01:28:03 crc kubenswrapper[4755]: I0317 01:28:03.435102 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561848-hcl5w" event={"ID":"472a8684-6cf5-4a85-935f-47b1a188bf5d","Type":"ContainerDied","Data":"e4afcb9dd7aebbe4689d79aa79dd0357cc79ecd45500c9661ef925f2df750a4c"} Mar 17 01:28:04 crc kubenswrapper[4755]: I0317 01:28:04.926090 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561848-hcl5w" Mar 17 01:28:05 crc kubenswrapper[4755]: I0317 01:28:05.050986 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2xpm\" (UniqueName: \"kubernetes.io/projected/472a8684-6cf5-4a85-935f-47b1a188bf5d-kube-api-access-g2xpm\") pod \"472a8684-6cf5-4a85-935f-47b1a188bf5d\" (UID: \"472a8684-6cf5-4a85-935f-47b1a188bf5d\") " Mar 17 01:28:05 crc kubenswrapper[4755]: I0317 01:28:05.057102 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/472a8684-6cf5-4a85-935f-47b1a188bf5d-kube-api-access-g2xpm" (OuterVolumeSpecName: "kube-api-access-g2xpm") pod "472a8684-6cf5-4a85-935f-47b1a188bf5d" (UID: "472a8684-6cf5-4a85-935f-47b1a188bf5d"). InnerVolumeSpecName "kube-api-access-g2xpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:28:05 crc kubenswrapper[4755]: I0317 01:28:05.155583 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2xpm\" (UniqueName: \"kubernetes.io/projected/472a8684-6cf5-4a85-935f-47b1a188bf5d-kube-api-access-g2xpm\") on node \"crc\" DevicePath \"\"" Mar 17 01:28:05 crc kubenswrapper[4755]: I0317 01:28:05.463104 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561848-hcl5w" event={"ID":"472a8684-6cf5-4a85-935f-47b1a188bf5d","Type":"ContainerDied","Data":"1a87d3a36014f73735bde2d7b6d5820c45018149f285f7e059f0c1cd1933475a"} Mar 17 01:28:05 crc kubenswrapper[4755]: I0317 01:28:05.463567 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a87d3a36014f73735bde2d7b6d5820c45018149f285f7e059f0c1cd1933475a" Mar 17 01:28:05 crc kubenswrapper[4755]: I0317 01:28:05.463163 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561848-hcl5w" Mar 17 01:28:05 crc kubenswrapper[4755]: E0317 01:28:05.698212 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod472a8684_6cf5_4a85_935f_47b1a188bf5d.slice/crio-1a87d3a36014f73735bde2d7b6d5820c45018149f285f7e059f0c1cd1933475a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod472a8684_6cf5_4a85_935f_47b1a188bf5d.slice\": RecentStats: unable to find data in memory cache]" Mar 17 01:28:06 crc kubenswrapper[4755]: I0317 01:28:06.015638 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561842-fbz8p"] Mar 17 01:28:06 crc kubenswrapper[4755]: I0317 01:28:06.031715 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561842-fbz8p"] Mar 17 01:28:06 crc kubenswrapper[4755]: I0317 01:28:06.269834 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70715e34-4a49-41ba-a9ef-64be7b21ebbe" path="/var/lib/kubelet/pods/70715e34-4a49-41ba-a9ef-64be7b21ebbe/volumes" Mar 17 01:28:09 crc kubenswrapper[4755]: I0317 01:28:09.248279 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:28:09 crc kubenswrapper[4755]: E0317 01:28:09.249106 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:28:22 crc kubenswrapper[4755]: I0317 01:28:22.249563 4755 
scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:28:22 crc kubenswrapper[4755]: E0317 01:28:22.250979 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:28:34 crc kubenswrapper[4755]: I0317 01:28:34.248834 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:28:34 crc kubenswrapper[4755]: E0317 01:28:34.249920 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:28:45 crc kubenswrapper[4755]: I0317 01:28:45.248089 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:28:45 crc kubenswrapper[4755]: E0317 01:28:45.248899 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:28:57 crc kubenswrapper[4755]: I0317 
01:28:57.247797 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:28:57 crc kubenswrapper[4755]: E0317 01:28:57.248549 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:28:58 crc kubenswrapper[4755]: I0317 01:28:58.973494 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6vh6p"] Mar 17 01:28:58 crc kubenswrapper[4755]: E0317 01:28:58.974498 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472a8684-6cf5-4a85-935f-47b1a188bf5d" containerName="oc" Mar 17 01:28:58 crc kubenswrapper[4755]: I0317 01:28:58.974515 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="472a8684-6cf5-4a85-935f-47b1a188bf5d" containerName="oc" Mar 17 01:28:58 crc kubenswrapper[4755]: I0317 01:28:58.974767 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="472a8684-6cf5-4a85-935f-47b1a188bf5d" containerName="oc" Mar 17 01:28:58 crc kubenswrapper[4755]: I0317 01:28:58.976572 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vh6p"] Mar 17 01:28:58 crc kubenswrapper[4755]: I0317 01:28:58.976681 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vh6p" Mar 17 01:28:59 crc kubenswrapper[4755]: I0317 01:28:59.099991 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b78326c1-2e87-42b3-ad89-ca8be7acd252-catalog-content\") pod \"redhat-marketplace-6vh6p\" (UID: \"b78326c1-2e87-42b3-ad89-ca8be7acd252\") " pod="openshift-marketplace/redhat-marketplace-6vh6p" Mar 17 01:28:59 crc kubenswrapper[4755]: I0317 01:28:59.100042 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdpb6\" (UniqueName: \"kubernetes.io/projected/b78326c1-2e87-42b3-ad89-ca8be7acd252-kube-api-access-pdpb6\") pod \"redhat-marketplace-6vh6p\" (UID: \"b78326c1-2e87-42b3-ad89-ca8be7acd252\") " pod="openshift-marketplace/redhat-marketplace-6vh6p" Mar 17 01:28:59 crc kubenswrapper[4755]: I0317 01:28:59.100146 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b78326c1-2e87-42b3-ad89-ca8be7acd252-utilities\") pod \"redhat-marketplace-6vh6p\" (UID: \"b78326c1-2e87-42b3-ad89-ca8be7acd252\") " pod="openshift-marketplace/redhat-marketplace-6vh6p" Mar 17 01:28:59 crc kubenswrapper[4755]: I0317 01:28:59.202659 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdpb6\" (UniqueName: \"kubernetes.io/projected/b78326c1-2e87-42b3-ad89-ca8be7acd252-kube-api-access-pdpb6\") pod \"redhat-marketplace-6vh6p\" (UID: \"b78326c1-2e87-42b3-ad89-ca8be7acd252\") " pod="openshift-marketplace/redhat-marketplace-6vh6p" Mar 17 01:28:59 crc kubenswrapper[4755]: I0317 01:28:59.202796 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b78326c1-2e87-42b3-ad89-ca8be7acd252-utilities\") pod 
\"redhat-marketplace-6vh6p\" (UID: \"b78326c1-2e87-42b3-ad89-ca8be7acd252\") " pod="openshift-marketplace/redhat-marketplace-6vh6p" Mar 17 01:28:59 crc kubenswrapper[4755]: I0317 01:28:59.202910 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b78326c1-2e87-42b3-ad89-ca8be7acd252-catalog-content\") pod \"redhat-marketplace-6vh6p\" (UID: \"b78326c1-2e87-42b3-ad89-ca8be7acd252\") " pod="openshift-marketplace/redhat-marketplace-6vh6p" Mar 17 01:28:59 crc kubenswrapper[4755]: I0317 01:28:59.203416 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b78326c1-2e87-42b3-ad89-ca8be7acd252-utilities\") pod \"redhat-marketplace-6vh6p\" (UID: \"b78326c1-2e87-42b3-ad89-ca8be7acd252\") " pod="openshift-marketplace/redhat-marketplace-6vh6p" Mar 17 01:28:59 crc kubenswrapper[4755]: I0317 01:28:59.203477 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b78326c1-2e87-42b3-ad89-ca8be7acd252-catalog-content\") pod \"redhat-marketplace-6vh6p\" (UID: \"b78326c1-2e87-42b3-ad89-ca8be7acd252\") " pod="openshift-marketplace/redhat-marketplace-6vh6p" Mar 17 01:28:59 crc kubenswrapper[4755]: I0317 01:28:59.223718 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdpb6\" (UniqueName: \"kubernetes.io/projected/b78326c1-2e87-42b3-ad89-ca8be7acd252-kube-api-access-pdpb6\") pod \"redhat-marketplace-6vh6p\" (UID: \"b78326c1-2e87-42b3-ad89-ca8be7acd252\") " pod="openshift-marketplace/redhat-marketplace-6vh6p" Mar 17 01:28:59 crc kubenswrapper[4755]: I0317 01:28:59.310783 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vh6p" Mar 17 01:28:59 crc kubenswrapper[4755]: I0317 01:28:59.817897 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vh6p"] Mar 17 01:29:00 crc kubenswrapper[4755]: I0317 01:29:00.134849 4755 generic.go:334] "Generic (PLEG): container finished" podID="b78326c1-2e87-42b3-ad89-ca8be7acd252" containerID="9afaa5240f2f7f653c9f1cf83582fd17b97a9a258b21e5f8b456c2d79335b591" exitCode=0 Mar 17 01:29:00 crc kubenswrapper[4755]: I0317 01:29:00.134896 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vh6p" event={"ID":"b78326c1-2e87-42b3-ad89-ca8be7acd252","Type":"ContainerDied","Data":"9afaa5240f2f7f653c9f1cf83582fd17b97a9a258b21e5f8b456c2d79335b591"} Mar 17 01:29:00 crc kubenswrapper[4755]: I0317 01:29:00.134945 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vh6p" event={"ID":"b78326c1-2e87-42b3-ad89-ca8be7acd252","Type":"ContainerStarted","Data":"f42b69dac5fd6b3fab0116c47a7e203bd129eec284ba9b767ca8733fb267eb98"} Mar 17 01:29:01 crc kubenswrapper[4755]: I0317 01:29:01.158524 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vh6p" event={"ID":"b78326c1-2e87-42b3-ad89-ca8be7acd252","Type":"ContainerStarted","Data":"d41b6938844e0af0cb9d3ff446c58200b25d812aac3aef6cbdcc5629b876dd40"} Mar 17 01:29:02 crc kubenswrapper[4755]: I0317 01:29:02.181748 4755 generic.go:334] "Generic (PLEG): container finished" podID="b78326c1-2e87-42b3-ad89-ca8be7acd252" containerID="d41b6938844e0af0cb9d3ff446c58200b25d812aac3aef6cbdcc5629b876dd40" exitCode=0 Mar 17 01:29:02 crc kubenswrapper[4755]: I0317 01:29:02.181828 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vh6p" 
event={"ID":"b78326c1-2e87-42b3-ad89-ca8be7acd252","Type":"ContainerDied","Data":"d41b6938844e0af0cb9d3ff446c58200b25d812aac3aef6cbdcc5629b876dd40"} Mar 17 01:29:02 crc kubenswrapper[4755]: I0317 01:29:02.185492 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:29:03 crc kubenswrapper[4755]: I0317 01:29:03.197752 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vh6p" event={"ID":"b78326c1-2e87-42b3-ad89-ca8be7acd252","Type":"ContainerStarted","Data":"1ca9f49d7b733c572628c91f02f58344258f2d428ca74ada458034238f891c8f"} Mar 17 01:29:03 crc kubenswrapper[4755]: I0317 01:29:03.232603 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6vh6p" podStartSLOduration=2.7726886200000003 podStartE2EDuration="5.232581127s" podCreationTimestamp="2026-03-17 01:28:58 +0000 UTC" firstStartedPulling="2026-03-17 01:29:00.137647134 +0000 UTC m=+4014.897099427" lastFinishedPulling="2026-03-17 01:29:02.597539641 +0000 UTC m=+4017.356991934" observedRunningTime="2026-03-17 01:29:03.225584398 +0000 UTC m=+4017.985036751" watchObservedRunningTime="2026-03-17 01:29:03.232581127 +0000 UTC m=+4017.992033420" Mar 17 01:29:03 crc kubenswrapper[4755]: I0317 01:29:03.592653 4755 scope.go:117] "RemoveContainer" containerID="34a76bf9817658e616d792325755fc68e5e53544b0580124761732a3e705a19e" Mar 17 01:29:08 crc kubenswrapper[4755]: I0317 01:29:08.249233 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:29:08 crc kubenswrapper[4755]: E0317 01:29:08.250208 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:29:09 crc kubenswrapper[4755]: I0317 01:29:09.311908 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6vh6p" Mar 17 01:29:09 crc kubenswrapper[4755]: I0317 01:29:09.311951 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6vh6p" Mar 17 01:29:09 crc kubenswrapper[4755]: I0317 01:29:09.396528 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6vh6p" Mar 17 01:29:10 crc kubenswrapper[4755]: I0317 01:29:10.321459 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6vh6p" Mar 17 01:29:10 crc kubenswrapper[4755]: I0317 01:29:10.370347 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vh6p"] Mar 17 01:29:12 crc kubenswrapper[4755]: I0317 01:29:12.291493 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6vh6p" podUID="b78326c1-2e87-42b3-ad89-ca8be7acd252" containerName="registry-server" containerID="cri-o://1ca9f49d7b733c572628c91f02f58344258f2d428ca74ada458034238f891c8f" gracePeriod=2 Mar 17 01:29:12 crc kubenswrapper[4755]: I0317 01:29:12.793840 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vh6p" Mar 17 01:29:12 crc kubenswrapper[4755]: I0317 01:29:12.926478 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b78326c1-2e87-42b3-ad89-ca8be7acd252-catalog-content\") pod \"b78326c1-2e87-42b3-ad89-ca8be7acd252\" (UID: \"b78326c1-2e87-42b3-ad89-ca8be7acd252\") " Mar 17 01:29:12 crc kubenswrapper[4755]: I0317 01:29:12.926601 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b78326c1-2e87-42b3-ad89-ca8be7acd252-utilities\") pod \"b78326c1-2e87-42b3-ad89-ca8be7acd252\" (UID: \"b78326c1-2e87-42b3-ad89-ca8be7acd252\") " Mar 17 01:29:12 crc kubenswrapper[4755]: I0317 01:29:12.926804 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdpb6\" (UniqueName: \"kubernetes.io/projected/b78326c1-2e87-42b3-ad89-ca8be7acd252-kube-api-access-pdpb6\") pod \"b78326c1-2e87-42b3-ad89-ca8be7acd252\" (UID: \"b78326c1-2e87-42b3-ad89-ca8be7acd252\") " Mar 17 01:29:12 crc kubenswrapper[4755]: I0317 01:29:12.928238 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b78326c1-2e87-42b3-ad89-ca8be7acd252-utilities" (OuterVolumeSpecName: "utilities") pod "b78326c1-2e87-42b3-ad89-ca8be7acd252" (UID: "b78326c1-2e87-42b3-ad89-ca8be7acd252"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:29:12 crc kubenswrapper[4755]: I0317 01:29:12.936653 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b78326c1-2e87-42b3-ad89-ca8be7acd252-kube-api-access-pdpb6" (OuterVolumeSpecName: "kube-api-access-pdpb6") pod "b78326c1-2e87-42b3-ad89-ca8be7acd252" (UID: "b78326c1-2e87-42b3-ad89-ca8be7acd252"). InnerVolumeSpecName "kube-api-access-pdpb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:29:12 crc kubenswrapper[4755]: I0317 01:29:12.955777 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b78326c1-2e87-42b3-ad89-ca8be7acd252-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b78326c1-2e87-42b3-ad89-ca8be7acd252" (UID: "b78326c1-2e87-42b3-ad89-ca8be7acd252"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.032617 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdpb6\" (UniqueName: \"kubernetes.io/projected/b78326c1-2e87-42b3-ad89-ca8be7acd252-kube-api-access-pdpb6\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.032671 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b78326c1-2e87-42b3-ad89-ca8be7acd252-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.032684 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b78326c1-2e87-42b3-ad89-ca8be7acd252-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.309890 4755 generic.go:334] "Generic (PLEG): container finished" podID="b78326c1-2e87-42b3-ad89-ca8be7acd252" containerID="1ca9f49d7b733c572628c91f02f58344258f2d428ca74ada458034238f891c8f" exitCode=0 Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.309963 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vh6p" Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.309991 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vh6p" event={"ID":"b78326c1-2e87-42b3-ad89-ca8be7acd252","Type":"ContainerDied","Data":"1ca9f49d7b733c572628c91f02f58344258f2d428ca74ada458034238f891c8f"} Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.310598 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vh6p" event={"ID":"b78326c1-2e87-42b3-ad89-ca8be7acd252","Type":"ContainerDied","Data":"f42b69dac5fd6b3fab0116c47a7e203bd129eec284ba9b767ca8733fb267eb98"} Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.310627 4755 scope.go:117] "RemoveContainer" containerID="1ca9f49d7b733c572628c91f02f58344258f2d428ca74ada458034238f891c8f" Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.347015 4755 scope.go:117] "RemoveContainer" containerID="d41b6938844e0af0cb9d3ff446c58200b25d812aac3aef6cbdcc5629b876dd40" Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.395678 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vh6p"] Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.403910 4755 scope.go:117] "RemoveContainer" containerID="9afaa5240f2f7f653c9f1cf83582fd17b97a9a258b21e5f8b456c2d79335b591" Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.415686 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vh6p"] Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.430619 4755 scope.go:117] "RemoveContainer" containerID="1ca9f49d7b733c572628c91f02f58344258f2d428ca74ada458034238f891c8f" Mar 17 01:29:13 crc kubenswrapper[4755]: E0317 01:29:13.433710 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1ca9f49d7b733c572628c91f02f58344258f2d428ca74ada458034238f891c8f\": container with ID starting with 1ca9f49d7b733c572628c91f02f58344258f2d428ca74ada458034238f891c8f not found: ID does not exist" containerID="1ca9f49d7b733c572628c91f02f58344258f2d428ca74ada458034238f891c8f" Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.433765 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ca9f49d7b733c572628c91f02f58344258f2d428ca74ada458034238f891c8f"} err="failed to get container status \"1ca9f49d7b733c572628c91f02f58344258f2d428ca74ada458034238f891c8f\": rpc error: code = NotFound desc = could not find container \"1ca9f49d7b733c572628c91f02f58344258f2d428ca74ada458034238f891c8f\": container with ID starting with 1ca9f49d7b733c572628c91f02f58344258f2d428ca74ada458034238f891c8f not found: ID does not exist" Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.433798 4755 scope.go:117] "RemoveContainer" containerID="d41b6938844e0af0cb9d3ff446c58200b25d812aac3aef6cbdcc5629b876dd40" Mar 17 01:29:13 crc kubenswrapper[4755]: E0317 01:29:13.434367 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d41b6938844e0af0cb9d3ff446c58200b25d812aac3aef6cbdcc5629b876dd40\": container with ID starting with d41b6938844e0af0cb9d3ff446c58200b25d812aac3aef6cbdcc5629b876dd40 not found: ID does not exist" containerID="d41b6938844e0af0cb9d3ff446c58200b25d812aac3aef6cbdcc5629b876dd40" Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.434408 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d41b6938844e0af0cb9d3ff446c58200b25d812aac3aef6cbdcc5629b876dd40"} err="failed to get container status \"d41b6938844e0af0cb9d3ff446c58200b25d812aac3aef6cbdcc5629b876dd40\": rpc error: code = NotFound desc = could not find container \"d41b6938844e0af0cb9d3ff446c58200b25d812aac3aef6cbdcc5629b876dd40\": container with ID 
starting with d41b6938844e0af0cb9d3ff446c58200b25d812aac3aef6cbdcc5629b876dd40 not found: ID does not exist" Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.434481 4755 scope.go:117] "RemoveContainer" containerID="9afaa5240f2f7f653c9f1cf83582fd17b97a9a258b21e5f8b456c2d79335b591" Mar 17 01:29:13 crc kubenswrapper[4755]: E0317 01:29:13.434793 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9afaa5240f2f7f653c9f1cf83582fd17b97a9a258b21e5f8b456c2d79335b591\": container with ID starting with 9afaa5240f2f7f653c9f1cf83582fd17b97a9a258b21e5f8b456c2d79335b591 not found: ID does not exist" containerID="9afaa5240f2f7f653c9f1cf83582fd17b97a9a258b21e5f8b456c2d79335b591" Mar 17 01:29:13 crc kubenswrapper[4755]: I0317 01:29:13.434829 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9afaa5240f2f7f653c9f1cf83582fd17b97a9a258b21e5f8b456c2d79335b591"} err="failed to get container status \"9afaa5240f2f7f653c9f1cf83582fd17b97a9a258b21e5f8b456c2d79335b591\": rpc error: code = NotFound desc = could not find container \"9afaa5240f2f7f653c9f1cf83582fd17b97a9a258b21e5f8b456c2d79335b591\": container with ID starting with 9afaa5240f2f7f653c9f1cf83582fd17b97a9a258b21e5f8b456c2d79335b591 not found: ID does not exist" Mar 17 01:29:14 crc kubenswrapper[4755]: I0317 01:29:14.263741 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b78326c1-2e87-42b3-ad89-ca8be7acd252" path="/var/lib/kubelet/pods/b78326c1-2e87-42b3-ad89-ca8be7acd252/volumes" Mar 17 01:29:18 crc kubenswrapper[4755]: I0317 01:29:18.306629 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r2vmw"] Mar 17 01:29:18 crc kubenswrapper[4755]: E0317 01:29:18.307636 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b78326c1-2e87-42b3-ad89-ca8be7acd252" containerName="extract-utilities" Mar 17 01:29:18 crc 
kubenswrapper[4755]: I0317 01:29:18.307651 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b78326c1-2e87-42b3-ad89-ca8be7acd252" containerName="extract-utilities" Mar 17 01:29:18 crc kubenswrapper[4755]: E0317 01:29:18.307677 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b78326c1-2e87-42b3-ad89-ca8be7acd252" containerName="registry-server" Mar 17 01:29:18 crc kubenswrapper[4755]: I0317 01:29:18.307687 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b78326c1-2e87-42b3-ad89-ca8be7acd252" containerName="registry-server" Mar 17 01:29:18 crc kubenswrapper[4755]: E0317 01:29:18.307705 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b78326c1-2e87-42b3-ad89-ca8be7acd252" containerName="extract-content" Mar 17 01:29:18 crc kubenswrapper[4755]: I0317 01:29:18.307713 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b78326c1-2e87-42b3-ad89-ca8be7acd252" containerName="extract-content" Mar 17 01:29:18 crc kubenswrapper[4755]: I0317 01:29:18.307982 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b78326c1-2e87-42b3-ad89-ca8be7acd252" containerName="registry-server" Mar 17 01:29:18 crc kubenswrapper[4755]: I0317 01:29:18.309750 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r2vmw" Mar 17 01:29:18 crc kubenswrapper[4755]: I0317 01:29:18.338839 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r2vmw"] Mar 17 01:29:18 crc kubenswrapper[4755]: I0317 01:29:18.466186 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thcpq\" (UniqueName: \"kubernetes.io/projected/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-kube-api-access-thcpq\") pod \"certified-operators-r2vmw\" (UID: \"ee6f1763-281b-4a4d-a0ae-2665122ed9b3\") " pod="openshift-marketplace/certified-operators-r2vmw" Mar 17 01:29:18 crc kubenswrapper[4755]: I0317 01:29:18.466556 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-utilities\") pod \"certified-operators-r2vmw\" (UID: \"ee6f1763-281b-4a4d-a0ae-2665122ed9b3\") " pod="openshift-marketplace/certified-operators-r2vmw" Mar 17 01:29:18 crc kubenswrapper[4755]: I0317 01:29:18.466606 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-catalog-content\") pod \"certified-operators-r2vmw\" (UID: \"ee6f1763-281b-4a4d-a0ae-2665122ed9b3\") " pod="openshift-marketplace/certified-operators-r2vmw" Mar 17 01:29:18 crc kubenswrapper[4755]: I0317 01:29:18.568097 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thcpq\" (UniqueName: \"kubernetes.io/projected/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-kube-api-access-thcpq\") pod \"certified-operators-r2vmw\" (UID: \"ee6f1763-281b-4a4d-a0ae-2665122ed9b3\") " pod="openshift-marketplace/certified-operators-r2vmw" Mar 17 01:29:18 crc kubenswrapper[4755]: I0317 01:29:18.568212 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-utilities\") pod \"certified-operators-r2vmw\" (UID: \"ee6f1763-281b-4a4d-a0ae-2665122ed9b3\") " pod="openshift-marketplace/certified-operators-r2vmw" Mar 17 01:29:18 crc kubenswrapper[4755]: I0317 01:29:18.568290 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-catalog-content\") pod \"certified-operators-r2vmw\" (UID: \"ee6f1763-281b-4a4d-a0ae-2665122ed9b3\") " pod="openshift-marketplace/certified-operators-r2vmw" Mar 17 01:29:18 crc kubenswrapper[4755]: I0317 01:29:18.568831 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-utilities\") pod \"certified-operators-r2vmw\" (UID: \"ee6f1763-281b-4a4d-a0ae-2665122ed9b3\") " pod="openshift-marketplace/certified-operators-r2vmw" Mar 17 01:29:18 crc kubenswrapper[4755]: I0317 01:29:18.568950 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-catalog-content\") pod \"certified-operators-r2vmw\" (UID: \"ee6f1763-281b-4a4d-a0ae-2665122ed9b3\") " pod="openshift-marketplace/certified-operators-r2vmw" Mar 17 01:29:18 crc kubenswrapper[4755]: I0317 01:29:18.595586 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thcpq\" (UniqueName: \"kubernetes.io/projected/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-kube-api-access-thcpq\") pod \"certified-operators-r2vmw\" (UID: \"ee6f1763-281b-4a4d-a0ae-2665122ed9b3\") " pod="openshift-marketplace/certified-operators-r2vmw" Mar 17 01:29:18 crc kubenswrapper[4755]: I0317 01:29:18.645573 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r2vmw" Mar 17 01:29:19 crc kubenswrapper[4755]: I0317 01:29:19.112486 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r2vmw"] Mar 17 01:29:19 crc kubenswrapper[4755]: I0317 01:29:19.248986 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:29:19 crc kubenswrapper[4755]: E0317 01:29:19.249249 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:29:19 crc kubenswrapper[4755]: I0317 01:29:19.381499 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2vmw" event={"ID":"ee6f1763-281b-4a4d-a0ae-2665122ed9b3","Type":"ContainerStarted","Data":"af8267a85848d58fc5dd1005b6bdc3c3db1a8ec032e653b5cc7ce46ac5ec56be"} Mar 17 01:29:20 crc kubenswrapper[4755]: I0317 01:29:20.393918 4755 generic.go:334] "Generic (PLEG): container finished" podID="ee6f1763-281b-4a4d-a0ae-2665122ed9b3" containerID="6b0f09cb88475a18ee0261a29af2cd8d93fdecd19a7bb7e1407a552e03d677c5" exitCode=0 Mar 17 01:29:20 crc kubenswrapper[4755]: I0317 01:29:20.394030 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2vmw" event={"ID":"ee6f1763-281b-4a4d-a0ae-2665122ed9b3","Type":"ContainerDied","Data":"6b0f09cb88475a18ee0261a29af2cd8d93fdecd19a7bb7e1407a552e03d677c5"} Mar 17 01:29:27 crc kubenswrapper[4755]: I0317 01:29:27.486103 4755 generic.go:334] "Generic (PLEG): container finished" podID="ee6f1763-281b-4a4d-a0ae-2665122ed9b3" 
containerID="4f45208792d07662e8839a5189e3bfa7b60ca70d5c7525c4fd4e08295d0255e7" exitCode=0 Mar 17 01:29:27 crc kubenswrapper[4755]: I0317 01:29:27.486722 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2vmw" event={"ID":"ee6f1763-281b-4a4d-a0ae-2665122ed9b3","Type":"ContainerDied","Data":"4f45208792d07662e8839a5189e3bfa7b60ca70d5c7525c4fd4e08295d0255e7"} Mar 17 01:29:28 crc kubenswrapper[4755]: I0317 01:29:28.502225 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2vmw" event={"ID":"ee6f1763-281b-4a4d-a0ae-2665122ed9b3","Type":"ContainerStarted","Data":"4d419de0b5df6a659848a4ba3aa4a807a053436151903e112f691922c6dcbfe4"} Mar 17 01:29:28 crc kubenswrapper[4755]: I0317 01:29:28.540721 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r2vmw" podStartSLOduration=3.017888667 podStartE2EDuration="10.54069344s" podCreationTimestamp="2026-03-17 01:29:18 +0000 UTC" firstStartedPulling="2026-03-17 01:29:20.39574538 +0000 UTC m=+4035.155197703" lastFinishedPulling="2026-03-17 01:29:27.918550203 +0000 UTC m=+4042.678002476" observedRunningTime="2026-03-17 01:29:28.525792938 +0000 UTC m=+4043.285245231" watchObservedRunningTime="2026-03-17 01:29:28.54069344 +0000 UTC m=+4043.300145763" Mar 17 01:29:28 crc kubenswrapper[4755]: I0317 01:29:28.645912 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r2vmw" Mar 17 01:29:28 crc kubenswrapper[4755]: I0317 01:29:28.645966 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r2vmw" Mar 17 01:29:29 crc kubenswrapper[4755]: I0317 01:29:29.689597 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-r2vmw" podUID="ee6f1763-281b-4a4d-a0ae-2665122ed9b3" containerName="registry-server" 
probeResult="failure" output=< Mar 17 01:29:29 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 01:29:29 crc kubenswrapper[4755]: > Mar 17 01:29:32 crc kubenswrapper[4755]: I0317 01:29:32.247930 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:29:32 crc kubenswrapper[4755]: E0317 01:29:32.248549 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:29:38 crc kubenswrapper[4755]: I0317 01:29:38.695906 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r2vmw" Mar 17 01:29:38 crc kubenswrapper[4755]: I0317 01:29:38.775407 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r2vmw" Mar 17 01:29:38 crc kubenswrapper[4755]: I0317 01:29:38.886350 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r2vmw"] Mar 17 01:29:38 crc kubenswrapper[4755]: I0317 01:29:38.952948 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bc8sz"] Mar 17 01:29:38 crc kubenswrapper[4755]: I0317 01:29:38.953198 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bc8sz" podUID="bf109104-00dd-4525-a6e2-31cfccf54d5d" containerName="registry-server" containerID="cri-o://259823c942f58b9015080d97d5a7643d7bd99ce6cb7e9302cb7589714f0fe982" gracePeriod=2 Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 
01:29:39.485366 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bc8sz" Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.500725 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbm4r\" (UniqueName: \"kubernetes.io/projected/bf109104-00dd-4525-a6e2-31cfccf54d5d-kube-api-access-fbm4r\") pod \"bf109104-00dd-4525-a6e2-31cfccf54d5d\" (UID: \"bf109104-00dd-4525-a6e2-31cfccf54d5d\") " Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.500803 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf109104-00dd-4525-a6e2-31cfccf54d5d-utilities\") pod \"bf109104-00dd-4525-a6e2-31cfccf54d5d\" (UID: \"bf109104-00dd-4525-a6e2-31cfccf54d5d\") " Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.501198 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf109104-00dd-4525-a6e2-31cfccf54d5d-catalog-content\") pod \"bf109104-00dd-4525-a6e2-31cfccf54d5d\" (UID: \"bf109104-00dd-4525-a6e2-31cfccf54d5d\") " Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.501639 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf109104-00dd-4525-a6e2-31cfccf54d5d-utilities" (OuterVolumeSpecName: "utilities") pod "bf109104-00dd-4525-a6e2-31cfccf54d5d" (UID: "bf109104-00dd-4525-a6e2-31cfccf54d5d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.502027 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf109104-00dd-4525-a6e2-31cfccf54d5d-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.537760 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf109104-00dd-4525-a6e2-31cfccf54d5d-kube-api-access-fbm4r" (OuterVolumeSpecName: "kube-api-access-fbm4r") pod "bf109104-00dd-4525-a6e2-31cfccf54d5d" (UID: "bf109104-00dd-4525-a6e2-31cfccf54d5d"). InnerVolumeSpecName "kube-api-access-fbm4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.589552 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf109104-00dd-4525-a6e2-31cfccf54d5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf109104-00dd-4525-a6e2-31cfccf54d5d" (UID: "bf109104-00dd-4525-a6e2-31cfccf54d5d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.604048 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf109104-00dd-4525-a6e2-31cfccf54d5d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.604091 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbm4r\" (UniqueName: \"kubernetes.io/projected/bf109104-00dd-4525-a6e2-31cfccf54d5d-kube-api-access-fbm4r\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.637574 4755 generic.go:334] "Generic (PLEG): container finished" podID="bf109104-00dd-4525-a6e2-31cfccf54d5d" containerID="259823c942f58b9015080d97d5a7643d7bd99ce6cb7e9302cb7589714f0fe982" exitCode=0 Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.637643 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bc8sz" Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.637636 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bc8sz" event={"ID":"bf109104-00dd-4525-a6e2-31cfccf54d5d","Type":"ContainerDied","Data":"259823c942f58b9015080d97d5a7643d7bd99ce6cb7e9302cb7589714f0fe982"} Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.637750 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bc8sz" event={"ID":"bf109104-00dd-4525-a6e2-31cfccf54d5d","Type":"ContainerDied","Data":"491b775dc469d3a6a501c51ec9684cae0274d949757c215a345e2b3e95984910"} Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.637775 4755 scope.go:117] "RemoveContainer" containerID="259823c942f58b9015080d97d5a7643d7bd99ce6cb7e9302cb7589714f0fe982" Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.665793 4755 scope.go:117] "RemoveContainer" 
containerID="f5d6ddd3b7ec9031fa9058fb9f208c0e131bece09442695b1a77445b1afd0a21" Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.677042 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bc8sz"] Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.688270 4755 scope.go:117] "RemoveContainer" containerID="980e6a5fe97e2af7c8c06bd1426d31315ac1666dfb0e475fe6ff0cdc1d36a118" Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.691628 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bc8sz"] Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.745482 4755 scope.go:117] "RemoveContainer" containerID="259823c942f58b9015080d97d5a7643d7bd99ce6cb7e9302cb7589714f0fe982" Mar 17 01:29:39 crc kubenswrapper[4755]: E0317 01:29:39.746003 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"259823c942f58b9015080d97d5a7643d7bd99ce6cb7e9302cb7589714f0fe982\": container with ID starting with 259823c942f58b9015080d97d5a7643d7bd99ce6cb7e9302cb7589714f0fe982 not found: ID does not exist" containerID="259823c942f58b9015080d97d5a7643d7bd99ce6cb7e9302cb7589714f0fe982" Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.746057 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"259823c942f58b9015080d97d5a7643d7bd99ce6cb7e9302cb7589714f0fe982"} err="failed to get container status \"259823c942f58b9015080d97d5a7643d7bd99ce6cb7e9302cb7589714f0fe982\": rpc error: code = NotFound desc = could not find container \"259823c942f58b9015080d97d5a7643d7bd99ce6cb7e9302cb7589714f0fe982\": container with ID starting with 259823c942f58b9015080d97d5a7643d7bd99ce6cb7e9302cb7589714f0fe982 not found: ID does not exist" Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.746084 4755 scope.go:117] "RemoveContainer" 
containerID="f5d6ddd3b7ec9031fa9058fb9f208c0e131bece09442695b1a77445b1afd0a21" Mar 17 01:29:39 crc kubenswrapper[4755]: E0317 01:29:39.746366 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5d6ddd3b7ec9031fa9058fb9f208c0e131bece09442695b1a77445b1afd0a21\": container with ID starting with f5d6ddd3b7ec9031fa9058fb9f208c0e131bece09442695b1a77445b1afd0a21 not found: ID does not exist" containerID="f5d6ddd3b7ec9031fa9058fb9f208c0e131bece09442695b1a77445b1afd0a21" Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.746404 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5d6ddd3b7ec9031fa9058fb9f208c0e131bece09442695b1a77445b1afd0a21"} err="failed to get container status \"f5d6ddd3b7ec9031fa9058fb9f208c0e131bece09442695b1a77445b1afd0a21\": rpc error: code = NotFound desc = could not find container \"f5d6ddd3b7ec9031fa9058fb9f208c0e131bece09442695b1a77445b1afd0a21\": container with ID starting with f5d6ddd3b7ec9031fa9058fb9f208c0e131bece09442695b1a77445b1afd0a21 not found: ID does not exist" Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.746432 4755 scope.go:117] "RemoveContainer" containerID="980e6a5fe97e2af7c8c06bd1426d31315ac1666dfb0e475fe6ff0cdc1d36a118" Mar 17 01:29:39 crc kubenswrapper[4755]: E0317 01:29:39.746700 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"980e6a5fe97e2af7c8c06bd1426d31315ac1666dfb0e475fe6ff0cdc1d36a118\": container with ID starting with 980e6a5fe97e2af7c8c06bd1426d31315ac1666dfb0e475fe6ff0cdc1d36a118 not found: ID does not exist" containerID="980e6a5fe97e2af7c8c06bd1426d31315ac1666dfb0e475fe6ff0cdc1d36a118" Mar 17 01:29:39 crc kubenswrapper[4755]: I0317 01:29:39.746721 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"980e6a5fe97e2af7c8c06bd1426d31315ac1666dfb0e475fe6ff0cdc1d36a118"} err="failed to get container status \"980e6a5fe97e2af7c8c06bd1426d31315ac1666dfb0e475fe6ff0cdc1d36a118\": rpc error: code = NotFound desc = could not find container \"980e6a5fe97e2af7c8c06bd1426d31315ac1666dfb0e475fe6ff0cdc1d36a118\": container with ID starting with 980e6a5fe97e2af7c8c06bd1426d31315ac1666dfb0e475fe6ff0cdc1d36a118 not found: ID does not exist" Mar 17 01:29:40 crc kubenswrapper[4755]: I0317 01:29:40.269048 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf109104-00dd-4525-a6e2-31cfccf54d5d" path="/var/lib/kubelet/pods/bf109104-00dd-4525-a6e2-31cfccf54d5d/volumes" Mar 17 01:29:43 crc kubenswrapper[4755]: I0317 01:29:43.249190 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:29:43 crc kubenswrapper[4755]: E0317 01:29:43.249913 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:29:43 crc kubenswrapper[4755]: I0317 01:29:43.684166 4755 generic.go:334] "Generic (PLEG): container finished" podID="d0c84d8b-60dc-4e23-a4be-83b81c52f10f" containerID="7f82c5fda6dff242f42a6e8902cd2dc6c9e680bd763c9d2132312c3e738ae693" exitCode=0 Mar 17 01:29:43 crc kubenswrapper[4755]: I0317 01:29:43.684213 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" event={"ID":"d0c84d8b-60dc-4e23-a4be-83b81c52f10f","Type":"ContainerDied","Data":"7f82c5fda6dff242f42a6e8902cd2dc6c9e680bd763c9d2132312c3e738ae693"} Mar 17 01:29:45 
crc kubenswrapper[4755]: I0317 01:29:45.307394 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.436485 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-2\") pod \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.436604 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ceph-nova-0\") pod \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.436648 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-1\") pod \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.436740 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ssh-key-openstack-edpm-ipam\") pod \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.436797 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-custom-ceph-combined-ca-bundle\") pod 
\"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.436828 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-migration-ssh-key-1\") pod \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.436858 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-inventory\") pod \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.436893 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-migration-ssh-key-0\") pod \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.436914 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-extra-config-0\") pod \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.436965 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzmn2\" (UniqueName: \"kubernetes.io/projected/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-kube-api-access-bzmn2\") pod \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.436996 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ceph\") pod \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.437071 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-0\") pod \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.437122 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-3\") pod \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\" (UID: \"d0c84d8b-60dc-4e23-a4be-83b81c52f10f\") " Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.444215 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-kube-api-access-bzmn2" (OuterVolumeSpecName: "kube-api-access-bzmn2") pod "d0c84d8b-60dc-4e23-a4be-83b81c52f10f" (UID: "d0c84d8b-60dc-4e23-a4be-83b81c52f10f"). InnerVolumeSpecName "kube-api-access-bzmn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.470636 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "d0c84d8b-60dc-4e23-a4be-83b81c52f10f" (UID: "d0c84d8b-60dc-4e23-a4be-83b81c52f10f"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.487165 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "d0c84d8b-60dc-4e23-a4be-83b81c52f10f" (UID: "d0c84d8b-60dc-4e23-a4be-83b81c52f10f"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.488047 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "d0c84d8b-60dc-4e23-a4be-83b81c52f10f" (UID: "d0c84d8b-60dc-4e23-a4be-83b81c52f10f"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.496639 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ceph" (OuterVolumeSpecName: "ceph") pod "d0c84d8b-60dc-4e23-a4be-83b81c52f10f" (UID: "d0c84d8b-60dc-4e23-a4be-83b81c52f10f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.502859 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "d0c84d8b-60dc-4e23-a4be-83b81c52f10f" (UID: "d0c84d8b-60dc-4e23-a4be-83b81c52f10f"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.504241 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "d0c84d8b-60dc-4e23-a4be-83b81c52f10f" (UID: "d0c84d8b-60dc-4e23-a4be-83b81c52f10f"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.509720 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "d0c84d8b-60dc-4e23-a4be-83b81c52f10f" (UID: "d0c84d8b-60dc-4e23-a4be-83b81c52f10f"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.515908 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "d0c84d8b-60dc-4e23-a4be-83b81c52f10f" (UID: "d0c84d8b-60dc-4e23-a4be-83b81c52f10f"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.516855 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "d0c84d8b-60dc-4e23-a4be-83b81c52f10f" (UID: "d0c84d8b-60dc-4e23-a4be-83b81c52f10f"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.518157 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-inventory" (OuterVolumeSpecName: "inventory") pod "d0c84d8b-60dc-4e23-a4be-83b81c52f10f" (UID: "d0c84d8b-60dc-4e23-a4be-83b81c52f10f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.519081 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "d0c84d8b-60dc-4e23-a4be-83b81c52f10f" (UID: "d0c84d8b-60dc-4e23-a4be-83b81c52f10f"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.530738 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d0c84d8b-60dc-4e23-a4be-83b81c52f10f" (UID: "d0c84d8b-60dc-4e23-a4be-83b81c52f10f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.539993 4755 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.540028 4755 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.540038 4755 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.540048 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.540056 4755 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.540067 4755 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.540078 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.540087 4755 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.540095 4755 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.540103 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzmn2\" (UniqueName: \"kubernetes.io/projected/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-kube-api-access-bzmn2\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.540113 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.540120 4755 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.540129 4755 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d0c84d8b-60dc-4e23-a4be-83b81c52f10f-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.719071 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" 
event={"ID":"d0c84d8b-60dc-4e23-a4be-83b81c52f10f","Type":"ContainerDied","Data":"99ae71f2f86dbb92635a16354f01251504c6564545675afdb834d9fd8cb29730"} Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.719164 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99ae71f2f86dbb92635a16354f01251504c6564545675afdb834d9fd8cb29730" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.719299 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.847607 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h"] Mar 17 01:29:45 crc kubenswrapper[4755]: E0317 01:29:45.847984 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf109104-00dd-4525-a6e2-31cfccf54d5d" containerName="extract-utilities" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.848001 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf109104-00dd-4525-a6e2-31cfccf54d5d" containerName="extract-utilities" Mar 17 01:29:45 crc kubenswrapper[4755]: E0317 01:29:45.848018 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf109104-00dd-4525-a6e2-31cfccf54d5d" containerName="extract-content" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.848027 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf109104-00dd-4525-a6e2-31cfccf54d5d" containerName="extract-content" Mar 17 01:29:45 crc kubenswrapper[4755]: E0317 01:29:45.848044 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf109104-00dd-4525-a6e2-31cfccf54d5d" containerName="registry-server" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.848050 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf109104-00dd-4525-a6e2-31cfccf54d5d" containerName="registry-server" Mar 17 01:29:45 crc 
kubenswrapper[4755]: E0317 01:29:45.848065 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c84d8b-60dc-4e23-a4be-83b81c52f10f" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.848072 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c84d8b-60dc-4e23-a4be-83b81c52f10f" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.848265 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf109104-00dd-4525-a6e2-31cfccf54d5d" containerName="registry-server" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.848289 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0c84d8b-60dc-4e23-a4be-83b81c52f10f" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.848959 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.853487 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.853833 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.854070 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.854365 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.854645 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.854935 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.878732 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h"] Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.952123 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.952467 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.952661 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.952855 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.952885 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceph\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.952985 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.953212 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:45 crc kubenswrapper[4755]: I0317 01:29:45.953404 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czsx8\" (UniqueName: \"kubernetes.io/projected/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-kube-api-access-czsx8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.055467 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czsx8\" (UniqueName: \"kubernetes.io/projected/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-kube-api-access-czsx8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.055553 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: 
\"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.055575 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.055605 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.055656 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.055673 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceph\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.055705 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.055751 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.060418 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.060459 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.060732 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.061942 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.062066 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceph\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.062115 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.063912 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.073852 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czsx8\" (UniqueName: \"kubernetes.io/projected/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-kube-api-access-czsx8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.174208 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:29:46 crc kubenswrapper[4755]: I0317 01:29:46.762362 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h"] Mar 17 01:29:47 crc kubenswrapper[4755]: I0317 01:29:47.739174 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" event={"ID":"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109","Type":"ContainerStarted","Data":"f54050a4a24e1bdfe74b3fbc6bf0abda670283ab103eaa7be029bec1e3cc58ee"} Mar 17 01:29:47 crc kubenswrapper[4755]: I0317 01:29:47.739759 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" event={"ID":"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109","Type":"ContainerStarted","Data":"cd98770c0964a3343fb92c9055ddc2e2c179a178dc3735719a525209a09898cc"} Mar 17 01:29:47 crc kubenswrapper[4755]: I0317 01:29:47.771463 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" podStartSLOduration=2.328458781 podStartE2EDuration="2.771413985s" podCreationTimestamp="2026-03-17 01:29:45 +0000 UTC" firstStartedPulling="2026-03-17 01:29:46.762742656 +0000 UTC m=+4061.522194939" 
lastFinishedPulling="2026-03-17 01:29:47.20569786 +0000 UTC m=+4061.965150143" observedRunningTime="2026-03-17 01:29:47.763252074 +0000 UTC m=+4062.522704357" watchObservedRunningTime="2026-03-17 01:29:47.771413985 +0000 UTC m=+4062.530866288" Mar 17 01:29:56 crc kubenswrapper[4755]: I0317 01:29:56.283170 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:29:56 crc kubenswrapper[4755]: E0317 01:29:56.284576 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.169888 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561850-hlrqm"] Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.173783 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561850-hlrqm" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.177599 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.178321 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.178964 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.196985 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj"] Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.201596 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.204860 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.205076 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.242956 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561850-hlrqm"] Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.271891 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj"] Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.325985 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/f27dace0-fabc-4954-b257-8c7911349adc-config-volume\") pod \"collect-profiles-29561850-4x7zj\" (UID: \"f27dace0-fabc-4954-b257-8c7911349adc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.326135 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t22pk\" (UniqueName: \"kubernetes.io/projected/f27dace0-fabc-4954-b257-8c7911349adc-kube-api-access-t22pk\") pod \"collect-profiles-29561850-4x7zj\" (UID: \"f27dace0-fabc-4954-b257-8c7911349adc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.326458 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjjc7\" (UniqueName: \"kubernetes.io/projected/d18e884a-6fc0-49f4-9824-8d6fec65fa9a-kube-api-access-jjjc7\") pod \"auto-csr-approver-29561850-hlrqm\" (UID: \"d18e884a-6fc0-49f4-9824-8d6fec65fa9a\") " pod="openshift-infra/auto-csr-approver-29561850-hlrqm" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.326823 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f27dace0-fabc-4954-b257-8c7911349adc-secret-volume\") pod \"collect-profiles-29561850-4x7zj\" (UID: \"f27dace0-fabc-4954-b257-8c7911349adc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.429420 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f27dace0-fabc-4954-b257-8c7911349adc-secret-volume\") pod \"collect-profiles-29561850-4x7zj\" (UID: \"f27dace0-fabc-4954-b257-8c7911349adc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj" Mar 17 
01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.429555 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f27dace0-fabc-4954-b257-8c7911349adc-config-volume\") pod \"collect-profiles-29561850-4x7zj\" (UID: \"f27dace0-fabc-4954-b257-8c7911349adc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.429707 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t22pk\" (UniqueName: \"kubernetes.io/projected/f27dace0-fabc-4954-b257-8c7911349adc-kube-api-access-t22pk\") pod \"collect-profiles-29561850-4x7zj\" (UID: \"f27dace0-fabc-4954-b257-8c7911349adc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.429835 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjjc7\" (UniqueName: \"kubernetes.io/projected/d18e884a-6fc0-49f4-9824-8d6fec65fa9a-kube-api-access-jjjc7\") pod \"auto-csr-approver-29561850-hlrqm\" (UID: \"d18e884a-6fc0-49f4-9824-8d6fec65fa9a\") " pod="openshift-infra/auto-csr-approver-29561850-hlrqm" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.431026 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f27dace0-fabc-4954-b257-8c7911349adc-config-volume\") pod \"collect-profiles-29561850-4x7zj\" (UID: \"f27dace0-fabc-4954-b257-8c7911349adc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.438148 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f27dace0-fabc-4954-b257-8c7911349adc-secret-volume\") pod \"collect-profiles-29561850-4x7zj\" (UID: 
\"f27dace0-fabc-4954-b257-8c7911349adc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.457151 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjjc7\" (UniqueName: \"kubernetes.io/projected/d18e884a-6fc0-49f4-9824-8d6fec65fa9a-kube-api-access-jjjc7\") pod \"auto-csr-approver-29561850-hlrqm\" (UID: \"d18e884a-6fc0-49f4-9824-8d6fec65fa9a\") " pod="openshift-infra/auto-csr-approver-29561850-hlrqm" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.458067 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t22pk\" (UniqueName: \"kubernetes.io/projected/f27dace0-fabc-4954-b257-8c7911349adc-kube-api-access-t22pk\") pod \"collect-profiles-29561850-4x7zj\" (UID: \"f27dace0-fabc-4954-b257-8c7911349adc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.503996 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561850-hlrqm" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.521917 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj" Mar 17 01:30:00 crc kubenswrapper[4755]: I0317 01:30:00.980838 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561850-hlrqm"] Mar 17 01:30:01 crc kubenswrapper[4755]: I0317 01:30:01.136326 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj"] Mar 17 01:30:01 crc kubenswrapper[4755]: I0317 01:30:01.919143 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561850-hlrqm" event={"ID":"d18e884a-6fc0-49f4-9824-8d6fec65fa9a","Type":"ContainerStarted","Data":"c315acb70464001d303eef189bc3852f114dead096c72e90280193dc005e70be"} Mar 17 01:30:01 crc kubenswrapper[4755]: I0317 01:30:01.924085 4755 generic.go:334] "Generic (PLEG): container finished" podID="f27dace0-fabc-4954-b257-8c7911349adc" containerID="b4907f437e505aaf6fd8cb7800bc64ea926a4385da834c2b5d9e06b4f40803c0" exitCode=0 Mar 17 01:30:01 crc kubenswrapper[4755]: I0317 01:30:01.924346 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj" event={"ID":"f27dace0-fabc-4954-b257-8c7911349adc","Type":"ContainerDied","Data":"b4907f437e505aaf6fd8cb7800bc64ea926a4385da834c2b5d9e06b4f40803c0"} Mar 17 01:30:01 crc kubenswrapper[4755]: I0317 01:30:01.924586 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj" event={"ID":"f27dace0-fabc-4954-b257-8c7911349adc","Type":"ContainerStarted","Data":"1b349f1eaf3caa136ba830ab44cb3c42b902f1506cc003c281e41d31c22e8ebe"} Mar 17 01:30:03 crc kubenswrapper[4755]: I0317 01:30:03.362171 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj" Mar 17 01:30:03 crc kubenswrapper[4755]: I0317 01:30:03.510503 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f27dace0-fabc-4954-b257-8c7911349adc-config-volume\") pod \"f27dace0-fabc-4954-b257-8c7911349adc\" (UID: \"f27dace0-fabc-4954-b257-8c7911349adc\") " Mar 17 01:30:03 crc kubenswrapper[4755]: I0317 01:30:03.510555 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t22pk\" (UniqueName: \"kubernetes.io/projected/f27dace0-fabc-4954-b257-8c7911349adc-kube-api-access-t22pk\") pod \"f27dace0-fabc-4954-b257-8c7911349adc\" (UID: \"f27dace0-fabc-4954-b257-8c7911349adc\") " Mar 17 01:30:03 crc kubenswrapper[4755]: I0317 01:30:03.510727 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f27dace0-fabc-4954-b257-8c7911349adc-secret-volume\") pod \"f27dace0-fabc-4954-b257-8c7911349adc\" (UID: \"f27dace0-fabc-4954-b257-8c7911349adc\") " Mar 17 01:30:03 crc kubenswrapper[4755]: I0317 01:30:03.511634 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f27dace0-fabc-4954-b257-8c7911349adc-config-volume" (OuterVolumeSpecName: "config-volume") pod "f27dace0-fabc-4954-b257-8c7911349adc" (UID: "f27dace0-fabc-4954-b257-8c7911349adc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:30:03 crc kubenswrapper[4755]: I0317 01:30:03.516496 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27dace0-fabc-4954-b257-8c7911349adc-kube-api-access-t22pk" (OuterVolumeSpecName: "kube-api-access-t22pk") pod "f27dace0-fabc-4954-b257-8c7911349adc" (UID: "f27dace0-fabc-4954-b257-8c7911349adc"). 
InnerVolumeSpecName "kube-api-access-t22pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:03 crc kubenswrapper[4755]: I0317 01:30:03.523659 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27dace0-fabc-4954-b257-8c7911349adc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f27dace0-fabc-4954-b257-8c7911349adc" (UID: "f27dace0-fabc-4954-b257-8c7911349adc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:30:03 crc kubenswrapper[4755]: I0317 01:30:03.613338 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f27dace0-fabc-4954-b257-8c7911349adc-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:03 crc kubenswrapper[4755]: I0317 01:30:03.613381 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t22pk\" (UniqueName: \"kubernetes.io/projected/f27dace0-fabc-4954-b257-8c7911349adc-kube-api-access-t22pk\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:03 crc kubenswrapper[4755]: I0317 01:30:03.613402 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f27dace0-fabc-4954-b257-8c7911349adc-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:03 crc kubenswrapper[4755]: I0317 01:30:03.953502 4755 generic.go:334] "Generic (PLEG): container finished" podID="d18e884a-6fc0-49f4-9824-8d6fec65fa9a" containerID="4fe7fd301c6c16d64a44993c6a4e44f26fe0c5db5b0d9e4964ff95732c53aca8" exitCode=0 Mar 17 01:30:03 crc kubenswrapper[4755]: I0317 01:30:03.953588 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561850-hlrqm" event={"ID":"d18e884a-6fc0-49f4-9824-8d6fec65fa9a","Type":"ContainerDied","Data":"4fe7fd301c6c16d64a44993c6a4e44f26fe0c5db5b0d9e4964ff95732c53aca8"} Mar 17 01:30:03 crc kubenswrapper[4755]: I0317 01:30:03.955318 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj" event={"ID":"f27dace0-fabc-4954-b257-8c7911349adc","Type":"ContainerDied","Data":"1b349f1eaf3caa136ba830ab44cb3c42b902f1506cc003c281e41d31c22e8ebe"} Mar 17 01:30:03 crc kubenswrapper[4755]: I0317 01:30:03.955351 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b349f1eaf3caa136ba830ab44cb3c42b902f1506cc003c281e41d31c22e8ebe" Mar 17 01:30:03 crc kubenswrapper[4755]: I0317 01:30:03.955411 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj" Mar 17 01:30:04 crc kubenswrapper[4755]: I0317 01:30:04.459258 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2"] Mar 17 01:30:04 crc kubenswrapper[4755]: I0317 01:30:04.470916 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561805-zrbx2"] Mar 17 01:30:05 crc kubenswrapper[4755]: I0317 01:30:05.466541 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561850-hlrqm" Mar 17 01:30:05 crc kubenswrapper[4755]: I0317 01:30:05.562900 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjjc7\" (UniqueName: \"kubernetes.io/projected/d18e884a-6fc0-49f4-9824-8d6fec65fa9a-kube-api-access-jjjc7\") pod \"d18e884a-6fc0-49f4-9824-8d6fec65fa9a\" (UID: \"d18e884a-6fc0-49f4-9824-8d6fec65fa9a\") " Mar 17 01:30:05 crc kubenswrapper[4755]: I0317 01:30:05.568410 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d18e884a-6fc0-49f4-9824-8d6fec65fa9a-kube-api-access-jjjc7" (OuterVolumeSpecName: "kube-api-access-jjjc7") pod "d18e884a-6fc0-49f4-9824-8d6fec65fa9a" (UID: "d18e884a-6fc0-49f4-9824-8d6fec65fa9a"). InnerVolumeSpecName "kube-api-access-jjjc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:30:05 crc kubenswrapper[4755]: I0317 01:30:05.667008 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjjc7\" (UniqueName: \"kubernetes.io/projected/d18e884a-6fc0-49f4-9824-8d6fec65fa9a-kube-api-access-jjjc7\") on node \"crc\" DevicePath \"\"" Mar 17 01:30:05 crc kubenswrapper[4755]: I0317 01:30:05.984800 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561850-hlrqm" event={"ID":"d18e884a-6fc0-49f4-9824-8d6fec65fa9a","Type":"ContainerDied","Data":"c315acb70464001d303eef189bc3852f114dead096c72e90280193dc005e70be"} Mar 17 01:30:05 crc kubenswrapper[4755]: I0317 01:30:05.984845 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c315acb70464001d303eef189bc3852f114dead096c72e90280193dc005e70be" Mar 17 01:30:05 crc kubenswrapper[4755]: I0317 01:30:05.984887 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561850-hlrqm" Mar 17 01:30:06 crc kubenswrapper[4755]: I0317 01:30:06.268131 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b8ead77-0763-41b7-8a6f-96f43f4d202b" path="/var/lib/kubelet/pods/2b8ead77-0763-41b7-8a6f-96f43f4d202b/volumes" Mar 17 01:30:06 crc kubenswrapper[4755]: I0317 01:30:06.552126 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561844-ztz95"] Mar 17 01:30:06 crc kubenswrapper[4755]: I0317 01:30:06.563297 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561844-ztz95"] Mar 17 01:30:07 crc kubenswrapper[4755]: I0317 01:30:07.249099 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:30:07 crc kubenswrapper[4755]: E0317 01:30:07.250267 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:30:08 crc kubenswrapper[4755]: I0317 01:30:08.269698 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d89fd23-639a-4219-b543-fbde76ca3626" path="/var/lib/kubelet/pods/0d89fd23-639a-4219-b543-fbde76ca3626/volumes" Mar 17 01:30:18 crc kubenswrapper[4755]: I0317 01:30:18.250268 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:30:18 crc kubenswrapper[4755]: E0317 01:30:18.251372 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:30:32 crc kubenswrapper[4755]: I0317 01:30:32.249293 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:30:33 crc kubenswrapper[4755]: I0317 01:30:33.382768 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"b64780cce3beec691d26c9f9a13a74c3cd86654dcb8f831472c1e2735f352af6"} Mar 17 01:31:03 crc kubenswrapper[4755]: I0317 01:31:03.752646 4755 scope.go:117] "RemoveContainer" containerID="64d8c797f1fad0db5e2b0fb903bf4587206c5333056a46f0d356964d6edbc825" Mar 17 01:31:03 crc kubenswrapper[4755]: I0317 01:31:03.812692 4755 scope.go:117] "RemoveContainer" containerID="d0c51436242d644ae2d3f6df75d794cb078867d9ca0c8872b3fb8770f317caa5" Mar 17 01:32:00 crc kubenswrapper[4755]: I0317 01:32:00.177232 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561852-gblns"] Mar 17 01:32:00 crc kubenswrapper[4755]: E0317 01:32:00.178372 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d18e884a-6fc0-49f4-9824-8d6fec65fa9a" containerName="oc" Mar 17 01:32:00 crc kubenswrapper[4755]: I0317 01:32:00.178386 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18e884a-6fc0-49f4-9824-8d6fec65fa9a" containerName="oc" Mar 17 01:32:00 crc kubenswrapper[4755]: E0317 01:32:00.178412 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27dace0-fabc-4954-b257-8c7911349adc" containerName="collect-profiles" Mar 17 01:32:00 crc kubenswrapper[4755]: I0317 01:32:00.178421 4755 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f27dace0-fabc-4954-b257-8c7911349adc" containerName="collect-profiles" Mar 17 01:32:00 crc kubenswrapper[4755]: I0317 01:32:00.178709 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27dace0-fabc-4954-b257-8c7911349adc" containerName="collect-profiles" Mar 17 01:32:00 crc kubenswrapper[4755]: I0317 01:32:00.178746 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d18e884a-6fc0-49f4-9824-8d6fec65fa9a" containerName="oc" Mar 17 01:32:00 crc kubenswrapper[4755]: I0317 01:32:00.179663 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561852-gblns" Mar 17 01:32:00 crc kubenswrapper[4755]: I0317 01:32:00.182212 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:32:00 crc kubenswrapper[4755]: I0317 01:32:00.182340 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:32:00 crc kubenswrapper[4755]: I0317 01:32:00.183365 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:32:00 crc kubenswrapper[4755]: I0317 01:32:00.188210 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561852-gblns"] Mar 17 01:32:00 crc kubenswrapper[4755]: I0317 01:32:00.230310 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfkrs\" (UniqueName: \"kubernetes.io/projected/83be5c50-751a-4035-b4ed-5c146f257516-kube-api-access-cfkrs\") pod \"auto-csr-approver-29561852-gblns\" (UID: \"83be5c50-751a-4035-b4ed-5c146f257516\") " pod="openshift-infra/auto-csr-approver-29561852-gblns" Mar 17 01:32:00 crc kubenswrapper[4755]: I0317 01:32:00.332893 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkrs\" 
(UniqueName: \"kubernetes.io/projected/83be5c50-751a-4035-b4ed-5c146f257516-kube-api-access-cfkrs\") pod \"auto-csr-approver-29561852-gblns\" (UID: \"83be5c50-751a-4035-b4ed-5c146f257516\") " pod="openshift-infra/auto-csr-approver-29561852-gblns" Mar 17 01:32:00 crc kubenswrapper[4755]: I0317 01:32:00.366242 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfkrs\" (UniqueName: \"kubernetes.io/projected/83be5c50-751a-4035-b4ed-5c146f257516-kube-api-access-cfkrs\") pod \"auto-csr-approver-29561852-gblns\" (UID: \"83be5c50-751a-4035-b4ed-5c146f257516\") " pod="openshift-infra/auto-csr-approver-29561852-gblns" Mar 17 01:32:00 crc kubenswrapper[4755]: I0317 01:32:00.527299 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561852-gblns" Mar 17 01:32:00 crc kubenswrapper[4755]: I0317 01:32:00.990229 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561852-gblns"] Mar 17 01:32:01 crc kubenswrapper[4755]: I0317 01:32:01.579690 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561852-gblns" event={"ID":"83be5c50-751a-4035-b4ed-5c146f257516","Type":"ContainerStarted","Data":"fa5be1a0553fec2d482728f2c68ce657ba23ec2b02546a4a06cef0b4cb1f7acd"} Mar 17 01:32:02 crc kubenswrapper[4755]: I0317 01:32:02.601915 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561852-gblns" event={"ID":"83be5c50-751a-4035-b4ed-5c146f257516","Type":"ContainerStarted","Data":"f4144484dabfd460b690cf4a3b17e7061ae53d51f87fd9d48e284331f7764a81"} Mar 17 01:32:02 crc kubenswrapper[4755]: I0317 01:32:02.633083 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561852-gblns" podStartSLOduration=1.6647729610000002 podStartE2EDuration="2.633069331s" podCreationTimestamp="2026-03-17 01:32:00 +0000 UTC" 
firstStartedPulling="2026-03-17 01:32:00.989394157 +0000 UTC m=+4195.748846480" lastFinishedPulling="2026-03-17 01:32:01.957690547 +0000 UTC m=+4196.717142850" observedRunningTime="2026-03-17 01:32:02.621308642 +0000 UTC m=+4197.380760925" watchObservedRunningTime="2026-03-17 01:32:02.633069331 +0000 UTC m=+4197.392521604" Mar 17 01:32:03 crc kubenswrapper[4755]: I0317 01:32:03.613692 4755 generic.go:334] "Generic (PLEG): container finished" podID="83be5c50-751a-4035-b4ed-5c146f257516" containerID="f4144484dabfd460b690cf4a3b17e7061ae53d51f87fd9d48e284331f7764a81" exitCode=0 Mar 17 01:32:03 crc kubenswrapper[4755]: I0317 01:32:03.613770 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561852-gblns" event={"ID":"83be5c50-751a-4035-b4ed-5c146f257516","Type":"ContainerDied","Data":"f4144484dabfd460b690cf4a3b17e7061ae53d51f87fd9d48e284331f7764a81"} Mar 17 01:32:05 crc kubenswrapper[4755]: I0317 01:32:05.122695 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561852-gblns" Mar 17 01:32:05 crc kubenswrapper[4755]: I0317 01:32:05.148367 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfkrs\" (UniqueName: \"kubernetes.io/projected/83be5c50-751a-4035-b4ed-5c146f257516-kube-api-access-cfkrs\") pod \"83be5c50-751a-4035-b4ed-5c146f257516\" (UID: \"83be5c50-751a-4035-b4ed-5c146f257516\") " Mar 17 01:32:05 crc kubenswrapper[4755]: I0317 01:32:05.155881 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83be5c50-751a-4035-b4ed-5c146f257516-kube-api-access-cfkrs" (OuterVolumeSpecName: "kube-api-access-cfkrs") pod "83be5c50-751a-4035-b4ed-5c146f257516" (UID: "83be5c50-751a-4035-b4ed-5c146f257516"). InnerVolumeSpecName "kube-api-access-cfkrs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:32:05 crc kubenswrapper[4755]: I0317 01:32:05.251176 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfkrs\" (UniqueName: \"kubernetes.io/projected/83be5c50-751a-4035-b4ed-5c146f257516-kube-api-access-cfkrs\") on node \"crc\" DevicePath \"\"" Mar 17 01:32:05 crc kubenswrapper[4755]: I0317 01:32:05.638951 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561852-gblns" event={"ID":"83be5c50-751a-4035-b4ed-5c146f257516","Type":"ContainerDied","Data":"fa5be1a0553fec2d482728f2c68ce657ba23ec2b02546a4a06cef0b4cb1f7acd"} Mar 17 01:32:05 crc kubenswrapper[4755]: I0317 01:32:05.639017 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa5be1a0553fec2d482728f2c68ce657ba23ec2b02546a4a06cef0b4cb1f7acd" Mar 17 01:32:05 crc kubenswrapper[4755]: I0317 01:32:05.639042 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561852-gblns" Mar 17 01:32:05 crc kubenswrapper[4755]: I0317 01:32:05.735054 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561846-5rdbg"] Mar 17 01:32:05 crc kubenswrapper[4755]: I0317 01:32:05.748843 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561846-5rdbg"] Mar 17 01:32:06 crc kubenswrapper[4755]: I0317 01:32:06.350945 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c16f6d1-a361-4e74-9763-8da025eb71d4" path="/var/lib/kubelet/pods/4c16f6d1-a361-4e74-9763-8da025eb71d4/volumes" Mar 17 01:32:58 crc kubenswrapper[4755]: I0317 01:32:58.665093 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 17 01:32:58 crc kubenswrapper[4755]: I0317 01:32:58.667367 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:33:03 crc kubenswrapper[4755]: I0317 01:33:03.947168 4755 scope.go:117] "RemoveContainer" containerID="9d2bd2449952f9e224b5c28eb90dcb121002b850f1158830362fe4abef8a768d" Mar 17 01:33:11 crc kubenswrapper[4755]: I0317 01:33:11.536655 4755 generic.go:334] "Generic (PLEG): container finished" podID="dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109" containerID="f54050a4a24e1bdfe74b3fbc6bf0abda670283ab103eaa7be029bec1e3cc58ee" exitCode=0 Mar 17 01:33:11 crc kubenswrapper[4755]: I0317 01:33:11.536738 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" event={"ID":"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109","Type":"ContainerDied","Data":"f54050a4a24e1bdfe74b3fbc6bf0abda670283ab103eaa7be029bec1e3cc58ee"} Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.168454 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.293924 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-inventory\") pod \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.293987 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czsx8\" (UniqueName: \"kubernetes.io/projected/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-kube-api-access-czsx8\") pod \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.294510 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-telemetry-combined-ca-bundle\") pod \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.294558 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ssh-key-openstack-edpm-ipam\") pod \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.294579 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-1\") pod \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 
01:33:13.294662 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceph\") pod \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.294707 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-0\") pod \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.294727 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-2\") pod \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\" (UID: \"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109\") " Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.302474 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109" (UID: "dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.305018 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-kube-api-access-czsx8" (OuterVolumeSpecName: "kube-api-access-czsx8") pod "dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109" (UID: "dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109"). InnerVolumeSpecName "kube-api-access-czsx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.307367 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceph" (OuterVolumeSpecName: "ceph") pod "dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109" (UID: "dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.334034 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109" (UID: "dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.342467 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109" (UID: "dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.344598 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109" (UID: "dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.354554 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-inventory" (OuterVolumeSpecName: "inventory") pod "dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109" (UID: "dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.359505 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109" (UID: "dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.396952 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.396998 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.397016 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.397029 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.397044 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czsx8\" (UniqueName: \"kubernetes.io/projected/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-kube-api-access-czsx8\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.397060 4755 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.397072 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.397085 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.564890 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" event={"ID":"dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109","Type":"ContainerDied","Data":"cd98770c0964a3343fb92c9055ddc2e2c179a178dc3735719a525209a09898cc"} Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.564950 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd98770c0964a3343fb92c9055ddc2e2c179a178dc3735719a525209a09898cc" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.564963 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.713320 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf"] Mar 17 01:33:13 crc kubenswrapper[4755]: E0317 01:33:13.713902 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83be5c50-751a-4035-b4ed-5c146f257516" containerName="oc" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.713922 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="83be5c50-751a-4035-b4ed-5c146f257516" containerName="oc" Mar 17 01:33:13 crc kubenswrapper[4755]: E0317 01:33:13.713940 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.713948 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.714146 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.714180 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="83be5c50-751a-4035-b4ed-5c146f257516" containerName="oc" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.714923 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.719154 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.719221 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.719367 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.719457 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.719663 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.719765 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.736798 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf"] Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.834802 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceph\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.834893 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.834943 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.834984 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.835007 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.835044 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.835063 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6p5z\" (UniqueName: \"kubernetes.io/projected/03684d66-3e86-4168-9a3a-62e40ba5ddce-kube-api-access-x6p5z\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.835094 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.937261 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.937559 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.937614 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.937636 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6p5z\" (UniqueName: \"kubernetes.io/projected/03684d66-3e86-4168-9a3a-62e40ba5ddce-kube-api-access-x6p5z\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.937667 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.937737 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceph\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.937776 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.937817 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.944675 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceph\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.945917 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: 
\"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.946005 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.946339 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.947959 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.948388 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " 
pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.959741 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6p5z\" (UniqueName: \"kubernetes.io/projected/03684d66-3e86-4168-9a3a-62e40ba5ddce-kube-api-access-x6p5z\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:13 crc kubenswrapper[4755]: I0317 01:33:13.989064 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:14 crc kubenswrapper[4755]: I0317 01:33:14.049742 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:33:14 crc kubenswrapper[4755]: I0317 01:33:14.674121 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf"] Mar 17 01:33:15 crc kubenswrapper[4755]: I0317 01:33:15.608404 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" event={"ID":"03684d66-3e86-4168-9a3a-62e40ba5ddce","Type":"ContainerStarted","Data":"54269c69a8d52d8493a705787a1a7abb6e4547691c0af4dfb678bc408c5223db"} Mar 17 01:33:15 crc kubenswrapper[4755]: I0317 01:33:15.609108 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" event={"ID":"03684d66-3e86-4168-9a3a-62e40ba5ddce","Type":"ContainerStarted","Data":"b19ab656e763731fd02ae17f8df9bf3d24e6ce5e1b1c63813e08e9b8f3736939"} Mar 17 01:33:15 crc kubenswrapper[4755]: I0317 01:33:15.631603 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" podStartSLOduration=2.065260198 podStartE2EDuration="2.63158381s" podCreationTimestamp="2026-03-17 01:33:13 +0000 UTC" firstStartedPulling="2026-03-17 01:33:14.679509124 +0000 UTC m=+4269.438961397" lastFinishedPulling="2026-03-17 01:33:15.245832716 +0000 UTC m=+4270.005285009" observedRunningTime="2026-03-17 01:33:15.627719545 +0000 UTC m=+4270.387171838" watchObservedRunningTime="2026-03-17 01:33:15.63158381 +0000 UTC m=+4270.391036103" Mar 17 01:33:28 crc kubenswrapper[4755]: I0317 01:33:28.665404 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Mar 17 01:33:28 crc kubenswrapper[4755]: I0317 01:33:28.666057 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:33:58 crc kubenswrapper[4755]: I0317 01:33:58.665329 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:33:58 crc kubenswrapper[4755]: I0317 01:33:58.668186 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:33:58 crc kubenswrapper[4755]: I0317 01:33:58.668501 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 01:33:58 crc kubenswrapper[4755]: I0317 01:33:58.669704 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b64780cce3beec691d26c9f9a13a74c3cd86654dcb8f831472c1e2735f352af6"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:33:58 crc kubenswrapper[4755]: I0317 01:33:58.669982 4755 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://b64780cce3beec691d26c9f9a13a74c3cd86654dcb8f831472c1e2735f352af6" gracePeriod=600 Mar 17 01:34:00 crc kubenswrapper[4755]: I0317 01:34:00.157393 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561854-7mc2k"] Mar 17 01:34:00 crc kubenswrapper[4755]: I0317 01:34:00.160198 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561854-7mc2k" Mar 17 01:34:00 crc kubenswrapper[4755]: I0317 01:34:00.163235 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:34:00 crc kubenswrapper[4755]: I0317 01:34:00.167318 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:34:00 crc kubenswrapper[4755]: I0317 01:34:00.167479 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:34:00 crc kubenswrapper[4755]: I0317 01:34:00.170798 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561854-7mc2k"] Mar 17 01:34:00 crc kubenswrapper[4755]: I0317 01:34:00.294055 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="b64780cce3beec691d26c9f9a13a74c3cd86654dcb8f831472c1e2735f352af6" exitCode=0 Mar 17 01:34:00 crc kubenswrapper[4755]: I0317 01:34:00.294105 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"b64780cce3beec691d26c9f9a13a74c3cd86654dcb8f831472c1e2735f352af6"} Mar 17 01:34:00 crc kubenswrapper[4755]: I0317 01:34:00.294140 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f"} Mar 17 01:34:00 crc kubenswrapper[4755]: I0317 01:34:00.294166 4755 scope.go:117] "RemoveContainer" containerID="82821bf98afb4205eaf1631f96db91d0d61c3276610b607d21e15c3f421ed3b3" Mar 17 01:34:00 crc kubenswrapper[4755]: I0317 01:34:00.304152 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2bwl\" (UniqueName: \"kubernetes.io/projected/8ee0ca2d-e10a-4a3f-8732-efc1a21240c2-kube-api-access-j2bwl\") pod \"auto-csr-approver-29561854-7mc2k\" (UID: \"8ee0ca2d-e10a-4a3f-8732-efc1a21240c2\") " pod="openshift-infra/auto-csr-approver-29561854-7mc2k" Mar 17 01:34:00 crc kubenswrapper[4755]: I0317 01:34:00.406928 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2bwl\" (UniqueName: \"kubernetes.io/projected/8ee0ca2d-e10a-4a3f-8732-efc1a21240c2-kube-api-access-j2bwl\") pod \"auto-csr-approver-29561854-7mc2k\" (UID: \"8ee0ca2d-e10a-4a3f-8732-efc1a21240c2\") " pod="openshift-infra/auto-csr-approver-29561854-7mc2k" Mar 17 01:34:00 crc kubenswrapper[4755]: I0317 01:34:00.432863 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2bwl\" (UniqueName: \"kubernetes.io/projected/8ee0ca2d-e10a-4a3f-8732-efc1a21240c2-kube-api-access-j2bwl\") pod \"auto-csr-approver-29561854-7mc2k\" (UID: \"8ee0ca2d-e10a-4a3f-8732-efc1a21240c2\") " pod="openshift-infra/auto-csr-approver-29561854-7mc2k" Mar 17 01:34:00 crc kubenswrapper[4755]: I0317 01:34:00.483636 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561854-7mc2k" Mar 17 01:34:01 crc kubenswrapper[4755]: I0317 01:34:01.013409 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561854-7mc2k"] Mar 17 01:34:02 crc kubenswrapper[4755]: I0317 01:34:02.326062 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561854-7mc2k" event={"ID":"8ee0ca2d-e10a-4a3f-8732-efc1a21240c2","Type":"ContainerStarted","Data":"7d662fba876f939de65ec813e913247220b0c31796d7d4d8cb0933942c8757cc"} Mar 17 01:34:03 crc kubenswrapper[4755]: I0317 01:34:03.340519 4755 generic.go:334] "Generic (PLEG): container finished" podID="8ee0ca2d-e10a-4a3f-8732-efc1a21240c2" containerID="dc443d4a35bd5f9a12b3efec3e4e927fc3944b1a434ecce8ed556b20e890f38f" exitCode=0 Mar 17 01:34:03 crc kubenswrapper[4755]: I0317 01:34:03.340646 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561854-7mc2k" event={"ID":"8ee0ca2d-e10a-4a3f-8732-efc1a21240c2","Type":"ContainerDied","Data":"dc443d4a35bd5f9a12b3efec3e4e927fc3944b1a434ecce8ed556b20e890f38f"} Mar 17 01:34:04 crc kubenswrapper[4755]: I0317 01:34:04.812052 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561854-7mc2k" Mar 17 01:34:04 crc kubenswrapper[4755]: I0317 01:34:04.918001 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2bwl\" (UniqueName: \"kubernetes.io/projected/8ee0ca2d-e10a-4a3f-8732-efc1a21240c2-kube-api-access-j2bwl\") pod \"8ee0ca2d-e10a-4a3f-8732-efc1a21240c2\" (UID: \"8ee0ca2d-e10a-4a3f-8732-efc1a21240c2\") " Mar 17 01:34:04 crc kubenswrapper[4755]: I0317 01:34:04.923882 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ee0ca2d-e10a-4a3f-8732-efc1a21240c2-kube-api-access-j2bwl" (OuterVolumeSpecName: "kube-api-access-j2bwl") pod "8ee0ca2d-e10a-4a3f-8732-efc1a21240c2" (UID: "8ee0ca2d-e10a-4a3f-8732-efc1a21240c2"). InnerVolumeSpecName "kube-api-access-j2bwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:34:05 crc kubenswrapper[4755]: I0317 01:34:05.021397 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2bwl\" (UniqueName: \"kubernetes.io/projected/8ee0ca2d-e10a-4a3f-8732-efc1a21240c2-kube-api-access-j2bwl\") on node \"crc\" DevicePath \"\"" Mar 17 01:34:05 crc kubenswrapper[4755]: I0317 01:34:05.366724 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561854-7mc2k" event={"ID":"8ee0ca2d-e10a-4a3f-8732-efc1a21240c2","Type":"ContainerDied","Data":"7d662fba876f939de65ec813e913247220b0c31796d7d4d8cb0933942c8757cc"} Mar 17 01:34:05 crc kubenswrapper[4755]: I0317 01:34:05.366769 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d662fba876f939de65ec813e913247220b0c31796d7d4d8cb0933942c8757cc" Mar 17 01:34:05 crc kubenswrapper[4755]: I0317 01:34:05.366806 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561854-7mc2k" Mar 17 01:34:05 crc kubenswrapper[4755]: I0317 01:34:05.930299 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561848-hcl5w"] Mar 17 01:34:05 crc kubenswrapper[4755]: I0317 01:34:05.949747 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561848-hcl5w"] Mar 17 01:34:06 crc kubenswrapper[4755]: I0317 01:34:06.267466 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="472a8684-6cf5-4a85-935f-47b1a188bf5d" path="/var/lib/kubelet/pods/472a8684-6cf5-4a85-935f-47b1a188bf5d/volumes" Mar 17 01:35:04 crc kubenswrapper[4755]: I0317 01:35:04.061698 4755 scope.go:117] "RemoveContainer" containerID="e4afcb9dd7aebbe4689d79aa79dd0357cc79ecd45500c9661ef925f2df750a4c" Mar 17 01:35:16 crc kubenswrapper[4755]: I0317 01:35:16.203103 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jb5pn"] Mar 17 01:35:16 crc kubenswrapper[4755]: E0317 01:35:16.204340 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee0ca2d-e10a-4a3f-8732-efc1a21240c2" containerName="oc" Mar 17 01:35:16 crc kubenswrapper[4755]: I0317 01:35:16.204358 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee0ca2d-e10a-4a3f-8732-efc1a21240c2" containerName="oc" Mar 17 01:35:16 crc kubenswrapper[4755]: I0317 01:35:16.204697 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ee0ca2d-e10a-4a3f-8732-efc1a21240c2" containerName="oc" Mar 17 01:35:16 crc kubenswrapper[4755]: I0317 01:35:16.206568 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jb5pn" Mar 17 01:35:16 crc kubenswrapper[4755]: I0317 01:35:16.241496 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jb5pn"] Mar 17 01:35:16 crc kubenswrapper[4755]: I0317 01:35:16.261608 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbf3dbb-e539-40e5-a858-3985a6b92502-utilities\") pod \"redhat-operators-jb5pn\" (UID: \"bfbf3dbb-e539-40e5-a858-3985a6b92502\") " pod="openshift-marketplace/redhat-operators-jb5pn" Mar 17 01:35:16 crc kubenswrapper[4755]: I0317 01:35:16.261878 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbf3dbb-e539-40e5-a858-3985a6b92502-catalog-content\") pod \"redhat-operators-jb5pn\" (UID: \"bfbf3dbb-e539-40e5-a858-3985a6b92502\") " pod="openshift-marketplace/redhat-operators-jb5pn" Mar 17 01:35:16 crc kubenswrapper[4755]: I0317 01:35:16.262012 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzxg5\" (UniqueName: \"kubernetes.io/projected/bfbf3dbb-e539-40e5-a858-3985a6b92502-kube-api-access-mzxg5\") pod \"redhat-operators-jb5pn\" (UID: \"bfbf3dbb-e539-40e5-a858-3985a6b92502\") " pod="openshift-marketplace/redhat-operators-jb5pn" Mar 17 01:35:16 crc kubenswrapper[4755]: I0317 01:35:16.364372 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbf3dbb-e539-40e5-a858-3985a6b92502-utilities\") pod \"redhat-operators-jb5pn\" (UID: \"bfbf3dbb-e539-40e5-a858-3985a6b92502\") " pod="openshift-marketplace/redhat-operators-jb5pn" Mar 17 01:35:16 crc kubenswrapper[4755]: I0317 01:35:16.364496 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbf3dbb-e539-40e5-a858-3985a6b92502-catalog-content\") pod \"redhat-operators-jb5pn\" (UID: \"bfbf3dbb-e539-40e5-a858-3985a6b92502\") " pod="openshift-marketplace/redhat-operators-jb5pn" Mar 17 01:35:16 crc kubenswrapper[4755]: I0317 01:35:16.364582 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzxg5\" (UniqueName: \"kubernetes.io/projected/bfbf3dbb-e539-40e5-a858-3985a6b92502-kube-api-access-mzxg5\") pod \"redhat-operators-jb5pn\" (UID: \"bfbf3dbb-e539-40e5-a858-3985a6b92502\") " pod="openshift-marketplace/redhat-operators-jb5pn" Mar 17 01:35:16 crc kubenswrapper[4755]: I0317 01:35:16.365712 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbf3dbb-e539-40e5-a858-3985a6b92502-utilities\") pod \"redhat-operators-jb5pn\" (UID: \"bfbf3dbb-e539-40e5-a858-3985a6b92502\") " pod="openshift-marketplace/redhat-operators-jb5pn" Mar 17 01:35:16 crc kubenswrapper[4755]: I0317 01:35:16.366466 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbf3dbb-e539-40e5-a858-3985a6b92502-catalog-content\") pod \"redhat-operators-jb5pn\" (UID: \"bfbf3dbb-e539-40e5-a858-3985a6b92502\") " pod="openshift-marketplace/redhat-operators-jb5pn" Mar 17 01:35:16 crc kubenswrapper[4755]: I0317 01:35:16.393096 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzxg5\" (UniqueName: \"kubernetes.io/projected/bfbf3dbb-e539-40e5-a858-3985a6b92502-kube-api-access-mzxg5\") pod \"redhat-operators-jb5pn\" (UID: \"bfbf3dbb-e539-40e5-a858-3985a6b92502\") " pod="openshift-marketplace/redhat-operators-jb5pn" Mar 17 01:35:16 crc kubenswrapper[4755]: I0317 01:35:16.561975 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jb5pn" Mar 17 01:35:17 crc kubenswrapper[4755]: I0317 01:35:17.007049 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jb5pn"] Mar 17 01:35:17 crc kubenswrapper[4755]: I0317 01:35:17.341925 4755 generic.go:334] "Generic (PLEG): container finished" podID="bfbf3dbb-e539-40e5-a858-3985a6b92502" containerID="751c9fb79b5d6a012b12585a823ebd2761f95ddf32be6ed8f52522d7daec8f9a" exitCode=0 Mar 17 01:35:17 crc kubenswrapper[4755]: I0317 01:35:17.341988 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jb5pn" event={"ID":"bfbf3dbb-e539-40e5-a858-3985a6b92502","Type":"ContainerDied","Data":"751c9fb79b5d6a012b12585a823ebd2761f95ddf32be6ed8f52522d7daec8f9a"} Mar 17 01:35:17 crc kubenswrapper[4755]: I0317 01:35:17.342043 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jb5pn" event={"ID":"bfbf3dbb-e539-40e5-a858-3985a6b92502","Type":"ContainerStarted","Data":"8df108459c6fc7ed597522fe13ccea020a1fdadc6d2093cfd60bb548454da5ae"} Mar 17 01:35:17 crc kubenswrapper[4755]: I0317 01:35:17.344159 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:35:18 crc kubenswrapper[4755]: I0317 01:35:18.355425 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jb5pn" event={"ID":"bfbf3dbb-e539-40e5-a858-3985a6b92502","Type":"ContainerStarted","Data":"881e7dd0c76ccc73dc3699d2455045d79f69768d8a7bf57c2da41d9fec6888c1"} Mar 17 01:35:23 crc kubenswrapper[4755]: I0317 01:35:23.415346 4755 generic.go:334] "Generic (PLEG): container finished" podID="bfbf3dbb-e539-40e5-a858-3985a6b92502" containerID="881e7dd0c76ccc73dc3699d2455045d79f69768d8a7bf57c2da41d9fec6888c1" exitCode=0 Mar 17 01:35:23 crc kubenswrapper[4755]: I0317 01:35:23.415409 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-jb5pn" event={"ID":"bfbf3dbb-e539-40e5-a858-3985a6b92502","Type":"ContainerDied","Data":"881e7dd0c76ccc73dc3699d2455045d79f69768d8a7bf57c2da41d9fec6888c1"} Mar 17 01:35:24 crc kubenswrapper[4755]: I0317 01:35:24.432324 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jb5pn" event={"ID":"bfbf3dbb-e539-40e5-a858-3985a6b92502","Type":"ContainerStarted","Data":"8e03cb4be27ef7834b56e135bb54464765716d8bd4ead3c8ddf10f5a7a17712c"} Mar 17 01:35:24 crc kubenswrapper[4755]: I0317 01:35:24.470525 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jb5pn" podStartSLOduration=1.978842797 podStartE2EDuration="8.470506617s" podCreationTimestamp="2026-03-17 01:35:16 +0000 UTC" firstStartedPulling="2026-03-17 01:35:17.343915199 +0000 UTC m=+4392.103367482" lastFinishedPulling="2026-03-17 01:35:23.835578989 +0000 UTC m=+4398.595031302" observedRunningTime="2026-03-17 01:35:24.455066159 +0000 UTC m=+4399.214518462" watchObservedRunningTime="2026-03-17 01:35:24.470506617 +0000 UTC m=+4399.229958900" Mar 17 01:35:26 crc kubenswrapper[4755]: I0317 01:35:26.562515 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jb5pn" Mar 17 01:35:26 crc kubenswrapper[4755]: I0317 01:35:26.562964 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jb5pn" Mar 17 01:35:27 crc kubenswrapper[4755]: I0317 01:35:27.621431 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jb5pn" podUID="bfbf3dbb-e539-40e5-a858-3985a6b92502" containerName="registry-server" probeResult="failure" output=< Mar 17 01:35:27 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 01:35:27 crc kubenswrapper[4755]: > Mar 17 01:35:36 crc kubenswrapper[4755]: I0317 
01:35:36.641833 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jb5pn" Mar 17 01:35:36 crc kubenswrapper[4755]: I0317 01:35:36.718086 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jb5pn" Mar 17 01:35:36 crc kubenswrapper[4755]: I0317 01:35:36.879928 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jb5pn"] Mar 17 01:35:38 crc kubenswrapper[4755]: I0317 01:35:38.612399 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jb5pn" podUID="bfbf3dbb-e539-40e5-a858-3985a6b92502" containerName="registry-server" containerID="cri-o://8e03cb4be27ef7834b56e135bb54464765716d8bd4ead3c8ddf10f5a7a17712c" gracePeriod=2 Mar 17 01:35:39 crc kubenswrapper[4755]: I0317 01:35:39.651308 4755 generic.go:334] "Generic (PLEG): container finished" podID="bfbf3dbb-e539-40e5-a858-3985a6b92502" containerID="8e03cb4be27ef7834b56e135bb54464765716d8bd4ead3c8ddf10f5a7a17712c" exitCode=0 Mar 17 01:35:39 crc kubenswrapper[4755]: I0317 01:35:39.651657 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jb5pn" event={"ID":"bfbf3dbb-e539-40e5-a858-3985a6b92502","Type":"ContainerDied","Data":"8e03cb4be27ef7834b56e135bb54464765716d8bd4ead3c8ddf10f5a7a17712c"} Mar 17 01:35:40 crc kubenswrapper[4755]: I0317 01:35:40.344747 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jb5pn" Mar 17 01:35:40 crc kubenswrapper[4755]: I0317 01:35:40.473697 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbf3dbb-e539-40e5-a858-3985a6b92502-catalog-content\") pod \"bfbf3dbb-e539-40e5-a858-3985a6b92502\" (UID: \"bfbf3dbb-e539-40e5-a858-3985a6b92502\") " Mar 17 01:35:40 crc kubenswrapper[4755]: I0317 01:35:40.473813 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbf3dbb-e539-40e5-a858-3985a6b92502-utilities\") pod \"bfbf3dbb-e539-40e5-a858-3985a6b92502\" (UID: \"bfbf3dbb-e539-40e5-a858-3985a6b92502\") " Mar 17 01:35:40 crc kubenswrapper[4755]: I0317 01:35:40.474002 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzxg5\" (UniqueName: \"kubernetes.io/projected/bfbf3dbb-e539-40e5-a858-3985a6b92502-kube-api-access-mzxg5\") pod \"bfbf3dbb-e539-40e5-a858-3985a6b92502\" (UID: \"bfbf3dbb-e539-40e5-a858-3985a6b92502\") " Mar 17 01:35:40 crc kubenswrapper[4755]: I0317 01:35:40.474867 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfbf3dbb-e539-40e5-a858-3985a6b92502-utilities" (OuterVolumeSpecName: "utilities") pod "bfbf3dbb-e539-40e5-a858-3985a6b92502" (UID: "bfbf3dbb-e539-40e5-a858-3985a6b92502"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:35:40 crc kubenswrapper[4755]: I0317 01:35:40.483747 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfbf3dbb-e539-40e5-a858-3985a6b92502-kube-api-access-mzxg5" (OuterVolumeSpecName: "kube-api-access-mzxg5") pod "bfbf3dbb-e539-40e5-a858-3985a6b92502" (UID: "bfbf3dbb-e539-40e5-a858-3985a6b92502"). InnerVolumeSpecName "kube-api-access-mzxg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:35:40 crc kubenswrapper[4755]: I0317 01:35:40.577423 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbf3dbb-e539-40e5-a858-3985a6b92502-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:40 crc kubenswrapper[4755]: I0317 01:35:40.577488 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzxg5\" (UniqueName: \"kubernetes.io/projected/bfbf3dbb-e539-40e5-a858-3985a6b92502-kube-api-access-mzxg5\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:40 crc kubenswrapper[4755]: I0317 01:35:40.637266 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfbf3dbb-e539-40e5-a858-3985a6b92502-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfbf3dbb-e539-40e5-a858-3985a6b92502" (UID: "bfbf3dbb-e539-40e5-a858-3985a6b92502"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:35:40 crc kubenswrapper[4755]: I0317 01:35:40.665244 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jb5pn" event={"ID":"bfbf3dbb-e539-40e5-a858-3985a6b92502","Type":"ContainerDied","Data":"8df108459c6fc7ed597522fe13ccea020a1fdadc6d2093cfd60bb548454da5ae"} Mar 17 01:35:40 crc kubenswrapper[4755]: I0317 01:35:40.665314 4755 scope.go:117] "RemoveContainer" containerID="8e03cb4be27ef7834b56e135bb54464765716d8bd4ead3c8ddf10f5a7a17712c" Mar 17 01:35:40 crc kubenswrapper[4755]: I0317 01:35:40.665310 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jb5pn" Mar 17 01:35:40 crc kubenswrapper[4755]: I0317 01:35:40.679818 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbf3dbb-e539-40e5-a858-3985a6b92502-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:40 crc kubenswrapper[4755]: I0317 01:35:40.709512 4755 scope.go:117] "RemoveContainer" containerID="881e7dd0c76ccc73dc3699d2455045d79f69768d8a7bf57c2da41d9fec6888c1" Mar 17 01:35:40 crc kubenswrapper[4755]: I0317 01:35:40.726811 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jb5pn"] Mar 17 01:35:40 crc kubenswrapper[4755]: I0317 01:35:40.740634 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jb5pn"] Mar 17 01:35:40 crc kubenswrapper[4755]: I0317 01:35:40.753354 4755 scope.go:117] "RemoveContainer" containerID="751c9fb79b5d6a012b12585a823ebd2761f95ddf32be6ed8f52522d7daec8f9a" Mar 17 01:35:42 crc kubenswrapper[4755]: I0317 01:35:42.265894 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfbf3dbb-e539-40e5-a858-3985a6b92502" path="/var/lib/kubelet/pods/bfbf3dbb-e539-40e5-a858-3985a6b92502/volumes" Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.638580 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9zzrw"] Mar 17 01:35:49 crc kubenswrapper[4755]: E0317 01:35:49.639835 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfbf3dbb-e539-40e5-a858-3985a6b92502" containerName="registry-server" Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.639856 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfbf3dbb-e539-40e5-a858-3985a6b92502" containerName="registry-server" Mar 17 01:35:49 crc kubenswrapper[4755]: E0317 01:35:49.639926 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bfbf3dbb-e539-40e5-a858-3985a6b92502" containerName="extract-utilities" Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.639940 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfbf3dbb-e539-40e5-a858-3985a6b92502" containerName="extract-utilities" Mar 17 01:35:49 crc kubenswrapper[4755]: E0317 01:35:49.639963 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfbf3dbb-e539-40e5-a858-3985a6b92502" containerName="extract-content" Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.639976 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfbf3dbb-e539-40e5-a858-3985a6b92502" containerName="extract-content" Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.640309 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfbf3dbb-e539-40e5-a858-3985a6b92502" containerName="registry-server" Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.642618 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zzrw" Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.650570 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9zzrw"] Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.786010 4755 generic.go:334] "Generic (PLEG): container finished" podID="03684d66-3e86-4168-9a3a-62e40ba5ddce" containerID="54269c69a8d52d8493a705787a1a7abb6e4547691c0af4dfb678bc408c5223db" exitCode=0 Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.786090 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" event={"ID":"03684d66-3e86-4168-9a3a-62e40ba5ddce","Type":"ContainerDied","Data":"54269c69a8d52d8493a705787a1a7abb6e4547691c0af4dfb678bc408c5223db"} Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.828643 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f950e094-8ad4-4304-93a2-745f5a924e5c-catalog-content\") pod \"community-operators-9zzrw\" (UID: \"f950e094-8ad4-4304-93a2-745f5a924e5c\") " pod="openshift-marketplace/community-operators-9zzrw" Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.828779 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f950e094-8ad4-4304-93a2-745f5a924e5c-utilities\") pod \"community-operators-9zzrw\" (UID: \"f950e094-8ad4-4304-93a2-745f5a924e5c\") " pod="openshift-marketplace/community-operators-9zzrw" Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.828962 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj5bh\" (UniqueName: \"kubernetes.io/projected/f950e094-8ad4-4304-93a2-745f5a924e5c-kube-api-access-dj5bh\") pod \"community-operators-9zzrw\" (UID: \"f950e094-8ad4-4304-93a2-745f5a924e5c\") " pod="openshift-marketplace/community-operators-9zzrw" Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.931472 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj5bh\" (UniqueName: \"kubernetes.io/projected/f950e094-8ad4-4304-93a2-745f5a924e5c-kube-api-access-dj5bh\") pod \"community-operators-9zzrw\" (UID: \"f950e094-8ad4-4304-93a2-745f5a924e5c\") " pod="openshift-marketplace/community-operators-9zzrw" Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.931701 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f950e094-8ad4-4304-93a2-745f5a924e5c-catalog-content\") pod \"community-operators-9zzrw\" (UID: \"f950e094-8ad4-4304-93a2-745f5a924e5c\") " pod="openshift-marketplace/community-operators-9zzrw" Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.931778 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f950e094-8ad4-4304-93a2-745f5a924e5c-utilities\") pod \"community-operators-9zzrw\" (UID: \"f950e094-8ad4-4304-93a2-745f5a924e5c\") " pod="openshift-marketplace/community-operators-9zzrw" Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.932285 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f950e094-8ad4-4304-93a2-745f5a924e5c-utilities\") pod \"community-operators-9zzrw\" (UID: \"f950e094-8ad4-4304-93a2-745f5a924e5c\") " pod="openshift-marketplace/community-operators-9zzrw" Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.932604 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f950e094-8ad4-4304-93a2-745f5a924e5c-catalog-content\") pod \"community-operators-9zzrw\" (UID: \"f950e094-8ad4-4304-93a2-745f5a924e5c\") " pod="openshift-marketplace/community-operators-9zzrw" Mar 17 01:35:49 crc kubenswrapper[4755]: I0317 01:35:49.968225 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj5bh\" (UniqueName: \"kubernetes.io/projected/f950e094-8ad4-4304-93a2-745f5a924e5c-kube-api-access-dj5bh\") pod \"community-operators-9zzrw\" (UID: \"f950e094-8ad4-4304-93a2-745f5a924e5c\") " pod="openshift-marketplace/community-operators-9zzrw" Mar 17 01:35:50 crc kubenswrapper[4755]: I0317 01:35:50.261449 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9zzrw" Mar 17 01:35:50 crc kubenswrapper[4755]: I0317 01:35:50.710333 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9zzrw"] Mar 17 01:35:50 crc kubenswrapper[4755]: W0317 01:35:50.714561 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf950e094_8ad4_4304_93a2_745f5a924e5c.slice/crio-61fdc53453901e601e1f6ab7bdb353421a15096f165a6459ce2e1cb51930bf6e WatchSource:0}: Error finding container 61fdc53453901e601e1f6ab7bdb353421a15096f165a6459ce2e1cb51930bf6e: Status 404 returned error can't find the container with id 61fdc53453901e601e1f6ab7bdb353421a15096f165a6459ce2e1cb51930bf6e Mar 17 01:35:50 crc kubenswrapper[4755]: I0317 01:35:50.804291 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zzrw" event={"ID":"f950e094-8ad4-4304-93a2-745f5a924e5c","Type":"ContainerStarted","Data":"61fdc53453901e601e1f6ab7bdb353421a15096f165a6459ce2e1cb51930bf6e"} Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.379893 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.569109 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6p5z\" (UniqueName: \"kubernetes.io/projected/03684d66-3e86-4168-9a3a-62e40ba5ddce-kube-api-access-x6p5z\") pod \"03684d66-3e86-4168-9a3a-62e40ba5ddce\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.569242 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-inventory\") pod \"03684d66-3e86-4168-9a3a-62e40ba5ddce\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.569302 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-0\") pod \"03684d66-3e86-4168-9a3a-62e40ba5ddce\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.569372 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceph\") pod \"03684d66-3e86-4168-9a3a-62e40ba5ddce\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.569394 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-1\") pod \"03684d66-3e86-4168-9a3a-62e40ba5ddce\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.569503 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ssh-key-openstack-edpm-ipam\") pod \"03684d66-3e86-4168-9a3a-62e40ba5ddce\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.569534 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-telemetry-power-monitoring-combined-ca-bundle\") pod \"03684d66-3e86-4168-9a3a-62e40ba5ddce\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.569572 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-2\") pod \"03684d66-3e86-4168-9a3a-62e40ba5ddce\" (UID: \"03684d66-3e86-4168-9a3a-62e40ba5ddce\") " Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.577384 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceph" (OuterVolumeSpecName: "ceph") pod "03684d66-3e86-4168-9a3a-62e40ba5ddce" (UID: "03684d66-3e86-4168-9a3a-62e40ba5ddce"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.577418 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03684d66-3e86-4168-9a3a-62e40ba5ddce-kube-api-access-x6p5z" (OuterVolumeSpecName: "kube-api-access-x6p5z") pod "03684d66-3e86-4168-9a3a-62e40ba5ddce" (UID: "03684d66-3e86-4168-9a3a-62e40ba5ddce"). InnerVolumeSpecName "kube-api-access-x6p5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.581089 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "03684d66-3e86-4168-9a3a-62e40ba5ddce" (UID: "03684d66-3e86-4168-9a3a-62e40ba5ddce"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.603039 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "03684d66-3e86-4168-9a3a-62e40ba5ddce" (UID: "03684d66-3e86-4168-9a3a-62e40ba5ddce"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.608195 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "03684d66-3e86-4168-9a3a-62e40ba5ddce" (UID: "03684d66-3e86-4168-9a3a-62e40ba5ddce"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.625014 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "03684d66-3e86-4168-9a3a-62e40ba5ddce" (UID: "03684d66-3e86-4168-9a3a-62e40ba5ddce"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.626594 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "03684d66-3e86-4168-9a3a-62e40ba5ddce" (UID: "03684d66-3e86-4168-9a3a-62e40ba5ddce"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.634611 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-inventory" (OuterVolumeSpecName: "inventory") pod "03684d66-3e86-4168-9a3a-62e40ba5ddce" (UID: "03684d66-3e86-4168-9a3a-62e40ba5ddce"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.672952 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.673005 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.673018 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.673039 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.673055 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.673069 4755 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.673082 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/03684d66-3e86-4168-9a3a-62e40ba5ddce-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.673095 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6p5z\" (UniqueName: \"kubernetes.io/projected/03684d66-3e86-4168-9a3a-62e40ba5ddce-kube-api-access-x6p5z\") on node \"crc\" DevicePath \"\"" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.819908 4755 generic.go:334] "Generic (PLEG): container finished" podID="f950e094-8ad4-4304-93a2-745f5a924e5c" containerID="fea390751438376b940f94de8ad042a4d8dd1362949a245ecf9c86f7ddd3fdc9" exitCode=0 Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.820081 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zzrw" event={"ID":"f950e094-8ad4-4304-93a2-745f5a924e5c","Type":"ContainerDied","Data":"fea390751438376b940f94de8ad042a4d8dd1362949a245ecf9c86f7ddd3fdc9"} Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.826013 4755 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" event={"ID":"03684d66-3e86-4168-9a3a-62e40ba5ddce","Type":"ContainerDied","Data":"b19ab656e763731fd02ae17f8df9bf3d24e6ce5e1b1c63813e08e9b8f3736939"} Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.826066 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b19ab656e763731fd02ae17f8df9bf3d24e6ce5e1b1c63813e08e9b8f3736939" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.826197 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.931767 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn"] Mar 17 01:35:51 crc kubenswrapper[4755]: E0317 01:35:51.932238 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03684d66-3e86-4168-9a3a-62e40ba5ddce" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.932257 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="03684d66-3e86-4168-9a3a-62e40ba5ddce" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.932492 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="03684d66-3e86-4168-9a3a-62e40ba5ddce" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.933219 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.934776 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.938251 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.939109 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.939859 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.945993 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.946620 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b8z6c" Mar 17 01:35:51 crc kubenswrapper[4755]: I0317 01:35:51.951789 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn"] Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.081767 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.081930 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.082029 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-ceph\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.082158 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.082302 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.082408 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsc98\" (UniqueName: \"kubernetes.io/projected/7dc38b61-4933-487d-a05c-8ade6cd59270-kube-api-access-lsc98\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: 
\"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.184468 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.184630 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.184726 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsc98\" (UniqueName: \"kubernetes.io/projected/7dc38b61-4933-487d-a05c-8ade6cd59270-kube-api-access-lsc98\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.184832 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.184956 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.185025 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-ceph\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.856585 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.858062 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.859502 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsc98\" (UniqueName: \"kubernetes.io/projected/7dc38b61-4933-487d-a05c-8ade6cd59270-kube-api-access-lsc98\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: 
\"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.864199 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-ceph\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.866019 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:52 crc kubenswrapper[4755]: I0317 01:35:52.867109 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-c5btn\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:53 crc kubenswrapper[4755]: I0317 01:35:53.150207 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:35:53 crc kubenswrapper[4755]: I0317 01:35:53.516414 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn"] Mar 17 01:35:53 crc kubenswrapper[4755]: W0317 01:35:53.517670 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dc38b61_4933_487d_a05c_8ade6cd59270.slice/crio-9a2ec850a1f8149d538b73b66b778771d91a5291123f58bf0e2c3154d6c81751 WatchSource:0}: Error finding container 9a2ec850a1f8149d538b73b66b778771d91a5291123f58bf0e2c3154d6c81751: Status 404 returned error can't find the container with id 9a2ec850a1f8149d538b73b66b778771d91a5291123f58bf0e2c3154d6c81751 Mar 17 01:35:53 crc kubenswrapper[4755]: I0317 01:35:53.853347 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zzrw" event={"ID":"f950e094-8ad4-4304-93a2-745f5a924e5c","Type":"ContainerStarted","Data":"86b5f81c9d94da332c3415bde4c8793d37157f4827cb2c82db1e44b1fbf42fee"} Mar 17 01:35:53 crc kubenswrapper[4755]: I0317 01:35:53.855996 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" event={"ID":"7dc38b61-4933-487d-a05c-8ade6cd59270","Type":"ContainerStarted","Data":"9a2ec850a1f8149d538b73b66b778771d91a5291123f58bf0e2c3154d6c81751"} Mar 17 01:35:55 crc kubenswrapper[4755]: I0317 01:35:55.885409 4755 generic.go:334] "Generic (PLEG): container finished" podID="f950e094-8ad4-4304-93a2-745f5a924e5c" containerID="86b5f81c9d94da332c3415bde4c8793d37157f4827cb2c82db1e44b1fbf42fee" exitCode=0 Mar 17 01:35:55 crc kubenswrapper[4755]: I0317 01:35:55.885983 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zzrw" 
event={"ID":"f950e094-8ad4-4304-93a2-745f5a924e5c","Type":"ContainerDied","Data":"86b5f81c9d94da332c3415bde4c8793d37157f4827cb2c82db1e44b1fbf42fee"} Mar 17 01:35:55 crc kubenswrapper[4755]: I0317 01:35:55.890689 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" event={"ID":"7dc38b61-4933-487d-a05c-8ade6cd59270","Type":"ContainerStarted","Data":"793e3cb544bcaa9e5996fafedf375bb2985b3b4341b2862f8e91f99b6375f3fb"} Mar 17 01:35:55 crc kubenswrapper[4755]: I0317 01:35:55.959526 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" podStartSLOduration=4.392448307 podStartE2EDuration="4.959500789s" podCreationTimestamp="2026-03-17 01:35:51 +0000 UTC" firstStartedPulling="2026-03-17 01:35:53.521785732 +0000 UTC m=+4428.281238015" lastFinishedPulling="2026-03-17 01:35:54.088838204 +0000 UTC m=+4428.848290497" observedRunningTime="2026-03-17 01:35:55.93889095 +0000 UTC m=+4430.698343243" watchObservedRunningTime="2026-03-17 01:35:55.959500789 +0000 UTC m=+4430.718953122" Mar 17 01:35:56 crc kubenswrapper[4755]: I0317 01:35:56.903924 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zzrw" event={"ID":"f950e094-8ad4-4304-93a2-745f5a924e5c","Type":"ContainerStarted","Data":"d47ec69e2f253b8f7e94ce1a91f286006c4fb7bd3826956040ad7d4cff188f8e"} Mar 17 01:35:56 crc kubenswrapper[4755]: I0317 01:35:56.931034 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9zzrw" podStartSLOduration=3.457017996 podStartE2EDuration="7.93101322s" podCreationTimestamp="2026-03-17 01:35:49 +0000 UTC" firstStartedPulling="2026-03-17 01:35:51.823397702 +0000 UTC m=+4426.582850025" lastFinishedPulling="2026-03-17 01:35:56.297392946 +0000 UTC m=+4431.056845249" observedRunningTime="2026-03-17 01:35:56.929413167 +0000 UTC 
m=+4431.688865490" watchObservedRunningTime="2026-03-17 01:35:56.93101322 +0000 UTC m=+4431.690465513" Mar 17 01:36:00 crc kubenswrapper[4755]: I0317 01:36:00.148138 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561856-5w7ms"] Mar 17 01:36:00 crc kubenswrapper[4755]: I0317 01:36:00.150351 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561856-5w7ms" Mar 17 01:36:00 crc kubenswrapper[4755]: I0317 01:36:00.153984 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:36:00 crc kubenswrapper[4755]: I0317 01:36:00.155518 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:36:00 crc kubenswrapper[4755]: I0317 01:36:00.155627 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:36:00 crc kubenswrapper[4755]: I0317 01:36:00.177422 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561856-5w7ms"] Mar 17 01:36:00 crc kubenswrapper[4755]: I0317 01:36:00.266862 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9zzrw" Mar 17 01:36:00 crc kubenswrapper[4755]: I0317 01:36:00.266915 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9zzrw" Mar 17 01:36:00 crc kubenswrapper[4755]: I0317 01:36:00.281830 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhxp8\" (UniqueName: \"kubernetes.io/projected/4f097fb2-0d82-4452-9e6a-46cd03d069bb-kube-api-access-nhxp8\") pod \"auto-csr-approver-29561856-5w7ms\" (UID: \"4f097fb2-0d82-4452-9e6a-46cd03d069bb\") " pod="openshift-infra/auto-csr-approver-29561856-5w7ms" Mar 17 
01:36:00 crc kubenswrapper[4755]: I0317 01:36:00.360835 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9zzrw" Mar 17 01:36:00 crc kubenswrapper[4755]: I0317 01:36:00.384199 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhxp8\" (UniqueName: \"kubernetes.io/projected/4f097fb2-0d82-4452-9e6a-46cd03d069bb-kube-api-access-nhxp8\") pod \"auto-csr-approver-29561856-5w7ms\" (UID: \"4f097fb2-0d82-4452-9e6a-46cd03d069bb\") " pod="openshift-infra/auto-csr-approver-29561856-5w7ms" Mar 17 01:36:00 crc kubenswrapper[4755]: I0317 01:36:00.430997 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhxp8\" (UniqueName: \"kubernetes.io/projected/4f097fb2-0d82-4452-9e6a-46cd03d069bb-kube-api-access-nhxp8\") pod \"auto-csr-approver-29561856-5w7ms\" (UID: \"4f097fb2-0d82-4452-9e6a-46cd03d069bb\") " pod="openshift-infra/auto-csr-approver-29561856-5w7ms" Mar 17 01:36:00 crc kubenswrapper[4755]: I0317 01:36:00.480052 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561856-5w7ms" Mar 17 01:36:00 crc kubenswrapper[4755]: W0317 01:36:00.965418 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f097fb2_0d82_4452_9e6a_46cd03d069bb.slice/crio-f341d88a546d058e85b7cdc5bbf347b442d81676f8769cfd2d1c164493a5ad06 WatchSource:0}: Error finding container f341d88a546d058e85b7cdc5bbf347b442d81676f8769cfd2d1c164493a5ad06: Status 404 returned error can't find the container with id f341d88a546d058e85b7cdc5bbf347b442d81676f8769cfd2d1c164493a5ad06 Mar 17 01:36:00 crc kubenswrapper[4755]: I0317 01:36:00.965752 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561856-5w7ms"] Mar 17 01:36:01 crc kubenswrapper[4755]: I0317 01:36:01.965587 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561856-5w7ms" event={"ID":"4f097fb2-0d82-4452-9e6a-46cd03d069bb","Type":"ContainerStarted","Data":"f341d88a546d058e85b7cdc5bbf347b442d81676f8769cfd2d1c164493a5ad06"} Mar 17 01:36:02 crc kubenswrapper[4755]: I0317 01:36:02.984864 4755 generic.go:334] "Generic (PLEG): container finished" podID="4f097fb2-0d82-4452-9e6a-46cd03d069bb" containerID="1d6814c4d8fa6b73686c0468199839bd46390b5bd665417f155c9489484a4c96" exitCode=0 Mar 17 01:36:02 crc kubenswrapper[4755]: I0317 01:36:02.985099 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561856-5w7ms" event={"ID":"4f097fb2-0d82-4452-9e6a-46cd03d069bb","Type":"ContainerDied","Data":"1d6814c4d8fa6b73686c0468199839bd46390b5bd665417f155c9489484a4c96"} Mar 17 01:36:04 crc kubenswrapper[4755]: I0317 01:36:04.980064 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561856-5w7ms" Mar 17 01:36:05 crc kubenswrapper[4755]: I0317 01:36:05.004782 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561856-5w7ms" event={"ID":"4f097fb2-0d82-4452-9e6a-46cd03d069bb","Type":"ContainerDied","Data":"f341d88a546d058e85b7cdc5bbf347b442d81676f8769cfd2d1c164493a5ad06"} Mar 17 01:36:05 crc kubenswrapper[4755]: I0317 01:36:05.004818 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f341d88a546d058e85b7cdc5bbf347b442d81676f8769cfd2d1c164493a5ad06" Mar 17 01:36:05 crc kubenswrapper[4755]: I0317 01:36:05.004892 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561856-5w7ms" Mar 17 01:36:05 crc kubenswrapper[4755]: I0317 01:36:05.120608 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhxp8\" (UniqueName: \"kubernetes.io/projected/4f097fb2-0d82-4452-9e6a-46cd03d069bb-kube-api-access-nhxp8\") pod \"4f097fb2-0d82-4452-9e6a-46cd03d069bb\" (UID: \"4f097fb2-0d82-4452-9e6a-46cd03d069bb\") " Mar 17 01:36:05 crc kubenswrapper[4755]: I0317 01:36:05.127827 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f097fb2-0d82-4452-9e6a-46cd03d069bb-kube-api-access-nhxp8" (OuterVolumeSpecName: "kube-api-access-nhxp8") pod "4f097fb2-0d82-4452-9e6a-46cd03d069bb" (UID: "4f097fb2-0d82-4452-9e6a-46cd03d069bb"). InnerVolumeSpecName "kube-api-access-nhxp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:36:05 crc kubenswrapper[4755]: I0317 01:36:05.223751 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhxp8\" (UniqueName: \"kubernetes.io/projected/4f097fb2-0d82-4452-9e6a-46cd03d069bb-kube-api-access-nhxp8\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:06 crc kubenswrapper[4755]: I0317 01:36:06.088112 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561850-hlrqm"] Mar 17 01:36:06 crc kubenswrapper[4755]: I0317 01:36:06.112532 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561850-hlrqm"] Mar 17 01:36:06 crc kubenswrapper[4755]: I0317 01:36:06.299281 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d18e884a-6fc0-49f4-9824-8d6fec65fa9a" path="/var/lib/kubelet/pods/d18e884a-6fc0-49f4-9824-8d6fec65fa9a/volumes" Mar 17 01:36:09 crc kubenswrapper[4755]: I0317 01:36:09.070937 4755 generic.go:334] "Generic (PLEG): container finished" podID="7dc38b61-4933-487d-a05c-8ade6cd59270" containerID="793e3cb544bcaa9e5996fafedf375bb2985b3b4341b2862f8e91f99b6375f3fb" exitCode=0 Mar 17 01:36:09 crc kubenswrapper[4755]: I0317 01:36:09.071013 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" event={"ID":"7dc38b61-4933-487d-a05c-8ade6cd59270","Type":"ContainerDied","Data":"793e3cb544bcaa9e5996fafedf375bb2985b3b4341b2862f8e91f99b6375f3fb"} Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.324908 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9zzrw" Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.404395 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9zzrw"] Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.622903 4755 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.692820 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-ssh-key-openstack-edpm-ipam\") pod \"7dc38b61-4933-487d-a05c-8ade6cd59270\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.693244 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-inventory\") pod \"7dc38b61-4933-487d-a05c-8ade6cd59270\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.693501 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-logging-compute-config-data-1\") pod \"7dc38b61-4933-487d-a05c-8ade6cd59270\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.693531 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-ceph\") pod \"7dc38b61-4933-487d-a05c-8ade6cd59270\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.693653 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-logging-compute-config-data-0\") pod \"7dc38b61-4933-487d-a05c-8ade6cd59270\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.693724 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsc98\" (UniqueName: \"kubernetes.io/projected/7dc38b61-4933-487d-a05c-8ade6cd59270-kube-api-access-lsc98\") pod \"7dc38b61-4933-487d-a05c-8ade6cd59270\" (UID: \"7dc38b61-4933-487d-a05c-8ade6cd59270\") " Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.701482 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dc38b61-4933-487d-a05c-8ade6cd59270-kube-api-access-lsc98" (OuterVolumeSpecName: "kube-api-access-lsc98") pod "7dc38b61-4933-487d-a05c-8ade6cd59270" (UID: "7dc38b61-4933-487d-a05c-8ade6cd59270"). InnerVolumeSpecName "kube-api-access-lsc98". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.705413 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-ceph" (OuterVolumeSpecName: "ceph") pod "7dc38b61-4933-487d-a05c-8ade6cd59270" (UID: "7dc38b61-4933-487d-a05c-8ade6cd59270"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.736732 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7dc38b61-4933-487d-a05c-8ade6cd59270" (UID: "7dc38b61-4933-487d-a05c-8ade6cd59270"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.738402 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "7dc38b61-4933-487d-a05c-8ade6cd59270" (UID: "7dc38b61-4933-487d-a05c-8ade6cd59270"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.754604 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-inventory" (OuterVolumeSpecName: "inventory") pod "7dc38b61-4933-487d-a05c-8ade6cd59270" (UID: "7dc38b61-4933-487d-a05c-8ade6cd59270"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.756620 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "7dc38b61-4933-487d-a05c-8ade6cd59270" (UID: "7dc38b61-4933-487d-a05c-8ade6cd59270"). InnerVolumeSpecName "logging-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.795925 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.795955 4755 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.795966 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsc98\" (UniqueName: \"kubernetes.io/projected/7dc38b61-4933-487d-a05c-8ade6cd59270-kube-api-access-lsc98\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.795979 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.795989 4755 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:10 crc kubenswrapper[4755]: I0317 01:36:10.795998 4755 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7dc38b61-4933-487d-a05c-8ade6cd59270-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:11 crc kubenswrapper[4755]: I0317 01:36:11.097601 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" 
event={"ID":"7dc38b61-4933-487d-a05c-8ade6cd59270","Type":"ContainerDied","Data":"9a2ec850a1f8149d538b73b66b778771d91a5291123f58bf0e2c3154d6c81751"} Mar 17 01:36:11 crc kubenswrapper[4755]: I0317 01:36:11.097685 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a2ec850a1f8149d538b73b66b778771d91a5291123f58bf0e2c3154d6c81751" Mar 17 01:36:11 crc kubenswrapper[4755]: I0317 01:36:11.097703 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9zzrw" podUID="f950e094-8ad4-4304-93a2-745f5a924e5c" containerName="registry-server" containerID="cri-o://d47ec69e2f253b8f7e94ce1a91f286006c4fb7bd3826956040ad7d4cff188f8e" gracePeriod=2 Mar 17 01:36:11 crc kubenswrapper[4755]: I0317 01:36:11.097828 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-c5btn" Mar 17 01:36:11 crc kubenswrapper[4755]: I0317 01:36:11.606465 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9zzrw" Mar 17 01:36:11 crc kubenswrapper[4755]: I0317 01:36:11.715243 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f950e094-8ad4-4304-93a2-745f5a924e5c-utilities\") pod \"f950e094-8ad4-4304-93a2-745f5a924e5c\" (UID: \"f950e094-8ad4-4304-93a2-745f5a924e5c\") " Mar 17 01:36:11 crc kubenswrapper[4755]: I0317 01:36:11.715303 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj5bh\" (UniqueName: \"kubernetes.io/projected/f950e094-8ad4-4304-93a2-745f5a924e5c-kube-api-access-dj5bh\") pod \"f950e094-8ad4-4304-93a2-745f5a924e5c\" (UID: \"f950e094-8ad4-4304-93a2-745f5a924e5c\") " Mar 17 01:36:11 crc kubenswrapper[4755]: I0317 01:36:11.715395 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f950e094-8ad4-4304-93a2-745f5a924e5c-catalog-content\") pod \"f950e094-8ad4-4304-93a2-745f5a924e5c\" (UID: \"f950e094-8ad4-4304-93a2-745f5a924e5c\") " Mar 17 01:36:11 crc kubenswrapper[4755]: I0317 01:36:11.716150 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f950e094-8ad4-4304-93a2-745f5a924e5c-utilities" (OuterVolumeSpecName: "utilities") pod "f950e094-8ad4-4304-93a2-745f5a924e5c" (UID: "f950e094-8ad4-4304-93a2-745f5a924e5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:36:11 crc kubenswrapper[4755]: I0317 01:36:11.740739 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f950e094-8ad4-4304-93a2-745f5a924e5c-kube-api-access-dj5bh" (OuterVolumeSpecName: "kube-api-access-dj5bh") pod "f950e094-8ad4-4304-93a2-745f5a924e5c" (UID: "f950e094-8ad4-4304-93a2-745f5a924e5c"). InnerVolumeSpecName "kube-api-access-dj5bh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:36:11 crc kubenswrapper[4755]: I0317 01:36:11.797707 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f950e094-8ad4-4304-93a2-745f5a924e5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f950e094-8ad4-4304-93a2-745f5a924e5c" (UID: "f950e094-8ad4-4304-93a2-745f5a924e5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:36:11 crc kubenswrapper[4755]: I0317 01:36:11.818188 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f950e094-8ad4-4304-93a2-745f5a924e5c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:11 crc kubenswrapper[4755]: I0317 01:36:11.818249 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f950e094-8ad4-4304-93a2-745f5a924e5c-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:11 crc kubenswrapper[4755]: I0317 01:36:11.818281 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj5bh\" (UniqueName: \"kubernetes.io/projected/f950e094-8ad4-4304-93a2-745f5a924e5c-kube-api-access-dj5bh\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:12 crc kubenswrapper[4755]: I0317 01:36:12.114878 4755 generic.go:334] "Generic (PLEG): container finished" podID="f950e094-8ad4-4304-93a2-745f5a924e5c" containerID="d47ec69e2f253b8f7e94ce1a91f286006c4fb7bd3826956040ad7d4cff188f8e" exitCode=0 Mar 17 01:36:12 crc kubenswrapper[4755]: I0317 01:36:12.114953 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zzrw" event={"ID":"f950e094-8ad4-4304-93a2-745f5a924e5c","Type":"ContainerDied","Data":"d47ec69e2f253b8f7e94ce1a91f286006c4fb7bd3826956040ad7d4cff188f8e"} Mar 17 01:36:12 crc kubenswrapper[4755]: I0317 01:36:12.115005 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-9zzrw" event={"ID":"f950e094-8ad4-4304-93a2-745f5a924e5c","Type":"ContainerDied","Data":"61fdc53453901e601e1f6ab7bdb353421a15096f165a6459ce2e1cb51930bf6e"} Mar 17 01:36:12 crc kubenswrapper[4755]: I0317 01:36:12.115047 4755 scope.go:117] "RemoveContainer" containerID="d47ec69e2f253b8f7e94ce1a91f286006c4fb7bd3826956040ad7d4cff188f8e" Mar 17 01:36:12 crc kubenswrapper[4755]: I0317 01:36:12.115284 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zzrw" Mar 17 01:36:12 crc kubenswrapper[4755]: I0317 01:36:12.171217 4755 scope.go:117] "RemoveContainer" containerID="86b5f81c9d94da332c3415bde4c8793d37157f4827cb2c82db1e44b1fbf42fee" Mar 17 01:36:12 crc kubenswrapper[4755]: I0317 01:36:12.183690 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9zzrw"] Mar 17 01:36:12 crc kubenswrapper[4755]: I0317 01:36:12.201639 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9zzrw"] Mar 17 01:36:12 crc kubenswrapper[4755]: I0317 01:36:12.211779 4755 scope.go:117] "RemoveContainer" containerID="fea390751438376b940f94de8ad042a4d8dd1362949a245ecf9c86f7ddd3fdc9" Mar 17 01:36:12 crc kubenswrapper[4755]: I0317 01:36:12.269685 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f950e094-8ad4-4304-93a2-745f5a924e5c" path="/var/lib/kubelet/pods/f950e094-8ad4-4304-93a2-745f5a924e5c/volumes" Mar 17 01:36:12 crc kubenswrapper[4755]: I0317 01:36:12.273214 4755 scope.go:117] "RemoveContainer" containerID="d47ec69e2f253b8f7e94ce1a91f286006c4fb7bd3826956040ad7d4cff188f8e" Mar 17 01:36:12 crc kubenswrapper[4755]: E0317 01:36:12.273858 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d47ec69e2f253b8f7e94ce1a91f286006c4fb7bd3826956040ad7d4cff188f8e\": container with ID 
starting with d47ec69e2f253b8f7e94ce1a91f286006c4fb7bd3826956040ad7d4cff188f8e not found: ID does not exist" containerID="d47ec69e2f253b8f7e94ce1a91f286006c4fb7bd3826956040ad7d4cff188f8e" Mar 17 01:36:12 crc kubenswrapper[4755]: I0317 01:36:12.273908 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d47ec69e2f253b8f7e94ce1a91f286006c4fb7bd3826956040ad7d4cff188f8e"} err="failed to get container status \"d47ec69e2f253b8f7e94ce1a91f286006c4fb7bd3826956040ad7d4cff188f8e\": rpc error: code = NotFound desc = could not find container \"d47ec69e2f253b8f7e94ce1a91f286006c4fb7bd3826956040ad7d4cff188f8e\": container with ID starting with d47ec69e2f253b8f7e94ce1a91f286006c4fb7bd3826956040ad7d4cff188f8e not found: ID does not exist" Mar 17 01:36:12 crc kubenswrapper[4755]: I0317 01:36:12.273951 4755 scope.go:117] "RemoveContainer" containerID="86b5f81c9d94da332c3415bde4c8793d37157f4827cb2c82db1e44b1fbf42fee" Mar 17 01:36:12 crc kubenswrapper[4755]: E0317 01:36:12.280269 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86b5f81c9d94da332c3415bde4c8793d37157f4827cb2c82db1e44b1fbf42fee\": container with ID starting with 86b5f81c9d94da332c3415bde4c8793d37157f4827cb2c82db1e44b1fbf42fee not found: ID does not exist" containerID="86b5f81c9d94da332c3415bde4c8793d37157f4827cb2c82db1e44b1fbf42fee" Mar 17 01:36:12 crc kubenswrapper[4755]: I0317 01:36:12.280310 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b5f81c9d94da332c3415bde4c8793d37157f4827cb2c82db1e44b1fbf42fee"} err="failed to get container status \"86b5f81c9d94da332c3415bde4c8793d37157f4827cb2c82db1e44b1fbf42fee\": rpc error: code = NotFound desc = could not find container \"86b5f81c9d94da332c3415bde4c8793d37157f4827cb2c82db1e44b1fbf42fee\": container with ID starting with 86b5f81c9d94da332c3415bde4c8793d37157f4827cb2c82db1e44b1fbf42fee not found: 
ID does not exist" Mar 17 01:36:12 crc kubenswrapper[4755]: I0317 01:36:12.280340 4755 scope.go:117] "RemoveContainer" containerID="fea390751438376b940f94de8ad042a4d8dd1362949a245ecf9c86f7ddd3fdc9" Mar 17 01:36:12 crc kubenswrapper[4755]: E0317 01:36:12.280978 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fea390751438376b940f94de8ad042a4d8dd1362949a245ecf9c86f7ddd3fdc9\": container with ID starting with fea390751438376b940f94de8ad042a4d8dd1362949a245ecf9c86f7ddd3fdc9 not found: ID does not exist" containerID="fea390751438376b940f94de8ad042a4d8dd1362949a245ecf9c86f7ddd3fdc9" Mar 17 01:36:12 crc kubenswrapper[4755]: I0317 01:36:12.281006 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fea390751438376b940f94de8ad042a4d8dd1362949a245ecf9c86f7ddd3fdc9"} err="failed to get container status \"fea390751438376b940f94de8ad042a4d8dd1362949a245ecf9c86f7ddd3fdc9\": rpc error: code = NotFound desc = could not find container \"fea390751438376b940f94de8ad042a4d8dd1362949a245ecf9c86f7ddd3fdc9\": container with ID starting with fea390751438376b940f94de8ad042a4d8dd1362949a245ecf9c86f7ddd3fdc9 not found: ID does not exist" Mar 17 01:36:27 crc kubenswrapper[4755]: I0317 01:36:27.920778 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 17 01:36:27 crc kubenswrapper[4755]: E0317 01:36:27.921878 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f097fb2-0d82-4452-9e6a-46cd03d069bb" containerName="oc" Mar 17 01:36:27 crc kubenswrapper[4755]: I0317 01:36:27.921897 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f097fb2-0d82-4452-9e6a-46cd03d069bb" containerName="oc" Mar 17 01:36:27 crc kubenswrapper[4755]: E0317 01:36:27.921929 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc38b61-4933-487d-a05c-8ade6cd59270" 
containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 17 01:36:27 crc kubenswrapper[4755]: I0317 01:36:27.921940 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc38b61-4933-487d-a05c-8ade6cd59270" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 17 01:36:27 crc kubenswrapper[4755]: E0317 01:36:27.921958 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f950e094-8ad4-4304-93a2-745f5a924e5c" containerName="extract-content" Mar 17 01:36:27 crc kubenswrapper[4755]: I0317 01:36:27.921965 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f950e094-8ad4-4304-93a2-745f5a924e5c" containerName="extract-content" Mar 17 01:36:27 crc kubenswrapper[4755]: E0317 01:36:27.921990 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f950e094-8ad4-4304-93a2-745f5a924e5c" containerName="registry-server" Mar 17 01:36:27 crc kubenswrapper[4755]: I0317 01:36:27.921998 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f950e094-8ad4-4304-93a2-745f5a924e5c" containerName="registry-server" Mar 17 01:36:27 crc kubenswrapper[4755]: E0317 01:36:27.922020 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f950e094-8ad4-4304-93a2-745f5a924e5c" containerName="extract-utilities" Mar 17 01:36:27 crc kubenswrapper[4755]: I0317 01:36:27.922029 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f950e094-8ad4-4304-93a2-745f5a924e5c" containerName="extract-utilities" Mar 17 01:36:27 crc kubenswrapper[4755]: I0317 01:36:27.922232 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f950e094-8ad4-4304-93a2-745f5a924e5c" containerName="registry-server" Mar 17 01:36:27 crc kubenswrapper[4755]: I0317 01:36:27.922250 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dc38b61-4933-487d-a05c-8ade6cd59270" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 17 01:36:27 crc kubenswrapper[4755]: I0317 01:36:27.922271 4755 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4f097fb2-0d82-4452-9e6a-46cd03d069bb" containerName="oc" Mar 17 01:36:27 crc kubenswrapper[4755]: I0317 01:36:27.923668 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:27 crc kubenswrapper[4755]: I0317 01:36:27.927270 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 17 01:36:27 crc kubenswrapper[4755]: I0317 01:36:27.927808 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Mar 17 01:36:27 crc kubenswrapper[4755]: I0317 01:36:27.937147 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Mar 17 01:36:27 crc kubenswrapper[4755]: I0317 01:36:27.939001 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 17 01:36:27 crc kubenswrapper[4755]: I0317 01:36:27.943763 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Mar 17 01:36:27 crc kubenswrapper[4755]: I0317 01:36:27.953966 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 17 01:36:27 crc kubenswrapper[4755]: I0317 01:36:27.990145 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.023275 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.028669 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" 
(UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.028745 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-etc-nvme\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.028774 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fd360dd3-b439-453e-8543-405c8d1804b5-ceph\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.028829 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4abc0e8b-235e-48c1-8066-8958aa05a2a3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.028879 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4abc0e8b-235e-48c1-8066-8958aa05a2a3-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.028904 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2msb\" (UniqueName: 
\"kubernetes.io/projected/fd360dd3-b439-453e-8543-405c8d1804b5-kube-api-access-t2msb\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.028930 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.030354 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.030467 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4abc0e8b-235e-48c1-8066-8958aa05a2a3-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.030504 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4abc0e8b-235e-48c1-8066-8958aa05a2a3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.030560 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.030612 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.030737 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.030788 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-dev\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.030892 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-sys\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.030933 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-etc-nvme\") pod 
\"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.030975 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd360dd3-b439-453e-8543-405c8d1804b5-config-data\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.031028 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.031064 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-dev\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.031402 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd360dd3-b439-453e-8543-405c8d1804b5-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.031545 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-run\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " 
pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.031595 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd360dd3-b439-453e-8543-405c8d1804b5-config-data-custom\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.031954 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-sys\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.032034 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkxzm\" (UniqueName: \"kubernetes.io/projected/4abc0e8b-235e-48c1-8066-8958aa05a2a3-kube-api-access-hkxzm\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.032069 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.032091 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd360dd3-b439-453e-8543-405c8d1804b5-scripts\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc 
kubenswrapper[4755]: I0317 01:36:28.032127 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.032204 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4abc0e8b-235e-48c1-8066-8958aa05a2a3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.032253 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-lib-modules\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.032291 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-run\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.032425 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.136833 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.136883 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-etc-nvme\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.136907 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fd360dd3-b439-453e-8543-405c8d1804b5-ceph\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.136936 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4abc0e8b-235e-48c1-8066-8958aa05a2a3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.136958 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4abc0e8b-235e-48c1-8066-8958aa05a2a3-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.136975 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2msb\" (UniqueName: \"kubernetes.io/projected/fd360dd3-b439-453e-8543-405c8d1804b5-kube-api-access-t2msb\") pod 
\"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.136992 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137016 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137033 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4abc0e8b-235e-48c1-8066-8958aa05a2a3-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137049 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4abc0e8b-235e-48c1-8066-8958aa05a2a3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137069 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137072 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-etc-nvme\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137090 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137179 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137180 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137203 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-dev\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137198 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137251 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-dev\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137286 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137382 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-sys\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137405 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137463 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd360dd3-b439-453e-8543-405c8d1804b5-config-data\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc 
kubenswrapper[4755]: I0317 01:36:28.137500 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137526 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-dev\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137543 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd360dd3-b439-453e-8543-405c8d1804b5-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137573 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-run\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137597 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd360dd3-b439-453e-8543-405c8d1804b5-config-data-custom\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137615 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137702 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-sys\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137749 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkxzm\" (UniqueName: \"kubernetes.io/projected/4abc0e8b-235e-48c1-8066-8958aa05a2a3-kube-api-access-hkxzm\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137770 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137791 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd360dd3-b439-453e-8543-405c8d1804b5-scripts\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137811 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: 
\"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137856 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4abc0e8b-235e-48c1-8066-8958aa05a2a3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137889 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-lib-modules\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137910 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-run\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.137951 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.138019 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.138161 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.139101 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.139244 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-sys\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.139340 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.141580 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.142783 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd360dd3-b439-453e-8543-405c8d1804b5-config-data-custom\") pod \"cinder-backup-0\" (UID: 
\"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.142849 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-dev\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.143716 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd360dd3-b439-453e-8543-405c8d1804b5-config-data\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.143975 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.144078 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-lib-modules\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.144234 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.145183 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/fd360dd3-b439-453e-8543-405c8d1804b5-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.145237 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-run\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.145268 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-run\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.145292 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.145317 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4abc0e8b-235e-48c1-8066-8958aa05a2a3-sys\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.146151 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fd360dd3-b439-453e-8543-405c8d1804b5-ceph\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.146679 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/fd360dd3-b439-453e-8543-405c8d1804b5-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.146831 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd360dd3-b439-453e-8543-405c8d1804b5-scripts\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.147382 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4abc0e8b-235e-48c1-8066-8958aa05a2a3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.148036 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4abc0e8b-235e-48c1-8066-8958aa05a2a3-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.148120 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4abc0e8b-235e-48c1-8066-8958aa05a2a3-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.158341 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4abc0e8b-235e-48c1-8066-8958aa05a2a3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: 
\"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.161895 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4abc0e8b-235e-48c1-8066-8958aa05a2a3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.165109 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkxzm\" (UniqueName: \"kubernetes.io/projected/4abc0e8b-235e-48c1-8066-8958aa05a2a3-kube-api-access-hkxzm\") pod \"cinder-volume-volume1-0\" (UID: \"4abc0e8b-235e-48c1-8066-8958aa05a2a3\") " pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.165373 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2msb\" (UniqueName: \"kubernetes.io/projected/fd360dd3-b439-453e-8543-405c8d1804b5-kube-api-access-t2msb\") pod \"cinder-backup-0\" (UID: \"fd360dd3-b439-453e-8543-405c8d1804b5\") " pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.245355 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.260704 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.578254 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-wd6gn"] Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.579802 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-wd6gn" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.591770 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-wd6gn"] Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.655161 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9-operator-scripts\") pod \"manila-db-create-wd6gn\" (UID: \"34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9\") " pod="openstack/manila-db-create-wd6gn" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.655299 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2xqk\" (UniqueName: \"kubernetes.io/projected/34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9-kube-api-access-j2xqk\") pod \"manila-db-create-wd6gn\" (UID: \"34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9\") " pod="openstack/manila-db-create-wd6gn" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.664887 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.664927 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.692662 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-faeb-account-create-update-45nfl"] Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 
01:36:28.694146 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-faeb-account-create-update-45nfl" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.703921 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.707057 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-faeb-account-create-update-45nfl"] Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.762917 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.765108 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.766607 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9-operator-scripts\") pod \"manila-db-create-wd6gn\" (UID: \"34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9\") " pod="openstack/manila-db-create-wd6gn" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.766907 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2xqk\" (UniqueName: \"kubernetes.io/projected/34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9-kube-api-access-j2xqk\") pod \"manila-db-create-wd6gn\" (UID: \"34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9\") " pod="openstack/manila-db-create-wd6gn" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.767986 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9-operator-scripts\") pod \"manila-db-create-wd6gn\" (UID: \"34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9\") " pod="openstack/manila-db-create-wd6gn" Mar 17 01:36:28 crc 
kubenswrapper[4755]: I0317 01:36:28.769154 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.769421 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rjgsw" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.770964 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.771123 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.780876 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-587c6f496f-f97jz"] Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.785920 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.789302 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-bcjxj" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.789535 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.790814 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.792889 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2xqk\" (UniqueName: \"kubernetes.io/projected/34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9-kube-api-access-j2xqk\") pod \"manila-db-create-wd6gn\" (UID: \"34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9\") " pod="openstack/manila-db-create-wd6gn" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.794475 4755 reflector.go:368] Caches populated 
for *v1.Secret from object-"openstack"/"horizon" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.808616 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.841536 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-587c6f496f-f97jz"] Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.868794 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.868863 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.868896 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7388243-204f-4d8a-b842-2529524f0568-operator-scripts\") pod \"manila-faeb-account-create-update-45nfl\" (UID: \"a7388243-204f-4d8a-b842-2529524f0568\") " pod="openstack/manila-faeb-account-create-update-45nfl" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.868940 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-logs\") pod \"horizon-587c6f496f-f97jz\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " pod="openstack/horizon-587c6f496f-f97jz" 
Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.868962 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-horizon-secret-key\") pod \"horizon-587c6f496f-f97jz\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.868989 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-ceph\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.869008 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54rnz\" (UniqueName: \"kubernetes.io/projected/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-kube-api-access-54rnz\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.869023 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.869045 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pffk6\" (UniqueName: \"kubernetes.io/projected/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-kube-api-access-pffk6\") pod \"horizon-587c6f496f-f97jz\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " 
pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.869062 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.869105 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgxcq\" (UniqueName: \"kubernetes.io/projected/a7388243-204f-4d8a-b842-2529524f0568-kube-api-access-dgxcq\") pod \"manila-faeb-account-create-update-45nfl\" (UID: \"a7388243-204f-4d8a-b842-2529524f0568\") " pod="openstack/manila-faeb-account-create-update-45nfl" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.869124 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-config-data\") pod \"horizon-587c6f496f-f97jz\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.869141 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.869174 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: 
\"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.869211 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-scripts\") pod \"horizon-587c6f496f-f97jz\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.869233 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-logs\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.883333 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.885572 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.891763 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.892047 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.896154 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.896989 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-wd6gn" Mar 17 01:36:28 crc kubenswrapper[4755]: E0317 01:36:28.898188 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run kube-api-access-54rnz logs public-tls-certs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.959200 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972490 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hszpv\" (UniqueName: \"kubernetes.io/projected/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-kube-api-access-hszpv\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972559 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-logs\") pod \"horizon-587c6f496f-f97jz\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972582 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-horizon-secret-key\") pod \"horizon-587c6f496f-f97jz\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972621 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-ceph\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972640 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54rnz\" (UniqueName: \"kubernetes.io/projected/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-kube-api-access-54rnz\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972655 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972681 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pffk6\" (UniqueName: \"kubernetes.io/projected/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-kube-api-access-pffk6\") pod \"horizon-587c6f496f-f97jz\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972700 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972732 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972765 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgxcq\" (UniqueName: \"kubernetes.io/projected/a7388243-204f-4d8a-b842-2529524f0568-kube-api-access-dgxcq\") pod \"manila-faeb-account-create-update-45nfl\" (UID: \"a7388243-204f-4d8a-b842-2529524f0568\") " pod="openstack/manila-faeb-account-create-update-45nfl" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972782 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-config-data\") pod \"horizon-587c6f496f-f97jz\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972798 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972827 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972849 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972865 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972886 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972914 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972931 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-scripts\") pod \"horizon-587c6f496f-f97jz\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972953 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-logs\") pod 
\"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.972999 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.973022 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.973047 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.973063 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.973086 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.973112 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7388243-204f-4d8a-b842-2529524f0568-operator-scripts\") pod \"manila-faeb-account-create-update-45nfl\" (UID: \"a7388243-204f-4d8a-b842-2529524f0568\") " pod="openstack/manila-faeb-account-create-update-45nfl" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.974054 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7388243-204f-4d8a-b842-2529524f0568-operator-scripts\") pod \"manila-faeb-account-create-update-45nfl\" (UID: \"a7388243-204f-4d8a-b842-2529524f0568\") " pod="openstack/manila-faeb-account-create-update-45nfl" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.974311 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-logs\") pod \"horizon-587c6f496f-f97jz\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.977004 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-logs\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.977905 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") device mount 
path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.981776 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.982629 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-ceph\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.983892 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-config-data\") pod \"horizon-587c6f496f-f97jz\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.985101 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.985663 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-scripts\") pod \"horizon-587c6f496f-f97jz\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.988013 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-horizon-secret-key\") pod \"horizon-587c6f496f-f97jz\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.991216 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:28 crc kubenswrapper[4755]: I0317 01:36:28.999640 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.000276 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.019129 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54rnz\" (UniqueName: \"kubernetes.io/projected/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-kube-api-access-54rnz\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.019701 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgxcq\" 
(UniqueName: \"kubernetes.io/projected/a7388243-204f-4d8a-b842-2529524f0568-kube-api-access-dgxcq\") pod \"manila-faeb-account-create-update-45nfl\" (UID: \"a7388243-204f-4d8a-b842-2529524f0568\") " pod="openstack/manila-faeb-account-create-update-45nfl" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.036745 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-faeb-account-create-update-45nfl" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.038218 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pffk6\" (UniqueName: \"kubernetes.io/projected/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-kube-api-access-pffk6\") pod \"horizon-587c6f496f-f97jz\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.054631 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.076240 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.076278 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc 
kubenswrapper[4755]: I0317 01:36:29.076490 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.076816 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.077269 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hszpv\" (UniqueName: \"kubernetes.io/projected/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-kube-api-access-hszpv\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.077357 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.077413 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.077454 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.077479 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.078035 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.082176 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.085004 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.091632 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.094404 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.102244 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hszpv\" (UniqueName: \"kubernetes.io/projected/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-kube-api-access-hszpv\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.103369 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.107686 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.112815 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.119301 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:36:29 crc kubenswrapper[4755]: E0317 01:36:29.120197 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="4e60b73b-06f4-41b1-9fd7-5b137253b7d3" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.130414 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-656df98fd5-c4xrz"] Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.132713 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.133787 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.158868 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-656df98fd5-c4xrz"] Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.171373 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.182207 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.184093 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56666739-d2a6-4842-8d9c-27ad101c9253-config-data\") pod \"horizon-656df98fd5-c4xrz\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.184153 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56666739-d2a6-4842-8d9c-27ad101c9253-scripts\") pod \"horizon-656df98fd5-c4xrz\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.184205 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56666739-d2a6-4842-8d9c-27ad101c9253-logs\") pod \"horizon-656df98fd5-c4xrz\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.184280 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6987\" (UniqueName: \"kubernetes.io/projected/56666739-d2a6-4842-8d9c-27ad101c9253-kube-api-access-n6987\") pod \"horizon-656df98fd5-c4xrz\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.184357 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/56666739-d2a6-4842-8d9c-27ad101c9253-horizon-secret-key\") pod \"horizon-656df98fd5-c4xrz\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.289890 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56666739-d2a6-4842-8d9c-27ad101c9253-horizon-secret-key\") pod \"horizon-656df98fd5-c4xrz\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.290295 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56666739-d2a6-4842-8d9c-27ad101c9253-config-data\") pod \"horizon-656df98fd5-c4xrz\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.290346 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56666739-d2a6-4842-8d9c-27ad101c9253-scripts\") pod \"horizon-656df98fd5-c4xrz\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.290414 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56666739-d2a6-4842-8d9c-27ad101c9253-logs\") pod \"horizon-656df98fd5-c4xrz\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.290546 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6987\" (UniqueName: \"kubernetes.io/projected/56666739-d2a6-4842-8d9c-27ad101c9253-kube-api-access-n6987\") pod \"horizon-656df98fd5-c4xrz\" 
(UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.292308 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56666739-d2a6-4842-8d9c-27ad101c9253-scripts\") pod \"horizon-656df98fd5-c4xrz\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.292946 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56666739-d2a6-4842-8d9c-27ad101c9253-logs\") pod \"horizon-656df98fd5-c4xrz\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.293637 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56666739-d2a6-4842-8d9c-27ad101c9253-config-data\") pod \"horizon-656df98fd5-c4xrz\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.298472 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56666739-d2a6-4842-8d9c-27ad101c9253-horizon-secret-key\") pod \"horizon-656df98fd5-c4xrz\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.320505 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6987\" (UniqueName: \"kubernetes.io/projected/56666739-d2a6-4842-8d9c-27ad101c9253-kube-api-access-n6987\") pod \"horizon-656df98fd5-c4xrz\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.370782 
4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.371375 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"4abc0e8b-235e-48c1-8066-8958aa05a2a3","Type":"ContainerStarted","Data":"a843c32c213748c6b0ed478462eaa6ecbc6ad0531afb75526f497e5f4a509968"} Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.371546 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.526894 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.634755 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-wd6gn"] Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.751938 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-faeb-account-create-update-45nfl"] Mar 17 01:36:29 crc kubenswrapper[4755]: I0317 01:36:29.780609 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-587c6f496f-f97jz"] Mar 17 01:36:29 crc kubenswrapper[4755]: W0317 01:36:29.864092 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34b74f5d_f565_4d19_b8c5_3f77e5b4eaa9.slice/crio-767b4343944d9ae51cfa96c1f5ca3e97797e9f8bf16c96eebb8d0d630874611d WatchSource:0}: Error finding container 767b4343944d9ae51cfa96c1f5ca3e97797e9f8bf16c96eebb8d0d630874611d: Status 404 returned error can't find the container with id 767b4343944d9ae51cfa96c1f5ca3e97797e9f8bf16c96eebb8d0d630874611d Mar 17 01:36:29 crc kubenswrapper[4755]: W0317 01:36:29.872090 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7388243_204f_4d8a_b842_2529524f0568.slice/crio-33209d2a56b47131344927fc5dff5249463e94b71709488fc66e6c88e166ce0b WatchSource:0}: Error finding container 33209d2a56b47131344927fc5dff5249463e94b71709488fc66e6c88e166ce0b: Status 404 returned error can't find the container with id 33209d2a56b47131344927fc5dff5249463e94b71709488fc66e6c88e166ce0b Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.081050 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.107573 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.107677 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-combined-ca-bundle\") pod \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.107727 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-httpd-run\") pod \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.108003 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hszpv\" (UniqueName: \"kubernetes.io/projected/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-kube-api-access-hszpv\") pod \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " Mar 17 
01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.108063 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-logs\") pod \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.108094 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-ceph\") pod \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.108202 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-internal-tls-certs\") pod \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.108258 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-scripts\") pod \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.108324 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-config-data\") pod \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\" (UID: \"4e60b73b-06f4-41b1-9fd7-5b137253b7d3\") " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.109277 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.110692 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-logs" (OuterVolumeSpecName: "logs") pod "4e60b73b-06f4-41b1-9fd7-5b137253b7d3" (UID: "4e60b73b-06f4-41b1-9fd7-5b137253b7d3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.111240 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4e60b73b-06f4-41b1-9fd7-5b137253b7d3" (UID: "4e60b73b-06f4-41b1-9fd7-5b137253b7d3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.112939 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.112982 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "4e60b73b-06f4-41b1-9fd7-5b137253b7d3" (UID: "4e60b73b-06f4-41b1-9fd7-5b137253b7d3"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.123920 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-config-data" (OuterVolumeSpecName: "config-data") pod "4e60b73b-06f4-41b1-9fd7-5b137253b7d3" (UID: "4e60b73b-06f4-41b1-9fd7-5b137253b7d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.123968 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-ceph" (OuterVolumeSpecName: "ceph") pod "4e60b73b-06f4-41b1-9fd7-5b137253b7d3" (UID: "4e60b73b-06f4-41b1-9fd7-5b137253b7d3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.125403 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-scripts" (OuterVolumeSpecName: "scripts") pod "4e60b73b-06f4-41b1-9fd7-5b137253b7d3" (UID: "4e60b73b-06f4-41b1-9fd7-5b137253b7d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.126352 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-kube-api-access-hszpv" (OuterVolumeSpecName: "kube-api-access-hszpv") pod "4e60b73b-06f4-41b1-9fd7-5b137253b7d3" (UID: "4e60b73b-06f4-41b1-9fd7-5b137253b7d3"). InnerVolumeSpecName "kube-api-access-hszpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.127580 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e60b73b-06f4-41b1-9fd7-5b137253b7d3" (UID: "4e60b73b-06f4-41b1-9fd7-5b137253b7d3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.127869 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4e60b73b-06f4-41b1-9fd7-5b137253b7d3" (UID: "4e60b73b-06f4-41b1-9fd7-5b137253b7d3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.210345 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-ceph\") pod \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.210399 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-scripts\") pod \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.210419 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-public-tls-certs\") pod \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.210459 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.210482 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-54rnz\" (UniqueName: \"kubernetes.io/projected/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-kube-api-access-54rnz\") pod \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.210506 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-combined-ca-bundle\") pod \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.210553 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-httpd-run\") pod \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.210572 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-logs\") pod \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.210660 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-config-data\") pod \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\" (UID: \"fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0\") " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.211207 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.211227 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.211238 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.211248 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hszpv\" (UniqueName: \"kubernetes.io/projected/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-kube-api-access-hszpv\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.211257 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.211265 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.211273 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.211280 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.211288 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e60b73b-06f4-41b1-9fd7-5b137253b7d3-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: 
I0317 01:36:30.211935 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-logs" (OuterVolumeSpecName: "logs") pod "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0" (UID: "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.213802 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0" (UID: "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.219138 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0" (UID: "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.219176 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-ceph" (OuterVolumeSpecName: "ceph") pod "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0" (UID: "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.219223 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-scripts" (OuterVolumeSpecName: "scripts") pod "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0" (UID: "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.232086 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0" (UID: "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.232117 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0" (UID: "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.232174 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-config-data" (OuterVolumeSpecName: "config-data") pod "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0" (UID: "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.232274 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-kube-api-access-54rnz" (OuterVolumeSpecName: "kube-api-access-54rnz") pod "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0" (UID: "fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0"). InnerVolumeSpecName "kube-api-access-54rnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.245932 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.313762 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.313790 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.313801 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.313809 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.313830 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.313841 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54rnz\" (UniqueName: \"kubernetes.io/projected/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-kube-api-access-54rnz\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.313850 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.313859 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.313869 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.313876 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.337205 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.390880 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587c6f496f-f97jz" event={"ID":"3a9d2fca-c4bb-4822-b6cc-78c30de14b99","Type":"ContainerStarted","Data":"bcc9ce421f395834bc05391d104b9d40ca7221c0f53095cf9d3bf00c9c4e8464"} Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.398034 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wd6gn" event={"ID":"34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9","Type":"ContainerStarted","Data":"0b7986ed18dc2e729618ad4ce64178a3cdc6195ef52f6981d2a3f708c8f2f30d"} Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.398071 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wd6gn" 
event={"ID":"34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9","Type":"ContainerStarted","Data":"767b4343944d9ae51cfa96c1f5ca3e97797e9f8bf16c96eebb8d0d630874611d"} Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.400610 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-faeb-account-create-update-45nfl" event={"ID":"a7388243-204f-4d8a-b842-2529524f0568","Type":"ContainerStarted","Data":"7e470ef9b637bb108c7817113529c16ebddba08426e47e1a06e43335b2112e45"} Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.400635 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-faeb-account-create-update-45nfl" event={"ID":"a7388243-204f-4d8a-b842-2529524f0568","Type":"ContainerStarted","Data":"33209d2a56b47131344927fc5dff5249463e94b71709488fc66e6c88e166ce0b"} Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.405803 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"fd360dd3-b439-453e-8543-405c8d1804b5","Type":"ContainerStarted","Data":"0877979297bb5deb4af0a8d2572c553c0e85443fe28864249f4c5d065211e39c"} Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.405844 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.405892 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.417658 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.445753 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-wd6gn" podStartSLOduration=2.445724533 podStartE2EDuration="2.445724533s" podCreationTimestamp="2026-03-17 01:36:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:36:30.417144229 +0000 UTC m=+4465.176596512" watchObservedRunningTime="2026-03-17 01:36:30.445724533 +0000 UTC m=+4465.205176816" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.514159 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-faeb-account-create-update-45nfl" podStartSLOduration=2.514138815 podStartE2EDuration="2.514138815s" podCreationTimestamp="2026-03-17 01:36:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:36:30.455880928 +0000 UTC m=+4465.215333211" watchObservedRunningTime="2026-03-17 01:36:30.514138815 +0000 UTC m=+4465.273591098" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.579515 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.602922 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.625050 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 
01:36:30.629042 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.631290 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rjgsw" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.631375 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.631540 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.632184 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.729514 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.730971 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09223cc8-9989-410a-a05a-84772dc058ad-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.731134 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.731220 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.731310 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-config-data\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.731386 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-scripts\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.731472 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrl92\" (UniqueName: \"kubernetes.io/projected/09223cc8-9989-410a-a05a-84772dc058ad-kube-api-access-mrl92\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.731583 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09223cc8-9989-410a-a05a-84772dc058ad-logs\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.731662 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/projected/09223cc8-9989-410a-a05a-84772dc058ad-ceph\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.731731 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.750774 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.783643 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.804810 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.810632 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.827494 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.833855 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-config-data\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.833919 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-scripts\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.833941 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrl92\" (UniqueName: \"kubernetes.io/projected/09223cc8-9989-410a-a05a-84772dc058ad-kube-api-access-mrl92\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.834056 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09223cc8-9989-410a-a05a-84772dc058ad-logs\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.834087 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/09223cc8-9989-410a-a05a-84772dc058ad-ceph\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.834120 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.834289 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09223cc8-9989-410a-a05a-84772dc058ad-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.834412 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.834473 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.835392 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09223cc8-9989-410a-a05a-84772dc058ad-logs\") pod 
\"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.836221 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09223cc8-9989-410a-a05a-84772dc058ad-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.836273 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.838127 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.838276 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.847327 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.848001 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/09223cc8-9989-410a-a05a-84772dc058ad-ceph\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " 
pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.909504 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-656df98fd5-c4xrz"] Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.916671 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-config-data\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.917476 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.929956 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-scripts\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.940485 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrl92\" (UniqueName: \"kubernetes.io/projected/09223cc8-9989-410a-a05a-84772dc058ad-kube-api-access-mrl92\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.953359 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.953421 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.953515 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1fa45f87-af3d-41e9-9aca-3e2ae1522178-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.953670 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1fa45f87-af3d-41e9-9aca-3e2ae1522178-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.953810 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.953890 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fa45f87-af3d-41e9-9aca-3e2ae1522178-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.953933 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.954029 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dwxp\" (UniqueName: \"kubernetes.io/projected/1fa45f87-af3d-41e9-9aca-3e2ae1522178-kube-api-access-6dwxp\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.954106 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:30 crc kubenswrapper[4755]: I0317 01:36:30.986641 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.055960 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.056022 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fa45f87-af3d-41e9-9aca-3e2ae1522178-logs\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.056054 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.056101 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dwxp\" (UniqueName: \"kubernetes.io/projected/1fa45f87-af3d-41e9-9aca-3e2ae1522178-kube-api-access-6dwxp\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.056144 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.056178 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.056195 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.056228 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1fa45f87-af3d-41e9-9aca-3e2ae1522178-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.056271 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1fa45f87-af3d-41e9-9aca-3e2ae1522178-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.056843 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1fa45f87-af3d-41e9-9aca-3e2ae1522178-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.057005 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") device mount path \"/mnt/openstack/pv05\"" 
pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.060295 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.063282 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.063408 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.063617 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fa45f87-af3d-41e9-9aca-3e2ae1522178-logs\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.093363 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.093816 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1fa45f87-af3d-41e9-9aca-3e2ae1522178-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.101757 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dwxp\" (UniqueName: \"kubernetes.io/projected/1fa45f87-af3d-41e9-9aca-3e2ae1522178-kube-api-access-6dwxp\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.166877 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.258812 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.285626 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.354430 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-587c6f496f-f97jz"] Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.407653 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7f6c96f776-ndhg5"] Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.423751 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.426854 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.439287 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f6c96f776-ndhg5"] Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.485526 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-656df98fd5-c4xrz"] Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.506998 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.509839 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"fd360dd3-b439-453e-8543-405c8d1804b5","Type":"ContainerStarted","Data":"8975446ee7dce36ce840eefabf6a82e059964de67ea1af9505e14dc2d0b86255"} Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.514320 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"4abc0e8b-235e-48c1-8066-8958aa05a2a3","Type":"ContainerStarted","Data":"4b34d05f00d9f98947de27c9f6d4dd0bab4bf83fd765f6a3d95ded5812eae937"} Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.514354 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"4abc0e8b-235e-48c1-8066-8958aa05a2a3","Type":"ContainerStarted","Data":"670f437b63c91f552a993d09273f30ed2435492ee7a3f9fd04d7e9d7118f0a34"} Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.521779 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-656df98fd5-c4xrz" event={"ID":"56666739-d2a6-4842-8d9c-27ad101c9253","Type":"ContainerStarted","Data":"3f720d9061a778cbfe2f9d387eeec900fffe8823647c44394ce30cc3e3ad70b2"} Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 
01:36:31.523545 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54d5b659cb-h7mw4"] Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.526819 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.536251 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54d5b659cb-h7mw4"] Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.542109 4755 generic.go:334] "Generic (PLEG): container finished" podID="34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9" containerID="0b7986ed18dc2e729618ad4ce64178a3cdc6195ef52f6981d2a3f708c8f2f30d" exitCode=0 Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.542163 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wd6gn" event={"ID":"34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9","Type":"ContainerDied","Data":"0b7986ed18dc2e729618ad4ce64178a3cdc6195ef52f6981d2a3f708c8f2f30d"} Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.552774 4755 generic.go:334] "Generic (PLEG): container finished" podID="a7388243-204f-4d8a-b842-2529524f0568" containerID="7e470ef9b637bb108c7817113529c16ebddba08426e47e1a06e43335b2112e45" exitCode=0 Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.552818 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-faeb-account-create-update-45nfl" event={"ID":"a7388243-204f-4d8a-b842-2529524f0568","Type":"ContainerDied","Data":"7e470ef9b637bb108c7817113529c16ebddba08426e47e1a06e43335b2112e45"} Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.580550 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kchxx\" (UniqueName: \"kubernetes.io/projected/dae88496-2d38-4e87-bf99-c371e4af8c35-kube-api-access-kchxx\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " 
pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.580723 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dae88496-2d38-4e87-bf99-c371e4af8c35-scripts\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.580751 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dae88496-2d38-4e87-bf99-c371e4af8c35-config-data\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.580772 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-combined-ca-bundle\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.580790 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dae88496-2d38-4e87-bf99-c371e4af8c35-logs\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.580834 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-horizon-secret-key\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " 
pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.580849 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-horizon-tls-certs\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.590256 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.594237 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.417955792 podStartE2EDuration="4.594220947s" podCreationTimestamp="2026-03-17 01:36:27 +0000 UTC" firstStartedPulling="2026-03-17 01:36:29.075400394 +0000 UTC m=+4463.834852677" lastFinishedPulling="2026-03-17 01:36:30.251665549 +0000 UTC m=+4465.011117832" observedRunningTime="2026-03-17 01:36:31.534958541 +0000 UTC m=+4466.294410824" watchObservedRunningTime="2026-03-17 01:36:31.594220947 +0000 UTC m=+4466.353673230" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.684101 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kchxx\" (UniqueName: \"kubernetes.io/projected/dae88496-2d38-4e87-bf99-c371e4af8c35-kube-api-access-kchxx\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.684173 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18055ce1-2e32-41f8-8985-75bda9d75b01-horizon-secret-key\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " 
pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.684239 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18055ce1-2e32-41f8-8985-75bda9d75b01-scripts\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.684255 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18055ce1-2e32-41f8-8985-75bda9d75b01-logs\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.684272 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/18055ce1-2e32-41f8-8985-75bda9d75b01-horizon-tls-certs\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.684297 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v2b4\" (UniqueName: \"kubernetes.io/projected/18055ce1-2e32-41f8-8985-75bda9d75b01-kube-api-access-7v2b4\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.684324 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18055ce1-2e32-41f8-8985-75bda9d75b01-combined-ca-bundle\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " 
pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.684373 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dae88496-2d38-4e87-bf99-c371e4af8c35-scripts\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.684404 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dae88496-2d38-4e87-bf99-c371e4af8c35-config-data\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.684421 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-combined-ca-bundle\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.684451 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dae88496-2d38-4e87-bf99-c371e4af8c35-logs\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.684498 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18055ce1-2e32-41f8-8985-75bda9d75b01-config-data\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.684531 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-horizon-secret-key\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.684547 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-horizon-tls-certs\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.687765 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dae88496-2d38-4e87-bf99-c371e4af8c35-scripts\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.687990 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dae88496-2d38-4e87-bf99-c371e4af8c35-logs\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.692797 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dae88496-2d38-4e87-bf99-c371e4af8c35-config-data\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.786532 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/18055ce1-2e32-41f8-8985-75bda9d75b01-horizon-secret-key\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.786827 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18055ce1-2e32-41f8-8985-75bda9d75b01-scripts\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.786845 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18055ce1-2e32-41f8-8985-75bda9d75b01-logs\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.786859 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/18055ce1-2e32-41f8-8985-75bda9d75b01-horizon-tls-certs\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.786883 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v2b4\" (UniqueName: \"kubernetes.io/projected/18055ce1-2e32-41f8-8985-75bda9d75b01-kube-api-access-7v2b4\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.786912 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18055ce1-2e32-41f8-8985-75bda9d75b01-combined-ca-bundle\") pod 
\"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.787251 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18055ce1-2e32-41f8-8985-75bda9d75b01-logs\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.787696 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18055ce1-2e32-41f8-8985-75bda9d75b01-config-data\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.789078 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18055ce1-2e32-41f8-8985-75bda9d75b01-scripts\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.789225 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18055ce1-2e32-41f8-8985-75bda9d75b01-config-data\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.871433 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-combined-ca-bundle\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 
01:36:31.890418 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-horizon-secret-key\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.892883 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kchxx\" (UniqueName: \"kubernetes.io/projected/dae88496-2d38-4e87-bf99-c371e4af8c35-kube-api-access-kchxx\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.894947 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18055ce1-2e32-41f8-8985-75bda9d75b01-combined-ca-bundle\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.896084 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/18055ce1-2e32-41f8-8985-75bda9d75b01-horizon-tls-certs\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.896086 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-horizon-tls-certs\") pod \"horizon-7f6c96f776-ndhg5\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.896928 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7v2b4\" (UniqueName: \"kubernetes.io/projected/18055ce1-2e32-41f8-8985-75bda9d75b01-kube-api-access-7v2b4\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:31 crc kubenswrapper[4755]: I0317 01:36:31.909421 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18055ce1-2e32-41f8-8985-75bda9d75b01-horizon-secret-key\") pod \"horizon-54d5b659cb-h7mw4\" (UID: \"18055ce1-2e32-41f8-8985-75bda9d75b01\") " pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:32 crc kubenswrapper[4755]: I0317 01:36:32.092829 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:36:32 crc kubenswrapper[4755]: I0317 01:36:32.103987 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:32 crc kubenswrapper[4755]: I0317 01:36:32.155381 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:32 crc kubenswrapper[4755]: I0317 01:36:32.289622 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e60b73b-06f4-41b1-9fd7-5b137253b7d3" path="/var/lib/kubelet/pods/4e60b73b-06f4-41b1-9fd7-5b137253b7d3/volumes" Mar 17 01:36:32 crc kubenswrapper[4755]: I0317 01:36:32.290518 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0" path="/var/lib/kubelet/pods/fbbdf81f-6cd3-4956-ba3d-346e1e2b64d0/volumes" Mar 17 01:36:32 crc kubenswrapper[4755]: I0317 01:36:32.572102 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"fd360dd3-b439-453e-8543-405c8d1804b5","Type":"ContainerStarted","Data":"5b1b4899a12dc2edc883a3ded8e948f83c737fcba7e7791102850ba76d0140c6"} Mar 17 01:36:32 crc kubenswrapper[4755]: I0317 01:36:32.575789 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09223cc8-9989-410a-a05a-84772dc058ad","Type":"ContainerStarted","Data":"afb82b8931b80628cb60fced902322051eb2a395b2e62c2bf39cae7e6a663e1e"} Mar 17 01:36:32 crc kubenswrapper[4755]: I0317 01:36:32.600413 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=4.69706704 podStartE2EDuration="5.600393766s" podCreationTimestamp="2026-03-17 01:36:27 +0000 UTC" firstStartedPulling="2026-03-17 01:36:29.862300788 +0000 UTC m=+4464.621753071" lastFinishedPulling="2026-03-17 01:36:30.765627524 +0000 UTC m=+4465.525079797" observedRunningTime="2026-03-17 01:36:32.593276594 +0000 UTC m=+4467.352728897" watchObservedRunningTime="2026-03-17 01:36:32.600393766 +0000 UTC m=+4467.359846039" Mar 17 01:36:32 crc kubenswrapper[4755]: I0317 01:36:32.664379 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f6c96f776-ndhg5"] Mar 17 01:36:32 crc kubenswrapper[4755]: 
W0317 01:36:32.704764 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddae88496_2d38_4e87_bf99_c371e4af8c35.slice/crio-9b1e77e4948524b68dab667f5bf20f948bb0969db26ab9a367c7697dc35dce44 WatchSource:0}: Error finding container 9b1e77e4948524b68dab667f5bf20f948bb0969db26ab9a367c7697dc35dce44: Status 404 returned error can't find the container with id 9b1e77e4948524b68dab667f5bf20f948bb0969db26ab9a367c7697dc35dce44 Mar 17 01:36:32 crc kubenswrapper[4755]: I0317 01:36:32.799820 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54d5b659cb-h7mw4"] Mar 17 01:36:32 crc kubenswrapper[4755]: W0317 01:36:32.804258 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18055ce1_2e32_41f8_8985_75bda9d75b01.slice/crio-ab43244af810a78e90892b0bdccb2834353c0d7085327b1abd164737cc729054 WatchSource:0}: Error finding container ab43244af810a78e90892b0bdccb2834353c0d7085327b1abd164737cc729054: Status 404 returned error can't find the container with id ab43244af810a78e90892b0bdccb2834353c0d7085327b1abd164737cc729054 Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.010061 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:36:33 crc kubenswrapper[4755]: W0317 01:36:33.085332 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fa45f87_af3d_41e9_9aca_3e2ae1522178.slice/crio-e21e1173580584b7ce6a42688dbe22aff9f4d2e8089a17dfe378b63c828c4d0b WatchSource:0}: Error finding container e21e1173580584b7ce6a42688dbe22aff9f4d2e8089a17dfe378b63c828c4d0b: Status 404 returned error can't find the container with id e21e1173580584b7ce6a42688dbe22aff9f4d2e8089a17dfe378b63c828c4d0b Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.095155 4755 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/manila-faeb-account-create-update-45nfl" Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.240871 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7388243-204f-4d8a-b842-2529524f0568-operator-scripts\") pod \"a7388243-204f-4d8a-b842-2529524f0568\" (UID: \"a7388243-204f-4d8a-b842-2529524f0568\") " Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.240965 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgxcq\" (UniqueName: \"kubernetes.io/projected/a7388243-204f-4d8a-b842-2529524f0568-kube-api-access-dgxcq\") pod \"a7388243-204f-4d8a-b842-2529524f0568\" (UID: \"a7388243-204f-4d8a-b842-2529524f0568\") " Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.242866 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7388243-204f-4d8a-b842-2529524f0568-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7388243-204f-4d8a-b842-2529524f0568" (UID: "a7388243-204f-4d8a-b842-2529524f0568"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.247148 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.250230 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7388243-204f-4d8a-b842-2529524f0568-kube-api-access-dgxcq" (OuterVolumeSpecName: "kube-api-access-dgxcq") pod "a7388243-204f-4d8a-b842-2529524f0568" (UID: "a7388243-204f-4d8a-b842-2529524f0568"). InnerVolumeSpecName "kube-api-access-dgxcq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.261395 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.344390 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgxcq\" (UniqueName: \"kubernetes.io/projected/a7388243-204f-4d8a-b842-2529524f0568-kube-api-access-dgxcq\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.344625 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7388243-204f-4d8a-b842-2529524f0568-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.346160 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wd6gn" Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.445403 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9-operator-scripts\") pod \"34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9\" (UID: \"34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9\") " Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.445550 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2xqk\" (UniqueName: \"kubernetes.io/projected/34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9-kube-api-access-j2xqk\") pod \"34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9\" (UID: \"34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9\") " Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.447029 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9" 
(UID: "34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.472944 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9-kube-api-access-j2xqk" (OuterVolumeSpecName: "kube-api-access-j2xqk") pod "34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9" (UID: "34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9"). InnerVolumeSpecName "kube-api-access-j2xqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.550940 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.550970 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2xqk\" (UniqueName: \"kubernetes.io/projected/34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9-kube-api-access-j2xqk\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.597665 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09223cc8-9989-410a-a05a-84772dc058ad","Type":"ContainerStarted","Data":"6ef91885ce8e3c58998983a0d2f7dee14b2f3dab1925af53838813405b26fca8"} Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.602851 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f6c96f776-ndhg5" event={"ID":"dae88496-2d38-4e87-bf99-c371e4af8c35","Type":"ContainerStarted","Data":"9b1e77e4948524b68dab667f5bf20f948bb0969db26ab9a367c7697dc35dce44"} Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.604343 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54d5b659cb-h7mw4" 
event={"ID":"18055ce1-2e32-41f8-8985-75bda9d75b01","Type":"ContainerStarted","Data":"ab43244af810a78e90892b0bdccb2834353c0d7085327b1abd164737cc729054"} Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.606231 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1fa45f87-af3d-41e9-9aca-3e2ae1522178","Type":"ContainerStarted","Data":"e21e1173580584b7ce6a42688dbe22aff9f4d2e8089a17dfe378b63c828c4d0b"} Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.612916 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wd6gn" Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.612940 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wd6gn" event={"ID":"34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9","Type":"ContainerDied","Data":"767b4343944d9ae51cfa96c1f5ca3e97797e9f8bf16c96eebb8d0d630874611d"} Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.613551 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="767b4343944d9ae51cfa96c1f5ca3e97797e9f8bf16c96eebb8d0d630874611d" Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.624790 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-faeb-account-create-update-45nfl" event={"ID":"a7388243-204f-4d8a-b842-2529524f0568","Type":"ContainerDied","Data":"33209d2a56b47131344927fc5dff5249463e94b71709488fc66e6c88e166ce0b"} Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.624832 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33209d2a56b47131344927fc5dff5249463e94b71709488fc66e6c88e166ce0b" Mar 17 01:36:33 crc kubenswrapper[4755]: I0317 01:36:33.624891 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-faeb-account-create-update-45nfl" Mar 17 01:36:34 crc kubenswrapper[4755]: I0317 01:36:34.640431 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1fa45f87-af3d-41e9-9aca-3e2ae1522178","Type":"ContainerStarted","Data":"93feb718bc2e208adff5beef43647bd2ea7467fbf7df708c8a41691fda5fa6ff"} Mar 17 01:36:34 crc kubenswrapper[4755]: I0317 01:36:34.640966 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1fa45f87-af3d-41e9-9aca-3e2ae1522178","Type":"ContainerStarted","Data":"b74ccbcfe4adc39d55afad1bcb90a8c76c16fefdf742c912be5a554ae56ccfc3"} Mar 17 01:36:34 crc kubenswrapper[4755]: I0317 01:36:34.640883 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1fa45f87-af3d-41e9-9aca-3e2ae1522178" containerName="glance-log" containerID="cri-o://b74ccbcfe4adc39d55afad1bcb90a8c76c16fefdf742c912be5a554ae56ccfc3" gracePeriod=30 Mar 17 01:36:34 crc kubenswrapper[4755]: I0317 01:36:34.640659 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1fa45f87-af3d-41e9-9aca-3e2ae1522178" containerName="glance-httpd" containerID="cri-o://93feb718bc2e208adff5beef43647bd2ea7467fbf7df708c8a41691fda5fa6ff" gracePeriod=30 Mar 17 01:36:34 crc kubenswrapper[4755]: I0317 01:36:34.646636 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="09223cc8-9989-410a-a05a-84772dc058ad" containerName="glance-log" containerID="cri-o://6ef91885ce8e3c58998983a0d2f7dee14b2f3dab1925af53838813405b26fca8" gracePeriod=30 Mar 17 01:36:34 crc kubenswrapper[4755]: I0317 01:36:34.646738 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="09223cc8-9989-410a-a05a-84772dc058ad" containerName="glance-httpd" containerID="cri-o://e9eb5669fe6aa92db155b1d77b106a339c80dd3ba53da75951582b3d554c3b3e" gracePeriod=30 Mar 17 01:36:34 crc kubenswrapper[4755]: I0317 01:36:34.646798 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09223cc8-9989-410a-a05a-84772dc058ad","Type":"ContainerStarted","Data":"e9eb5669fe6aa92db155b1d77b106a339c80dd3ba53da75951582b3d554c3b3e"} Mar 17 01:36:34 crc kubenswrapper[4755]: I0317 01:36:34.668352 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.668331191 podStartE2EDuration="4.668331191s" podCreationTimestamp="2026-03-17 01:36:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:36:34.66568208 +0000 UTC m=+4469.425134383" watchObservedRunningTime="2026-03-17 01:36:34.668331191 +0000 UTC m=+4469.427783464" Mar 17 01:36:34 crc kubenswrapper[4755]: I0317 01:36:34.704321 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.7043002860000005 podStartE2EDuration="4.704300286s" podCreationTimestamp="2026-03-17 01:36:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:36:34.690561073 +0000 UTC m=+4469.450013356" watchObservedRunningTime="2026-03-17 01:36:34.704300286 +0000 UTC m=+4469.463752569" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.673330 4755 generic.go:334] "Generic (PLEG): container finished" podID="1fa45f87-af3d-41e9-9aca-3e2ae1522178" containerID="93feb718bc2e208adff5beef43647bd2ea7467fbf7df708c8a41691fda5fa6ff" exitCode=143 Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.673741 4755 generic.go:334] "Generic (PLEG): 
container finished" podID="1fa45f87-af3d-41e9-9aca-3e2ae1522178" containerID="b74ccbcfe4adc39d55afad1bcb90a8c76c16fefdf742c912be5a554ae56ccfc3" exitCode=143 Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.673481 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1fa45f87-af3d-41e9-9aca-3e2ae1522178","Type":"ContainerDied","Data":"93feb718bc2e208adff5beef43647bd2ea7467fbf7df708c8a41691fda5fa6ff"} Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.673904 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1fa45f87-af3d-41e9-9aca-3e2ae1522178","Type":"ContainerDied","Data":"b74ccbcfe4adc39d55afad1bcb90a8c76c16fefdf742c912be5a554ae56ccfc3"} Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.673931 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1fa45f87-af3d-41e9-9aca-3e2ae1522178","Type":"ContainerDied","Data":"e21e1173580584b7ce6a42688dbe22aff9f4d2e8089a17dfe378b63c828c4d0b"} Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.673949 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e21e1173580584b7ce6a42688dbe22aff9f4d2e8089a17dfe378b63c828c4d0b" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.686310 4755 generic.go:334] "Generic (PLEG): container finished" podID="09223cc8-9989-410a-a05a-84772dc058ad" containerID="e9eb5669fe6aa92db155b1d77b106a339c80dd3ba53da75951582b3d554c3b3e" exitCode=0 Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.686347 4755 generic.go:334] "Generic (PLEG): container finished" podID="09223cc8-9989-410a-a05a-84772dc058ad" containerID="6ef91885ce8e3c58998983a0d2f7dee14b2f3dab1925af53838813405b26fca8" exitCode=143 Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.686390 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"09223cc8-9989-410a-a05a-84772dc058ad","Type":"ContainerDied","Data":"e9eb5669fe6aa92db155b1d77b106a339c80dd3ba53da75951582b3d554c3b3e"} Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.686417 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09223cc8-9989-410a-a05a-84772dc058ad","Type":"ContainerDied","Data":"6ef91885ce8e3c58998983a0d2f7dee14b2f3dab1925af53838813405b26fca8"} Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.687233 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.734346 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-scripts\") pod \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.735055 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-config-data\") pod \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.735111 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.735193 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-combined-ca-bundle\") pod \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\" (UID: 
\"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.735238 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1fa45f87-af3d-41e9-9aca-3e2ae1522178-httpd-run\") pod \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.735279 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fa45f87-af3d-41e9-9aca-3e2ae1522178-logs\") pod \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.735296 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-internal-tls-certs\") pod \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.735318 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1fa45f87-af3d-41e9-9aca-3e2ae1522178-ceph\") pod \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.735336 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dwxp\" (UniqueName: \"kubernetes.io/projected/1fa45f87-af3d-41e9-9aca-3e2ae1522178-kube-api-access-6dwxp\") pod \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\" (UID: \"1fa45f87-af3d-41e9-9aca-3e2ae1522178\") " Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.735759 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa45f87-af3d-41e9-9aca-3e2ae1522178-httpd-run" 
(OuterVolumeSpecName: "httpd-run") pod "1fa45f87-af3d-41e9-9aca-3e2ae1522178" (UID: "1fa45f87-af3d-41e9-9aca-3e2ae1522178"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.737259 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa45f87-af3d-41e9-9aca-3e2ae1522178-logs" (OuterVolumeSpecName: "logs") pod "1fa45f87-af3d-41e9-9aca-3e2ae1522178" (UID: "1fa45f87-af3d-41e9-9aca-3e2ae1522178"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.785793 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "1fa45f87-af3d-41e9-9aca-3e2ae1522178" (UID: "1fa45f87-af3d-41e9-9aca-3e2ae1522178"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.786415 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-scripts" (OuterVolumeSpecName: "scripts") pod "1fa45f87-af3d-41e9-9aca-3e2ae1522178" (UID: "1fa45f87-af3d-41e9-9aca-3e2ae1522178"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.786494 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa45f87-af3d-41e9-9aca-3e2ae1522178-ceph" (OuterVolumeSpecName: "ceph") pod "1fa45f87-af3d-41e9-9aca-3e2ae1522178" (UID: "1fa45f87-af3d-41e9-9aca-3e2ae1522178"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.786593 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa45f87-af3d-41e9-9aca-3e2ae1522178-kube-api-access-6dwxp" (OuterVolumeSpecName: "kube-api-access-6dwxp") pod "1fa45f87-af3d-41e9-9aca-3e2ae1522178" (UID: "1fa45f87-af3d-41e9-9aca-3e2ae1522178"). InnerVolumeSpecName "kube-api-access-6dwxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.838809 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fa45f87-af3d-41e9-9aca-3e2ae1522178-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.838835 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1fa45f87-af3d-41e9-9aca-3e2ae1522178-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.838844 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dwxp\" (UniqueName: \"kubernetes.io/projected/1fa45f87-af3d-41e9-9aca-3e2ae1522178-kube-api-access-6dwxp\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.838853 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.838879 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.838890 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/1fa45f87-af3d-41e9-9aca-3e2ae1522178-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.854622 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fa45f87-af3d-41e9-9aca-3e2ae1522178" (UID: "1fa45f87-af3d-41e9-9aca-3e2ae1522178"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.880455 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.880903 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1fa45f87-af3d-41e9-9aca-3e2ae1522178" (UID: "1fa45f87-af3d-41e9-9aca-3e2ae1522178"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.881708 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-config-data" (OuterVolumeSpecName: "config-data") pod "1fa45f87-af3d-41e9-9aca-3e2ae1522178" (UID: "1fa45f87-af3d-41e9-9aca-3e2ae1522178"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.942196 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.942235 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.942247 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa45f87-af3d-41e9-9aca-3e2ae1522178-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:35 crc kubenswrapper[4755]: I0317 01:36:35.942256 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.702304 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.757195 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.781515 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.792497 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:36:36 crc kubenswrapper[4755]: E0317 01:36:36.793002 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7388243-204f-4d8a-b842-2529524f0568" containerName="mariadb-account-create-update" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.793022 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7388243-204f-4d8a-b842-2529524f0568" containerName="mariadb-account-create-update" Mar 17 01:36:36 crc kubenswrapper[4755]: E0317 01:36:36.793041 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9" containerName="mariadb-database-create" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.793048 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9" containerName="mariadb-database-create" Mar 17 01:36:36 crc kubenswrapper[4755]: E0317 01:36:36.793076 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa45f87-af3d-41e9-9aca-3e2ae1522178" containerName="glance-httpd" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.793082 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa45f87-af3d-41e9-9aca-3e2ae1522178" containerName="glance-httpd" Mar 17 01:36:36 crc kubenswrapper[4755]: E0317 01:36:36.793094 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa45f87-af3d-41e9-9aca-3e2ae1522178" containerName="glance-log" Mar 17 
01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.793100 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa45f87-af3d-41e9-9aca-3e2ae1522178" containerName="glance-log" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.793314 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa45f87-af3d-41e9-9aca-3e2ae1522178" containerName="glance-httpd" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.793328 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9" containerName="mariadb-database-create" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.793339 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7388243-204f-4d8a-b842-2529524f0568" containerName="mariadb-account-create-update" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.793345 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa45f87-af3d-41e9-9aca-3e2ae1522178" containerName="glance-log" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.794583 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.797930 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.798885 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.806654 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.967566 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3526ee99-7b67-44b5-8cc1-0d8731e68758-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.967644 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kchm\" (UniqueName: \"kubernetes.io/projected/3526ee99-7b67-44b5-8cc1-0d8731e68758-kube-api-access-5kchm\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.967674 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3526ee99-7b67-44b5-8cc1-0d8731e68758-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.967695 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/3526ee99-7b67-44b5-8cc1-0d8731e68758-logs\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.967738 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.967821 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3526ee99-7b67-44b5-8cc1-0d8731e68758-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.967839 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3526ee99-7b67-44b5-8cc1-0d8731e68758-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.967871 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3526ee99-7b67-44b5-8cc1-0d8731e68758-ceph\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:36 crc kubenswrapper[4755]: I0317 01:36:36.967887 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/3526ee99-7b67-44b5-8cc1-0d8731e68758-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.069673 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3526ee99-7b67-44b5-8cc1-0d8731e68758-logs\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.069744 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.069836 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3526ee99-7b67-44b5-8cc1-0d8731e68758-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.069857 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3526ee99-7b67-44b5-8cc1-0d8731e68758-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.069932 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3526ee99-7b67-44b5-8cc1-0d8731e68758-ceph\") pod 
\"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.069950 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3526ee99-7b67-44b5-8cc1-0d8731e68758-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.069996 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3526ee99-7b67-44b5-8cc1-0d8731e68758-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.070037 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kchm\" (UniqueName: \"kubernetes.io/projected/3526ee99-7b67-44b5-8cc1-0d8731e68758-kube-api-access-5kchm\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.070063 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3526ee99-7b67-44b5-8cc1-0d8731e68758-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.070316 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3526ee99-7b67-44b5-8cc1-0d8731e68758-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.070490 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.071009 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3526ee99-7b67-44b5-8cc1-0d8731e68758-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.077085 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3526ee99-7b67-44b5-8cc1-0d8731e68758-ceph\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.077416 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3526ee99-7b67-44b5-8cc1-0d8731e68758-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.077581 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3526ee99-7b67-44b5-8cc1-0d8731e68758-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 
01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.084926 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kchm\" (UniqueName: \"kubernetes.io/projected/3526ee99-7b67-44b5-8cc1-0d8731e68758-kube-api-access-5kchm\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.085742 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3526ee99-7b67-44b5-8cc1-0d8731e68758-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.092005 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3526ee99-7b67-44b5-8cc1-0d8731e68758-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.132710 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"3526ee99-7b67-44b5-8cc1-0d8731e68758\") " pod="openstack/glance-default-internal-api-0" Mar 17 01:36:37 crc kubenswrapper[4755]: I0317 01:36:37.420544 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:38 crc kubenswrapper[4755]: I0317 01:36:38.261300 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa45f87-af3d-41e9-9aca-3e2ae1522178" path="/var/lib/kubelet/pods/1fa45f87-af3d-41e9-9aca-3e2ae1522178/volumes" Mar 17 01:36:38 crc kubenswrapper[4755]: I0317 01:36:38.457915 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 17 01:36:38 crc kubenswrapper[4755]: I0317 01:36:38.490054 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.024830 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-dtxkj"] Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.026715 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-dtxkj" Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.029238 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.029334 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-twjcr" Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.044940 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-dtxkj"] Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.118668 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sx69\" (UniqueName: \"kubernetes.io/projected/7bb170d1-6e83-49fc-925f-6020490e5da7-kube-api-access-4sx69\") pod \"manila-db-sync-dtxkj\" (UID: \"7bb170d1-6e83-49fc-925f-6020490e5da7\") " pod="openstack/manila-db-sync-dtxkj" Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.118743 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-config-data\") pod \"manila-db-sync-dtxkj\" (UID: \"7bb170d1-6e83-49fc-925f-6020490e5da7\") " pod="openstack/manila-db-sync-dtxkj" Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.118816 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-job-config-data\") pod \"manila-db-sync-dtxkj\" (UID: \"7bb170d1-6e83-49fc-925f-6020490e5da7\") " pod="openstack/manila-db-sync-dtxkj" Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.118843 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-combined-ca-bundle\") pod \"manila-db-sync-dtxkj\" (UID: \"7bb170d1-6e83-49fc-925f-6020490e5da7\") " pod="openstack/manila-db-sync-dtxkj" Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.221223 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sx69\" (UniqueName: \"kubernetes.io/projected/7bb170d1-6e83-49fc-925f-6020490e5da7-kube-api-access-4sx69\") pod \"manila-db-sync-dtxkj\" (UID: \"7bb170d1-6e83-49fc-925f-6020490e5da7\") " pod="openstack/manila-db-sync-dtxkj" Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.221378 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-config-data\") pod \"manila-db-sync-dtxkj\" (UID: \"7bb170d1-6e83-49fc-925f-6020490e5da7\") " pod="openstack/manila-db-sync-dtxkj" Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.221527 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" 
(UniqueName: \"kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-job-config-data\") pod \"manila-db-sync-dtxkj\" (UID: \"7bb170d1-6e83-49fc-925f-6020490e5da7\") " pod="openstack/manila-db-sync-dtxkj" Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.221584 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-combined-ca-bundle\") pod \"manila-db-sync-dtxkj\" (UID: \"7bb170d1-6e83-49fc-925f-6020490e5da7\") " pod="openstack/manila-db-sync-dtxkj" Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.226408 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-job-config-data\") pod \"manila-db-sync-dtxkj\" (UID: \"7bb170d1-6e83-49fc-925f-6020490e5da7\") " pod="openstack/manila-db-sync-dtxkj" Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.227904 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-config-data\") pod \"manila-db-sync-dtxkj\" (UID: \"7bb170d1-6e83-49fc-925f-6020490e5da7\") " pod="openstack/manila-db-sync-dtxkj" Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.228687 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-combined-ca-bundle\") pod \"manila-db-sync-dtxkj\" (UID: \"7bb170d1-6e83-49fc-925f-6020490e5da7\") " pod="openstack/manila-db-sync-dtxkj" Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.239606 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sx69\" (UniqueName: \"kubernetes.io/projected/7bb170d1-6e83-49fc-925f-6020490e5da7-kube-api-access-4sx69\") pod \"manila-db-sync-dtxkj\" (UID: 
\"7bb170d1-6e83-49fc-925f-6020490e5da7\") " pod="openstack/manila-db-sync-dtxkj" Mar 17 01:36:39 crc kubenswrapper[4755]: I0317 01:36:39.348290 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-dtxkj" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.540348 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.689282 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/09223cc8-9989-410a-a05a-84772dc058ad-ceph\") pod \"09223cc8-9989-410a-a05a-84772dc058ad\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.689476 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-public-tls-certs\") pod \"09223cc8-9989-410a-a05a-84772dc058ad\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.689503 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-config-data\") pod \"09223cc8-9989-410a-a05a-84772dc058ad\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.689542 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09223cc8-9989-410a-a05a-84772dc058ad-logs\") pod \"09223cc8-9989-410a-a05a-84772dc058ad\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.689624 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/09223cc8-9989-410a-a05a-84772dc058ad-httpd-run\") pod \"09223cc8-9989-410a-a05a-84772dc058ad\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.689654 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"09223cc8-9989-410a-a05a-84772dc058ad\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.689757 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-scripts\") pod \"09223cc8-9989-410a-a05a-84772dc058ad\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.689782 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrl92\" (UniqueName: \"kubernetes.io/projected/09223cc8-9989-410a-a05a-84772dc058ad-kube-api-access-mrl92\") pod \"09223cc8-9989-410a-a05a-84772dc058ad\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.689814 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-combined-ca-bundle\") pod \"09223cc8-9989-410a-a05a-84772dc058ad\" (UID: \"09223cc8-9989-410a-a05a-84772dc058ad\") " Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.690899 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09223cc8-9989-410a-a05a-84772dc058ad-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "09223cc8-9989-410a-a05a-84772dc058ad" (UID: "09223cc8-9989-410a-a05a-84772dc058ad"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.690946 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09223cc8-9989-410a-a05a-84772dc058ad-logs" (OuterVolumeSpecName: "logs") pod "09223cc8-9989-410a-a05a-84772dc058ad" (UID: "09223cc8-9989-410a-a05a-84772dc058ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.697282 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-scripts" (OuterVolumeSpecName: "scripts") pod "09223cc8-9989-410a-a05a-84772dc058ad" (UID: "09223cc8-9989-410a-a05a-84772dc058ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.697313 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09223cc8-9989-410a-a05a-84772dc058ad-kube-api-access-mrl92" (OuterVolumeSpecName: "kube-api-access-mrl92") pod "09223cc8-9989-410a-a05a-84772dc058ad" (UID: "09223cc8-9989-410a-a05a-84772dc058ad"). InnerVolumeSpecName "kube-api-access-mrl92". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.719186 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "09223cc8-9989-410a-a05a-84772dc058ad" (UID: "09223cc8-9989-410a-a05a-84772dc058ad"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.725941 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09223cc8-9989-410a-a05a-84772dc058ad-ceph" (OuterVolumeSpecName: "ceph") pod "09223cc8-9989-410a-a05a-84772dc058ad" (UID: "09223cc8-9989-410a-a05a-84772dc058ad"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.735503 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09223cc8-9989-410a-a05a-84772dc058ad" (UID: "09223cc8-9989-410a-a05a-84772dc058ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.758022 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09223cc8-9989-410a-a05a-84772dc058ad","Type":"ContainerDied","Data":"afb82b8931b80628cb60fced902322051eb2a395b2e62c2bf39cae7e6a663e1e"} Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.758069 4755 scope.go:117] "RemoveContainer" containerID="e9eb5669fe6aa92db155b1d77b106a339c80dd3ba53da75951582b3d554c3b3e" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.758198 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.788875 4755 scope.go:117] "RemoveContainer" containerID="6ef91885ce8e3c58998983a0d2f7dee14b2f3dab1925af53838813405b26fca8" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.792969 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.792990 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrl92\" (UniqueName: \"kubernetes.io/projected/09223cc8-9989-410a-a05a-84772dc058ad-kube-api-access-mrl92\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.793005 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.793017 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/09223cc8-9989-410a-a05a-84772dc058ad-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.793029 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09223cc8-9989-410a-a05a-84772dc058ad-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.793042 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09223cc8-9989-410a-a05a-84772dc058ad-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.793074 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.863658 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.895062 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.895425 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "09223cc8-9989-410a-a05a-84772dc058ad" (UID: "09223cc8-9989-410a-a05a-84772dc058ad"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.906421 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-config-data" (OuterVolumeSpecName: "config-data") pod "09223cc8-9989-410a-a05a-84772dc058ad" (UID: "09223cc8-9989-410a-a05a-84772dc058ad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.963210 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-dtxkj"] Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.997504 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:41 crc kubenswrapper[4755]: I0317 01:36:41.997541 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09223cc8-9989-410a-a05a-84772dc058ad-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.091545 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.122181 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.144722 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.159733 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:36:42 crc kubenswrapper[4755]: E0317 01:36:42.160515 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09223cc8-9989-410a-a05a-84772dc058ad" containerName="glance-httpd" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.160547 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="09223cc8-9989-410a-a05a-84772dc058ad" containerName="glance-httpd" Mar 17 01:36:42 crc kubenswrapper[4755]: E0317 01:36:42.160600 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09223cc8-9989-410a-a05a-84772dc058ad" containerName="glance-log" Mar 17 
01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.160612 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="09223cc8-9989-410a-a05a-84772dc058ad" containerName="glance-log" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.160985 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="09223cc8-9989-410a-a05a-84772dc058ad" containerName="glance-log" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.161019 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="09223cc8-9989-410a-a05a-84772dc058ad" containerName="glance-httpd" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.162754 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.167074 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.167280 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.170271 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.294126 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09223cc8-9989-410a-a05a-84772dc058ad" path="/var/lib/kubelet/pods/09223cc8-9989-410a-a05a-84772dc058ad/volumes" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.301935 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3953ffe-b583-483a-b3e4-8cb6393b09f7-scripts\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.301990 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3953ffe-b583-483a-b3e4-8cb6393b09f7-logs\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.302023 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d3953ffe-b583-483a-b3e4-8cb6393b09f7-ceph\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.302083 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3953ffe-b583-483a-b3e4-8cb6393b09f7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.302175 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.302190 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3953ffe-b583-483a-b3e4-8cb6393b09f7-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.302237 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3953ffe-b583-483a-b3e4-8cb6393b09f7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.302274 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3953ffe-b583-483a-b3e4-8cb6393b09f7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.302294 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lksx\" (UniqueName: \"kubernetes.io/projected/d3953ffe-b583-483a-b3e4-8cb6393b09f7-kube-api-access-6lksx\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.403850 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3953ffe-b583-483a-b3e4-8cb6393b09f7-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.404144 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3953ffe-b583-483a-b3e4-8cb6393b09f7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 
01:36:42.404196 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3953ffe-b583-483a-b3e4-8cb6393b09f7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.404213 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lksx\" (UniqueName: \"kubernetes.io/projected/d3953ffe-b583-483a-b3e4-8cb6393b09f7-kube-api-access-6lksx\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.404284 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3953ffe-b583-483a-b3e4-8cb6393b09f7-scripts\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.404306 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3953ffe-b583-483a-b3e4-8cb6393b09f7-logs\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.404337 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d3953ffe-b583-483a-b3e4-8cb6393b09f7-ceph\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.404380 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3953ffe-b583-483a-b3e4-8cb6393b09f7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.404526 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.404888 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.405726 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3953ffe-b583-483a-b3e4-8cb6393b09f7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.405976 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3953ffe-b583-483a-b3e4-8cb6393b09f7-logs\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.415734 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d3953ffe-b583-483a-b3e4-8cb6393b09f7-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.417157 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3953ffe-b583-483a-b3e4-8cb6393b09f7-scripts\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.417971 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d3953ffe-b583-483a-b3e4-8cb6393b09f7-ceph\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.418932 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3953ffe-b583-483a-b3e4-8cb6393b09f7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.424408 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lksx\" (UniqueName: \"kubernetes.io/projected/d3953ffe-b583-483a-b3e4-8cb6393b09f7-kube-api-access-6lksx\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.453570 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3953ffe-b583-483a-b3e4-8cb6393b09f7-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.455976 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d3953ffe-b583-483a-b3e4-8cb6393b09f7\") " pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.489788 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.774606 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3526ee99-7b67-44b5-8cc1-0d8731e68758","Type":"ContainerStarted","Data":"c81c7c20eb8932149f1c7385d6ac77e1ee8135f98a9a5df4bbe7b3e858080388"} Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.786017 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-656df98fd5-c4xrz" event={"ID":"56666739-d2a6-4842-8d9c-27ad101c9253","Type":"ContainerStarted","Data":"8e2429255ffe8d98eaa405e5ff5c7a007e00d8a35c002653e45b7da036a09622"} Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.786084 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-656df98fd5-c4xrz" event={"ID":"56666739-d2a6-4842-8d9c-27ad101c9253","Type":"ContainerStarted","Data":"f670af933b31158e9096ecca49bd596686ce2a1c9788be58ac319e50c1603f02"} Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.786321 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-656df98fd5-c4xrz" podUID="56666739-d2a6-4842-8d9c-27ad101c9253" containerName="horizon-log" containerID="cri-o://f670af933b31158e9096ecca49bd596686ce2a1c9788be58ac319e50c1603f02" gracePeriod=30 Mar 17 01:36:42 crc 
kubenswrapper[4755]: I0317 01:36:42.786323 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-656df98fd5-c4xrz" podUID="56666739-d2a6-4842-8d9c-27ad101c9253" containerName="horizon" containerID="cri-o://8e2429255ffe8d98eaa405e5ff5c7a007e00d8a35c002653e45b7da036a09622" gracePeriod=30 Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.789228 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54d5b659cb-h7mw4" event={"ID":"18055ce1-2e32-41f8-8985-75bda9d75b01","Type":"ContainerStarted","Data":"5bd4f816cf2d68ed517071468f6b8524d176da3e9ca39c87cda6f5dddea08b5e"} Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.789287 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54d5b659cb-h7mw4" event={"ID":"18055ce1-2e32-41f8-8985-75bda9d75b01","Type":"ContainerStarted","Data":"f75100da021194048aa2877e7453f5f9dd11ee8e8c4a4f8c0f88160821607ff8"} Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.800791 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f6c96f776-ndhg5" event={"ID":"dae88496-2d38-4e87-bf99-c371e4af8c35","Type":"ContainerStarted","Data":"f7567997308408dbb2139de2595e00ba6ae6735c9cda2535a531582128a9487a"} Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.800834 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f6c96f776-ndhg5" event={"ID":"dae88496-2d38-4e87-bf99-c371e4af8c35","Type":"ContainerStarted","Data":"4a701b4e9a067233672effaac585e80e05c9b1c0d3dd8d44dda4a23be5c25fa0"} Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.804099 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-dtxkj" event={"ID":"7bb170d1-6e83-49fc-925f-6020490e5da7","Type":"ContainerStarted","Data":"922b7c25a197a46c9645be12947e40555848be1b1deae41f89e4c0bc4e8f2bf0"} Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.806657 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-587c6f496f-f97jz" event={"ID":"3a9d2fca-c4bb-4822-b6cc-78c30de14b99","Type":"ContainerStarted","Data":"8c565b99dfa52270938fcbab36b28ed8c634145d0d1a43d1250f3059aaf578f3"} Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.806704 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587c6f496f-f97jz" event={"ID":"3a9d2fca-c4bb-4822-b6cc-78c30de14b99","Type":"ContainerStarted","Data":"f8b84d883c2a5eea4285f7ebf29e74441a694c34589be269db2058965460d972"} Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.806825 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-587c6f496f-f97jz" podUID="3a9d2fca-c4bb-4822-b6cc-78c30de14b99" containerName="horizon-log" containerID="cri-o://f8b84d883c2a5eea4285f7ebf29e74441a694c34589be269db2058965460d972" gracePeriod=30 Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.806989 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-587c6f496f-f97jz" podUID="3a9d2fca-c4bb-4822-b6cc-78c30de14b99" containerName="horizon" containerID="cri-o://8c565b99dfa52270938fcbab36b28ed8c634145d0d1a43d1250f3059aaf578f3" gracePeriod=30 Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.822085 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-656df98fd5-c4xrz" podStartSLOduration=4.1862560349999995 podStartE2EDuration="14.822066268s" podCreationTimestamp="2026-03-17 01:36:28 +0000 UTC" firstStartedPulling="2026-03-17 01:36:30.872859707 +0000 UTC m=+4465.632311980" lastFinishedPulling="2026-03-17 01:36:41.50866991 +0000 UTC m=+4476.268122213" observedRunningTime="2026-03-17 01:36:42.815053788 +0000 UTC m=+4477.574506071" watchObservedRunningTime="2026-03-17 01:36:42.822066268 +0000 UTC m=+4477.581518551" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.867747 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/horizon-587c6f496f-f97jz" podStartSLOduration=3.234881936 podStartE2EDuration="14.867721593s" podCreationTimestamp="2026-03-17 01:36:28 +0000 UTC" firstStartedPulling="2026-03-17 01:36:29.875841024 +0000 UTC m=+4464.635293307" lastFinishedPulling="2026-03-17 01:36:41.508680681 +0000 UTC m=+4476.268132964" observedRunningTime="2026-03-17 01:36:42.840839476 +0000 UTC m=+4477.600291759" watchObservedRunningTime="2026-03-17 01:36:42.867721593 +0000 UTC m=+4477.627173886" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.888911 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-54d5b659cb-h7mw4" podStartSLOduration=3.190660349 podStartE2EDuration="11.888889147s" podCreationTimestamp="2026-03-17 01:36:31 +0000 UTC" firstStartedPulling="2026-03-17 01:36:32.810405912 +0000 UTC m=+4467.569858195" lastFinishedPulling="2026-03-17 01:36:41.50863471 +0000 UTC m=+4476.268086993" observedRunningTime="2026-03-17 01:36:42.861992409 +0000 UTC m=+4477.621444702" watchObservedRunningTime="2026-03-17 01:36:42.888889147 +0000 UTC m=+4477.648341430" Mar 17 01:36:42 crc kubenswrapper[4755]: I0317 01:36:42.901804 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7f6c96f776-ndhg5" podStartSLOduration=3.075610534 podStartE2EDuration="11.901784646s" podCreationTimestamp="2026-03-17 01:36:31 +0000 UTC" firstStartedPulling="2026-03-17 01:36:32.711141004 +0000 UTC m=+4467.470593277" lastFinishedPulling="2026-03-17 01:36:41.537315106 +0000 UTC m=+4476.296767389" observedRunningTime="2026-03-17 01:36:42.88162008 +0000 UTC m=+4477.641072363" watchObservedRunningTime="2026-03-17 01:36:42.901784646 +0000 UTC m=+4477.661236919" Mar 17 01:36:43 crc kubenswrapper[4755]: I0317 01:36:43.050764 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 01:36:43 crc kubenswrapper[4755]: W0317 01:36:43.061552 4755 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3953ffe_b583_483a_b3e4_8cb6393b09f7.slice/crio-8c56d60b5c57dc04f465a97ce721af6ce1005542d6c19cf3837d9fdf61cb7c31 WatchSource:0}: Error finding container 8c56d60b5c57dc04f465a97ce721af6ce1005542d6c19cf3837d9fdf61cb7c31: Status 404 returned error can't find the container with id 8c56d60b5c57dc04f465a97ce721af6ce1005542d6c19cf3837d9fdf61cb7c31 Mar 17 01:36:43 crc kubenswrapper[4755]: I0317 01:36:43.821865 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3526ee99-7b67-44b5-8cc1-0d8731e68758","Type":"ContainerStarted","Data":"89848f1da8ba6230384446305fbeadc90c9b03ca3e42d263263ccefc53599167"} Mar 17 01:36:43 crc kubenswrapper[4755]: I0317 01:36:43.827451 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3953ffe-b583-483a-b3e4-8cb6393b09f7","Type":"ContainerStarted","Data":"8c56d60b5c57dc04f465a97ce721af6ce1005542d6c19cf3837d9fdf61cb7c31"} Mar 17 01:36:44 crc kubenswrapper[4755]: I0317 01:36:44.840612 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3526ee99-7b67-44b5-8cc1-0d8731e68758","Type":"ContainerStarted","Data":"6b9b91331f2f67899269579231e17acf5658163188ca87ec3ea11dff28138041"} Mar 17 01:36:44 crc kubenswrapper[4755]: I0317 01:36:44.842829 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3953ffe-b583-483a-b3e4-8cb6393b09f7","Type":"ContainerStarted","Data":"bb3dadd200b16e4c8ff410ea113d04b51062e7fe233d81249156a43865692f29"} Mar 17 01:36:44 crc kubenswrapper[4755]: I0317 01:36:44.842888 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"d3953ffe-b583-483a-b3e4-8cb6393b09f7","Type":"ContainerStarted","Data":"f5e99fb17a79ff0d97a581cd756f4088c0730a6602bd755af52c9fe25122f494"} Mar 17 01:36:44 crc kubenswrapper[4755]: I0317 01:36:44.870780 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.870762462 podStartE2EDuration="8.870762462s" podCreationTimestamp="2026-03-17 01:36:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:36:44.866665401 +0000 UTC m=+4479.626117684" watchObservedRunningTime="2026-03-17 01:36:44.870762462 +0000 UTC m=+4479.630214745" Mar 17 01:36:44 crc kubenswrapper[4755]: I0317 01:36:44.894339 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.89432157 podStartE2EDuration="2.89432157s" podCreationTimestamp="2026-03-17 01:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:36:44.890269981 +0000 UTC m=+4479.649722264" watchObservedRunningTime="2026-03-17 01:36:44.89432157 +0000 UTC m=+4479.653773843" Mar 17 01:36:47 crc kubenswrapper[4755]: I0317 01:36:47.421037 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:47 crc kubenswrapper[4755]: I0317 01:36:47.421693 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:47 crc kubenswrapper[4755]: I0317 01:36:47.455340 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:47 crc kubenswrapper[4755]: I0317 01:36:47.474658 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Mar 17 01:36:47 crc kubenswrapper[4755]: I0317 01:36:47.879425 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:47 crc kubenswrapper[4755]: I0317 01:36:47.879856 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:48 crc kubenswrapper[4755]: I0317 01:36:48.898537 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-dtxkj" event={"ID":"7bb170d1-6e83-49fc-925f-6020490e5da7","Type":"ContainerStarted","Data":"e64456aa893a1a9200cbc8bb4329e0f5b0a256b466916d065f8c9637b7055a8c"} Mar 17 01:36:48 crc kubenswrapper[4755]: I0317 01:36:48.928726 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-dtxkj" podStartSLOduration=4.312382742 podStartE2EDuration="9.928706963s" podCreationTimestamp="2026-03-17 01:36:39 +0000 UTC" firstStartedPulling="2026-03-17 01:36:41.996190209 +0000 UTC m=+4476.755642492" lastFinishedPulling="2026-03-17 01:36:47.61251443 +0000 UTC m=+4482.371966713" observedRunningTime="2026-03-17 01:36:48.916901374 +0000 UTC m=+4483.676353657" watchObservedRunningTime="2026-03-17 01:36:48.928706963 +0000 UTC m=+4483.688159246" Mar 17 01:36:49 crc kubenswrapper[4755]: I0317 01:36:49.172077 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:36:50 crc kubenswrapper[4755]: I0317 01:36:50.110084 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:36:51 crc kubenswrapper[4755]: I0317 01:36:51.488576 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 17 01:36:51 crc kubenswrapper[4755]: I0317 01:36:51.519890 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Mar 17 01:36:52 crc kubenswrapper[4755]: I0317 01:36:52.104921 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:52 crc kubenswrapper[4755]: I0317 01:36:52.105329 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:36:52 crc kubenswrapper[4755]: I0317 01:36:52.106516 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7f6c96f776-ndhg5" podUID="dae88496-2d38-4e87-bf99-c371e4af8c35" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.103:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.103:8443: connect: connection refused" Mar 17 01:36:52 crc kubenswrapper[4755]: I0317 01:36:52.155979 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:52 crc kubenswrapper[4755]: I0317 01:36:52.156622 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:36:52 crc kubenswrapper[4755]: I0317 01:36:52.157737 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-54d5b659cb-h7mw4" podUID="18055ce1-2e32-41f8-8985-75bda9d75b01" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.104:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.104:8443: connect: connection refused" Mar 17 01:36:52 crc kubenswrapper[4755]: I0317 01:36:52.490651 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 17 01:36:52 crc kubenswrapper[4755]: I0317 01:36:52.490694 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 17 01:36:52 crc kubenswrapper[4755]: I0317 01:36:52.535548 4755 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 17 01:36:52 crc kubenswrapper[4755]: I0317 01:36:52.541816 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 17 01:36:52 crc kubenswrapper[4755]: I0317 01:36:52.969181 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 17 01:36:52 crc kubenswrapper[4755]: I0317 01:36:52.969602 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 17 01:36:55 crc kubenswrapper[4755]: I0317 01:36:55.077097 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 17 01:36:55 crc kubenswrapper[4755]: I0317 01:36:55.077499 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 01:36:55 crc kubenswrapper[4755]: I0317 01:36:55.209785 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 17 01:36:58 crc kubenswrapper[4755]: I0317 01:36:58.665530 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:36:58 crc kubenswrapper[4755]: I0317 01:36:58.666166 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:36:59 crc kubenswrapper[4755]: I0317 01:36:59.040825 4755 generic.go:334] "Generic (PLEG): container 
finished" podID="7bb170d1-6e83-49fc-925f-6020490e5da7" containerID="e64456aa893a1a9200cbc8bb4329e0f5b0a256b466916d065f8c9637b7055a8c" exitCode=0 Mar 17 01:36:59 crc kubenswrapper[4755]: I0317 01:36:59.040865 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-dtxkj" event={"ID":"7bb170d1-6e83-49fc-925f-6020490e5da7","Type":"ContainerDied","Data":"e64456aa893a1a9200cbc8bb4329e0f5b0a256b466916d065f8c9637b7055a8c"} Mar 17 01:37:00 crc kubenswrapper[4755]: I0317 01:37:00.698747 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-dtxkj" Mar 17 01:37:00 crc kubenswrapper[4755]: I0317 01:37:00.761081 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-config-data\") pod \"7bb170d1-6e83-49fc-925f-6020490e5da7\" (UID: \"7bb170d1-6e83-49fc-925f-6020490e5da7\") " Mar 17 01:37:00 crc kubenswrapper[4755]: I0317 01:37:00.761274 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-job-config-data\") pod \"7bb170d1-6e83-49fc-925f-6020490e5da7\" (UID: \"7bb170d1-6e83-49fc-925f-6020490e5da7\") " Mar 17 01:37:00 crc kubenswrapper[4755]: I0317 01:37:00.761393 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-combined-ca-bundle\") pod \"7bb170d1-6e83-49fc-925f-6020490e5da7\" (UID: \"7bb170d1-6e83-49fc-925f-6020490e5da7\") " Mar 17 01:37:00 crc kubenswrapper[4755]: I0317 01:37:00.761485 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sx69\" (UniqueName: \"kubernetes.io/projected/7bb170d1-6e83-49fc-925f-6020490e5da7-kube-api-access-4sx69\") pod 
\"7bb170d1-6e83-49fc-925f-6020490e5da7\" (UID: \"7bb170d1-6e83-49fc-925f-6020490e5da7\") " Mar 17 01:37:00 crc kubenswrapper[4755]: I0317 01:37:00.768582 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "7bb170d1-6e83-49fc-925f-6020490e5da7" (UID: "7bb170d1-6e83-49fc-925f-6020490e5da7"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:00 crc kubenswrapper[4755]: I0317 01:37:00.771638 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-config-data" (OuterVolumeSpecName: "config-data") pod "7bb170d1-6e83-49fc-925f-6020490e5da7" (UID: "7bb170d1-6e83-49fc-925f-6020490e5da7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:00 crc kubenswrapper[4755]: I0317 01:37:00.791652 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb170d1-6e83-49fc-925f-6020490e5da7-kube-api-access-4sx69" (OuterVolumeSpecName: "kube-api-access-4sx69") pod "7bb170d1-6e83-49fc-925f-6020490e5da7" (UID: "7bb170d1-6e83-49fc-925f-6020490e5da7"). InnerVolumeSpecName "kube-api-access-4sx69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:37:00 crc kubenswrapper[4755]: I0317 01:37:00.796845 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7bb170d1-6e83-49fc-925f-6020490e5da7" (UID: "7bb170d1-6e83-49fc-925f-6020490e5da7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:00 crc kubenswrapper[4755]: I0317 01:37:00.864479 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:00 crc kubenswrapper[4755]: I0317 01:37:00.864514 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sx69\" (UniqueName: \"kubernetes.io/projected/7bb170d1-6e83-49fc-925f-6020490e5da7-kube-api-access-4sx69\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:00 crc kubenswrapper[4755]: I0317 01:37:00.864525 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:00 crc kubenswrapper[4755]: I0317 01:37:00.864533 4755 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/7bb170d1-6e83-49fc-925f-6020490e5da7-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.061664 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-dtxkj" event={"ID":"7bb170d1-6e83-49fc-925f-6020490e5da7","Type":"ContainerDied","Data":"922b7c25a197a46c9645be12947e40555848be1b1deae41f89e4c0bc4e8f2bf0"} Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.061700 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="922b7c25a197a46c9645be12947e40555848be1b1deae41f89e4c0bc4e8f2bf0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.061720 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-dtxkj" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.333853 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 17 01:37:01 crc kubenswrapper[4755]: E0317 01:37:01.351258 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb170d1-6e83-49fc-925f-6020490e5da7" containerName="manila-db-sync" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.351279 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb170d1-6e83-49fc-925f-6020490e5da7" containerName="manila-db-sync" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.351510 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bb170d1-6e83-49fc-925f-6020490e5da7" containerName="manila-db-sync" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.352669 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.360695 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.360866 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.360972 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-twjcr" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.361074 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.376195 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.381645 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.383965 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.423480 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.459252 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.477115 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.477173 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmzsg\" (UniqueName: \"kubernetes.io/projected/ef43e859-ce33-425b-9a27-1a26d692e3f0-kube-api-access-cmzsg\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.477209 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ef43e859-ce33-425b-9a27-1a26d692e3f0-ceph\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.477241 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.477278 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-scripts\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.477316 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312ab442-b334-4cdf-813e-a63285071076-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.477336 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-scripts\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.477362 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.477382 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-config-data\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.477412 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.477485 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq9sq\" (UniqueName: \"kubernetes.io/projected/312ab442-b334-4cdf-813e-a63285071076-kube-api-access-jq9sq\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.477517 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ef43e859-ce33-425b-9a27-1a26d692e3f0-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.477561 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef43e859-ce33-425b-9a27-1a26d692e3f0-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.477577 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-config-data\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.542674 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74cfff99f-cnbmp"] Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.551728 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.581562 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.581624 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-ovsdbserver-nb\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.581649 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25r9n\" (UniqueName: \"kubernetes.io/projected/e8c45f18-80d3-466b-9abe-ebb64d80c285-kube-api-access-25r9n\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.581686 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq9sq\" (UniqueName: 
\"kubernetes.io/projected/312ab442-b334-4cdf-813e-a63285071076-kube-api-access-jq9sq\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.581726 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ef43e859-ce33-425b-9a27-1a26d692e3f0-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.581751 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-ovsdbserver-sb\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.581791 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef43e859-ce33-425b-9a27-1a26d692e3f0-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.581809 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-config-data\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.581833 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-dns-swift-storage-0\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.581854 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-openstack-edpm-ipam\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.581920 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.581946 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmzsg\" (UniqueName: \"kubernetes.io/projected/ef43e859-ce33-425b-9a27-1a26d692e3f0-kube-api-access-cmzsg\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.581969 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-config\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.581990 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/ef43e859-ce33-425b-9a27-1a26d692e3f0-ceph\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.582014 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-dns-svc\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.582038 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.582063 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-scripts\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.582099 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312ab442-b334-4cdf-813e-a63285071076-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.582121 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-scripts\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") 
" pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.582148 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.582170 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-config-data\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.590519 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ef43e859-ce33-425b-9a27-1a26d692e3f0-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.592336 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.592460 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef43e859-ce33-425b-9a27-1a26d692e3f0-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.593969 4755 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312ab442-b334-4cdf-813e-a63285071076-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.601252 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-config-data\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.601650 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74cfff99f-cnbmp"] Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.608148 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-scripts\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.620373 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq9sq\" (UniqueName: \"kubernetes.io/projected/312ab442-b334-4cdf-813e-a63285071076-kube-api-access-jq9sq\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.627080 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmzsg\" (UniqueName: \"kubernetes.io/projected/ef43e859-ce33-425b-9a27-1a26d692e3f0-kube-api-access-cmzsg\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.631257 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-scripts\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.631993 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-config-data\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.640180 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.640593 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.643780 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ef43e859-ce33-425b-9a27-1a26d692e3f0-ceph\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.655181 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " 
pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.665732 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.667871 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.669750 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.680982 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.685823 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-scripts\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.685881 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-ovsdbserver-sb\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.685928 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-dns-swift-storage-0\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.685950 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-openstack-edpm-ipam\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.686014 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kljzw\" (UniqueName: \"kubernetes.io/projected/6e3bcab8-077a-40b3-b94a-532736227491-kube-api-access-kljzw\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.686042 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-config\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.686061 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-dns-svc\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.686082 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e3bcab8-077a-40b3-b94a-532736227491-logs\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.686109 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-config-data-custom\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.686130 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e3bcab8-077a-40b3-b94a-532736227491-etc-machine-id\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.686188 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.686208 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-ovsdbserver-nb\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.686223 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25r9n\" (UniqueName: \"kubernetes.io/projected/e8c45f18-80d3-466b-9abe-ebb64d80c285-kube-api-access-25r9n\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.686253 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-config-data\") pod 
\"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.687111 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-ovsdbserver-sb\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.687610 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-dns-swift-storage-0\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.688076 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-openstack-edpm-ipam\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.689698 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-ovsdbserver-nb\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.690670 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-dns-svc\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" 
Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.696762 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c45f18-80d3-466b-9abe-ebb64d80c285-config\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.697519 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.702744 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.716084 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25r9n\" (UniqueName: \"kubernetes.io/projected/e8c45f18-80d3-466b-9abe-ebb64d80c285-kube-api-access-25r9n\") pod \"dnsmasq-dns-74cfff99f-cnbmp\" (UID: \"e8c45f18-80d3-466b-9abe-ebb64d80c285\") " pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.796144 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.796451 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-config-data\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.796481 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-scripts\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.796592 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kljzw\" (UniqueName: \"kubernetes.io/projected/6e3bcab8-077a-40b3-b94a-532736227491-kube-api-access-kljzw\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.796628 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e3bcab8-077a-40b3-b94a-532736227491-logs\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.796650 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-config-data-custom\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.796673 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e3bcab8-077a-40b3-b94a-532736227491-etc-machine-id\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.796768 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e3bcab8-077a-40b3-b94a-532736227491-etc-machine-id\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 
01:37:01.797814 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e3bcab8-077a-40b3-b94a-532736227491-logs\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.802020 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-config-data\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.803135 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.806767 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-scripts\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.807271 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-config-data-custom\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.822998 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kljzw\" (UniqueName: \"kubernetes.io/projected/6e3bcab8-077a-40b3-b94a-532736227491-kube-api-access-kljzw\") pod \"manila-api-0\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " 
pod="openstack/manila-api-0" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.965840 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:01 crc kubenswrapper[4755]: I0317 01:37:01.989025 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 17 01:37:02 crc kubenswrapper[4755]: I0317 01:37:02.106654 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7f6c96f776-ndhg5" podUID="dae88496-2d38-4e87-bf99-c371e4af8c35" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.103:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.103:8443: connect: connection refused" Mar 17 01:37:02 crc kubenswrapper[4755]: I0317 01:37:02.571577 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 17 01:37:02 crc kubenswrapper[4755]: I0317 01:37:02.629234 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 17 01:37:02 crc kubenswrapper[4755]: W0317 01:37:02.638683 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod312ab442_b334_4cdf_813e_a63285071076.slice/crio-d20a84328f8ac6fc97be7330e894fcdad9c747f3d4c1ac5196fa9933c8ac972c WatchSource:0}: Error finding container d20a84328f8ac6fc97be7330e894fcdad9c747f3d4c1ac5196fa9933c8ac972c: Status 404 returned error can't find the container with id d20a84328f8ac6fc97be7330e894fcdad9c747f3d4c1ac5196fa9933c8ac972c Mar 17 01:37:02 crc kubenswrapper[4755]: I0317 01:37:02.790641 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74cfff99f-cnbmp"] Mar 17 01:37:02 crc kubenswrapper[4755]: W0317 01:37:02.814923 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8c45f18_80d3_466b_9abe_ebb64d80c285.slice/crio-31483d7bc8df31afcd14f11c9d43a0178489adc3a7451561c75160a8b5c882d2 WatchSource:0}: Error finding container 31483d7bc8df31afcd14f11c9d43a0178489adc3a7451561c75160a8b5c882d2: Status 404 returned error can't find the container with id 31483d7bc8df31afcd14f11c9d43a0178489adc3a7451561c75160a8b5c882d2 Mar 17 01:37:02 crc kubenswrapper[4755]: I0317 01:37:02.968960 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 17 01:37:03 crc kubenswrapper[4755]: I0317 01:37:03.085950 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"312ab442-b334-4cdf-813e-a63285071076","Type":"ContainerStarted","Data":"d20a84328f8ac6fc97be7330e894fcdad9c747f3d4c1ac5196fa9933c8ac972c"} Mar 17 01:37:03 crc kubenswrapper[4755]: I0317 01:37:03.087565 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6e3bcab8-077a-40b3-b94a-532736227491","Type":"ContainerStarted","Data":"b3d0820690c987d430a2481a6221162e7da9725e3ea17541d52e27affa3523aa"} Mar 17 01:37:03 crc kubenswrapper[4755]: I0317 01:37:03.089430 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ef43e859-ce33-425b-9a27-1a26d692e3f0","Type":"ContainerStarted","Data":"37c42c270e89710e5f2c52f89f66647f19e0e5b6f6c40c6493b434219c1f4416"} Mar 17 01:37:03 crc kubenswrapper[4755]: I0317 01:37:03.091108 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" event={"ID":"e8c45f18-80d3-466b-9abe-ebb64d80c285","Type":"ContainerStarted","Data":"31483d7bc8df31afcd14f11c9d43a0178489adc3a7451561c75160a8b5c882d2"} Mar 17 01:37:04 crc kubenswrapper[4755]: I0317 01:37:04.147788 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"312ab442-b334-4cdf-813e-a63285071076","Type":"ContainerStarted","Data":"31032f99175eb53671c2233c05b06a1b3fa9f241f7ce877736b812c47751c444"} Mar 17 01:37:04 crc kubenswrapper[4755]: I0317 01:37:04.173200 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6e3bcab8-077a-40b3-b94a-532736227491","Type":"ContainerStarted","Data":"5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966"} Mar 17 01:37:04 crc kubenswrapper[4755]: I0317 01:37:04.176110 4755 generic.go:334] "Generic (PLEG): container finished" podID="e8c45f18-80d3-466b-9abe-ebb64d80c285" containerID="fdc315d7d16e5d2ddc8e2c1e33dc151cc6841a7ad45a1bf5d320a7b4289f5282" exitCode=0 Mar 17 01:37:04 crc kubenswrapper[4755]: I0317 01:37:04.176138 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" event={"ID":"e8c45f18-80d3-466b-9abe-ebb64d80c285","Type":"ContainerDied","Data":"fdc315d7d16e5d2ddc8e2c1e33dc151cc6841a7ad45a1bf5d320a7b4289f5282"} Mar 17 01:37:04 crc kubenswrapper[4755]: I0317 01:37:04.303188 4755 scope.go:117] "RemoveContainer" containerID="4fe7fd301c6c16d64a44993c6a4e44f26fe0c5db5b0d9e4964ff95732c53aca8" Mar 17 01:37:05 crc kubenswrapper[4755]: I0317 01:37:05.073517 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 17 01:37:05 crc kubenswrapper[4755]: I0317 01:37:05.195612 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"312ab442-b334-4cdf-813e-a63285071076","Type":"ContainerStarted","Data":"060c5ba9293c47d7ff816e592e33bcd983a5e3aedceebfa91556cbf4ea438692"} Mar 17 01:37:05 crc kubenswrapper[4755]: I0317 01:37:05.215296 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6e3bcab8-077a-40b3-b94a-532736227491","Type":"ContainerStarted","Data":"eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f"} Mar 17 01:37:05 crc kubenswrapper[4755]: I0317 
01:37:05.216088 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 17 01:37:05 crc kubenswrapper[4755]: I0317 01:37:05.223112 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" event={"ID":"e8c45f18-80d3-466b-9abe-ebb64d80c285","Type":"ContainerStarted","Data":"5790bb9a0079b220c96bb5b15a8a7eef41a119d1b6159f07baffd238417a3877"} Mar 17 01:37:05 crc kubenswrapper[4755]: I0317 01:37:05.224115 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" Mar 17 01:37:05 crc kubenswrapper[4755]: I0317 01:37:05.228941 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.573448303 podStartE2EDuration="4.228919278s" podCreationTimestamp="2026-03-17 01:37:01 +0000 UTC" firstStartedPulling="2026-03-17 01:37:02.655828137 +0000 UTC m=+4497.415280420" lastFinishedPulling="2026-03-17 01:37:03.311299102 +0000 UTC m=+4498.070751395" observedRunningTime="2026-03-17 01:37:05.216211354 +0000 UTC m=+4499.975663657" watchObservedRunningTime="2026-03-17 01:37:05.228919278 +0000 UTC m=+4499.988371561" Mar 17 01:37:05 crc kubenswrapper[4755]: I0317 01:37:05.250841 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.250816661 podStartE2EDuration="4.250816661s" podCreationTimestamp="2026-03-17 01:37:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:37:05.238289772 +0000 UTC m=+4499.997742055" watchObservedRunningTime="2026-03-17 01:37:05.250816661 +0000 UTC m=+4500.010268934" Mar 17 01:37:05 crc kubenswrapper[4755]: I0317 01:37:05.282138 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74cfff99f-cnbmp" podStartSLOduration=4.282114849 
podStartE2EDuration="4.282114849s" podCreationTimestamp="2026-03-17 01:37:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:37:05.269886997 +0000 UTC m=+4500.029339280" watchObservedRunningTime="2026-03-17 01:37:05.282114849 +0000 UTC m=+4500.041567132" Mar 17 01:37:05 crc kubenswrapper[4755]: I0317 01:37:05.886600 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-54d5b659cb-h7mw4" Mar 17 01:37:06 crc kubenswrapper[4755]: I0317 01:37:06.235148 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="6e3bcab8-077a-40b3-b94a-532736227491" containerName="manila-api-log" containerID="cri-o://5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966" gracePeriod=30 Mar 17 01:37:06 crc kubenswrapper[4755]: I0317 01:37:06.235483 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="6e3bcab8-077a-40b3-b94a-532736227491" containerName="manila-api" containerID="cri-o://eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f" gracePeriod=30 Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.161590 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.256177 4755 generic.go:334] "Generic (PLEG): container finished" podID="6e3bcab8-077a-40b3-b94a-532736227491" containerID="eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f" exitCode=0 Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.256206 4755 generic.go:334] "Generic (PLEG): container finished" podID="6e3bcab8-077a-40b3-b94a-532736227491" containerID="5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966" exitCode=143 Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.256280 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.256338 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6e3bcab8-077a-40b3-b94a-532736227491","Type":"ContainerDied","Data":"eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f"} Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.256377 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6e3bcab8-077a-40b3-b94a-532736227491","Type":"ContainerDied","Data":"5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966"} Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.256398 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6e3bcab8-077a-40b3-b94a-532736227491","Type":"ContainerDied","Data":"b3d0820690c987d430a2481a6221162e7da9725e3ea17541d52e27affa3523aa"} Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.256422 4755 scope.go:117] "RemoveContainer" containerID="eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f" Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.283795 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-config-data-custom\") pod \"6e3bcab8-077a-40b3-b94a-532736227491\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.283897 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-scripts\") pod \"6e3bcab8-077a-40b3-b94a-532736227491\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.283992 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-combined-ca-bundle\") pod \"6e3bcab8-077a-40b3-b94a-532736227491\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.284030 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e3bcab8-077a-40b3-b94a-532736227491-etc-machine-id\") pod \"6e3bcab8-077a-40b3-b94a-532736227491\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.284093 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-config-data\") pod \"6e3bcab8-077a-40b3-b94a-532736227491\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.284166 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e3bcab8-077a-40b3-b94a-532736227491-logs\") pod \"6e3bcab8-077a-40b3-b94a-532736227491\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.284223 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kljzw\" (UniqueName: \"kubernetes.io/projected/6e3bcab8-077a-40b3-b94a-532736227491-kube-api-access-kljzw\") pod \"6e3bcab8-077a-40b3-b94a-532736227491\" (UID: \"6e3bcab8-077a-40b3-b94a-532736227491\") " Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.285514 4755 scope.go:117] "RemoveContainer" containerID="5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966" Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.286646 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e3bcab8-077a-40b3-b94a-532736227491-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6e3bcab8-077a-40b3-b94a-532736227491" (UID: "6e3bcab8-077a-40b3-b94a-532736227491"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.287419 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e3bcab8-077a-40b3-b94a-532736227491-logs" (OuterVolumeSpecName: "logs") pod "6e3bcab8-077a-40b3-b94a-532736227491" (UID: "6e3bcab8-077a-40b3-b94a-532736227491"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.290632 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-scripts" (OuterVolumeSpecName: "scripts") pod "6e3bcab8-077a-40b3-b94a-532736227491" (UID: "6e3bcab8-077a-40b3-b94a-532736227491"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.292649 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e3bcab8-077a-40b3-b94a-532736227491-kube-api-access-kljzw" (OuterVolumeSpecName: "kube-api-access-kljzw") pod "6e3bcab8-077a-40b3-b94a-532736227491" (UID: "6e3bcab8-077a-40b3-b94a-532736227491"). InnerVolumeSpecName "kube-api-access-kljzw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.308457 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6e3bcab8-077a-40b3-b94a-532736227491" (UID: "6e3bcab8-077a-40b3-b94a-532736227491"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.329636 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e3bcab8-077a-40b3-b94a-532736227491" (UID: "6e3bcab8-077a-40b3-b94a-532736227491"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.386512 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e3bcab8-077a-40b3-b94a-532736227491-logs\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.386538 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kljzw\" (UniqueName: \"kubernetes.io/projected/6e3bcab8-077a-40b3-b94a-532736227491-kube-api-access-kljzw\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.386550 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.386558 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.386565 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.386573 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e3bcab8-077a-40b3-b94a-532736227491-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.397714 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-config-data" (OuterVolumeSpecName: "config-data") pod "6e3bcab8-077a-40b3-b94a-532736227491" (UID: "6e3bcab8-077a-40b3-b94a-532736227491"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.479508 4755 scope.go:117] "RemoveContainer" containerID="eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f"
Mar 17 01:37:07 crc kubenswrapper[4755]: E0317 01:37:07.479966 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f\": container with ID starting with eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f not found: ID does not exist" containerID="eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.480008 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f"} err="failed to get container status \"eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f\": rpc error: code = NotFound desc = could not find container \"eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f\": container with ID starting with eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f not found: ID does not exist"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.480034 4755 scope.go:117] "RemoveContainer" containerID="5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966"
Mar 17 01:37:07 crc kubenswrapper[4755]: E0317 01:37:07.480528 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966\": container with ID starting with 5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966 not found: ID does not exist" containerID="5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.480566 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966"} err="failed to get container status \"5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966\": rpc error: code = NotFound desc = could not find container \"5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966\": container with ID starting with 5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966 not found: ID does not exist"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.480593 4755 scope.go:117] "RemoveContainer" containerID="eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.484947 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f"} err="failed to get container status \"eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f\": rpc error: code = NotFound desc = could not find container \"eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f\": container with ID starting with eaae930e123ad9beb67cf87d6182d52121a0d1868f9904c5f0eea136e2b3f91f not found: ID does not exist"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.484987 4755 scope.go:117] "RemoveContainer" containerID="5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.485367 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966"} err="failed to get container status \"5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966\": rpc error: code = NotFound desc = could not find container \"5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966\": container with ID starting with 5949232db1e4eba879a36b5f49f960751b45a55eb65e89f01be37611efc0e966 not found: ID does not exist"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.488864 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3bcab8-077a-40b3-b94a-532736227491-config-data\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.612981 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"]
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.628040 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"]
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.639834 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"]
Mar 17 01:37:07 crc kubenswrapper[4755]: E0317 01:37:07.640392 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3bcab8-077a-40b3-b94a-532736227491" containerName="manila-api-log"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.640409 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3bcab8-077a-40b3-b94a-532736227491" containerName="manila-api-log"
Mar 17 01:37:07 crc kubenswrapper[4755]: E0317 01:37:07.640455 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3bcab8-077a-40b3-b94a-532736227491" containerName="manila-api"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.640463 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3bcab8-077a-40b3-b94a-532736227491" containerName="manila-api"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.640706 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e3bcab8-077a-40b3-b94a-532736227491" containerName="manila-api-log"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.640721 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e3bcab8-077a-40b3-b94a-532736227491" containerName="manila-api"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.641969 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.650214 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.652014 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.652032 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.652156 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.794040 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-internal-tls-certs\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.794106 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-config-data\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.794125 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.794161 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-scripts\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.794174 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-config-data-custom\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.794189 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/763b47f1-98b0-4ebc-970c-adfcac1aee29-etc-machine-id\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.794208 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4vdb\" (UniqueName: \"kubernetes.io/projected/763b47f1-98b0-4ebc-970c-adfcac1aee29-kube-api-access-j4vdb\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.794233 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763b47f1-98b0-4ebc-970c-adfcac1aee29-logs\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.794256 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-public-tls-certs\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.877031 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-54d5b659cb-h7mw4"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.895916 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-config-data\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.895956 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.896011 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-scripts\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.896026 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-config-data-custom\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.896042 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/763b47f1-98b0-4ebc-970c-adfcac1aee29-etc-machine-id\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.896062 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4vdb\" (UniqueName: \"kubernetes.io/projected/763b47f1-98b0-4ebc-970c-adfcac1aee29-kube-api-access-j4vdb\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.896093 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763b47f1-98b0-4ebc-970c-adfcac1aee29-logs\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.896119 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-public-tls-certs\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.896211 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/763b47f1-98b0-4ebc-970c-adfcac1aee29-etc-machine-id\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.896247 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-internal-tls-certs\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.897988 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/763b47f1-98b0-4ebc-970c-adfcac1aee29-logs\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.901815 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-public-tls-certs\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.904293 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-config-data\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.904509 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-scripts\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.906657 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.909122 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-config-data-custom\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.918257 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4vdb\" (UniqueName: \"kubernetes.io/projected/763b47f1-98b0-4ebc-970c-adfcac1aee29-kube-api-access-j4vdb\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.921155 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/763b47f1-98b0-4ebc-970c-adfcac1aee29-internal-tls-certs\") pod \"manila-api-0\" (UID: \"763b47f1-98b0-4ebc-970c-adfcac1aee29\") " pod="openstack/manila-api-0"
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.967785 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f6c96f776-ndhg5"]
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.968245 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7f6c96f776-ndhg5" podUID="dae88496-2d38-4e87-bf99-c371e4af8c35" containerName="horizon-log" containerID="cri-o://4a701b4e9a067233672effaac585e80e05c9b1c0d3dd8d44dda4a23be5c25fa0" gracePeriod=30
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.968718 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7f6c96f776-ndhg5" podUID="dae88496-2d38-4e87-bf99-c371e4af8c35" containerName="horizon" containerID="cri-o://f7567997308408dbb2139de2595e00ba6ae6735c9cda2535a531582128a9487a" gracePeriod=30
Mar 17 01:37:07 crc kubenswrapper[4755]: I0317 01:37:07.976754 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Mar 17 01:37:08 crc kubenswrapper[4755]: I0317 01:37:08.270240 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e3bcab8-077a-40b3-b94a-532736227491" path="/var/lib/kubelet/pods/6e3bcab8-077a-40b3-b94a-532736227491/volumes"
Mar 17 01:37:08 crc kubenswrapper[4755]: I0317 01:37:08.273629 4755 generic.go:334] "Generic (PLEG): container finished" podID="dae88496-2d38-4e87-bf99-c371e4af8c35" containerID="f7567997308408dbb2139de2595e00ba6ae6735c9cda2535a531582128a9487a" exitCode=0
Mar 17 01:37:08 crc kubenswrapper[4755]: I0317 01:37:08.273666 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f6c96f776-ndhg5" event={"ID":"dae88496-2d38-4e87-bf99-c371e4af8c35","Type":"ContainerDied","Data":"f7567997308408dbb2139de2595e00ba6ae6735c9cda2535a531582128a9487a"}
Mar 17 01:37:08 crc kubenswrapper[4755]: I0317 01:37:08.612727 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Mar 17 01:37:09 crc kubenswrapper[4755]: I0317 01:37:09.290474 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"763b47f1-98b0-4ebc-970c-adfcac1aee29","Type":"ContainerStarted","Data":"7543d1a8b541cb95ed708e7b66259be92d3770e81358ae386639cd5ebf365b4e"}
Mar 17 01:37:09 crc kubenswrapper[4755]: I0317 01:37:09.290966 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"763b47f1-98b0-4ebc-970c-adfcac1aee29","Type":"ContainerStarted","Data":"e45a1d987c089771134182a1d648910ed791ea658ffed73dc8edfdb97f6f994f"}
Mar 17 01:37:10 crc kubenswrapper[4755]: I0317 01:37:10.063931 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 01:37:10 crc kubenswrapper[4755]: I0317 01:37:10.064256 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerName="ceilometer-notification-agent" containerID="cri-o://a0609fe4dc099aa0969597a5f395df67ab85042e2479a4b5f9cb65e614573d13" gracePeriod=30
Mar 17 01:37:10 crc kubenswrapper[4755]: I0317 01:37:10.064304 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerName="ceilometer-central-agent" containerID="cri-o://d3b98116772836c60e1e08a9360ec8f9b82ffe2202f53e426b293edf4ae83dc9" gracePeriod=30
Mar 17 01:37:10 crc kubenswrapper[4755]: I0317 01:37:10.064312 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerName="proxy-httpd" containerID="cri-o://224e14dbc7eec156efe79008bd2fa069313ff6fe8b20055b99bf0fdaad290fea" gracePeriod=30
Mar 17 01:37:10 crc kubenswrapper[4755]: I0317 01:37:10.064370 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerName="sg-core" containerID="cri-o://70b650ccdb95eef8fd44e4bfec6fbf5cfa3b969796c45e8ded6e9ee8fa0309c3" gracePeriod=30
Mar 17 01:37:10 crc kubenswrapper[4755]: I0317 01:37:10.305203 4755 generic.go:334] "Generic (PLEG): container finished" podID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerID="224e14dbc7eec156efe79008bd2fa069313ff6fe8b20055b99bf0fdaad290fea" exitCode=0
Mar 17 01:37:10 crc kubenswrapper[4755]: I0317 01:37:10.305503 4755 generic.go:334] "Generic (PLEG): container finished" podID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerID="70b650ccdb95eef8fd44e4bfec6fbf5cfa3b969796c45e8ded6e9ee8fa0309c3" exitCode=2
Mar 17 01:37:10 crc kubenswrapper[4755]: I0317 01:37:10.305280 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e","Type":"ContainerDied","Data":"224e14dbc7eec156efe79008bd2fa069313ff6fe8b20055b99bf0fdaad290fea"}
Mar 17 01:37:10 crc kubenswrapper[4755]: I0317 01:37:10.305548 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e","Type":"ContainerDied","Data":"70b650ccdb95eef8fd44e4bfec6fbf5cfa3b969796c45e8ded6e9ee8fa0309c3"}
Mar 17 01:37:11 crc kubenswrapper[4755]: I0317 01:37:11.329901 4755 generic.go:334] "Generic (PLEG): container finished" podID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerID="d3b98116772836c60e1e08a9360ec8f9b82ffe2202f53e426b293edf4ae83dc9" exitCode=0
Mar 17 01:37:11 crc kubenswrapper[4755]: I0317 01:37:11.329933 4755 generic.go:334] "Generic (PLEG): container finished" podID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerID="a0609fe4dc099aa0969597a5f395df67ab85042e2479a4b5f9cb65e614573d13" exitCode=0
Mar 17 01:37:11 crc kubenswrapper[4755]: I0317 01:37:11.329950 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e","Type":"ContainerDied","Data":"d3b98116772836c60e1e08a9360ec8f9b82ffe2202f53e426b293edf4ae83dc9"}
Mar 17 01:37:11 crc kubenswrapper[4755]: I0317 01:37:11.329974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e","Type":"ContainerDied","Data":"a0609fe4dc099aa0969597a5f395df67ab85042e2479a4b5f9cb65e614573d13"}
Mar 17 01:37:11 crc kubenswrapper[4755]: I0317 01:37:11.682089 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0"
Mar 17 01:37:11 crc kubenswrapper[4755]: I0317 01:37:11.967693 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74cfff99f-cnbmp"
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.026533 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-768b698657-5qzjn"]
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.026927 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-768b698657-5qzjn" podUID="7cda8da0-db77-49cd-b85f-06335137c116" containerName="dnsmasq-dns" containerID="cri-o://b7e8f96ed0a426482364cb2ffb55613f5d9662c6ba6e9a1b512dbf7566ce288c" gracePeriod=10
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.402348 4755 generic.go:334] "Generic (PLEG): container finished" podID="7cda8da0-db77-49cd-b85f-06335137c116" containerID="b7e8f96ed0a426482364cb2ffb55613f5d9662c6ba6e9a1b512dbf7566ce288c" exitCode=0
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.402613 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768b698657-5qzjn" event={"ID":"7cda8da0-db77-49cd-b85f-06335137c116","Type":"ContainerDied","Data":"b7e8f96ed0a426482364cb2ffb55613f5d9662c6ba6e9a1b512dbf7566ce288c"}
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.644006 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768b698657-5qzjn"
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.725783 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-ovsdbserver-nb\") pod \"7cda8da0-db77-49cd-b85f-06335137c116\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") "
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.726113 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-config\") pod \"7cda8da0-db77-49cd-b85f-06335137c116\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") "
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.726174 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-dns-swift-storage-0\") pod \"7cda8da0-db77-49cd-b85f-06335137c116\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") "
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.726391 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl7xb\" (UniqueName: \"kubernetes.io/projected/7cda8da0-db77-49cd-b85f-06335137c116-kube-api-access-pl7xb\") pod \"7cda8da0-db77-49cd-b85f-06335137c116\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") "
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.726416 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-openstack-edpm-ipam\") pod \"7cda8da0-db77-49cd-b85f-06335137c116\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") "
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.726513 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-ovsdbserver-sb\") pod \"7cda8da0-db77-49cd-b85f-06335137c116\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") "
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.726544 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-dns-svc\") pod \"7cda8da0-db77-49cd-b85f-06335137c116\" (UID: \"7cda8da0-db77-49cd-b85f-06335137c116\") "
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.744701 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cda8da0-db77-49cd-b85f-06335137c116-kube-api-access-pl7xb" (OuterVolumeSpecName: "kube-api-access-pl7xb") pod "7cda8da0-db77-49cd-b85f-06335137c116" (UID: "7cda8da0-db77-49cd-b85f-06335137c116"). InnerVolumeSpecName "kube-api-access-pl7xb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.768667 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.830639 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl7xb\" (UniqueName: \"kubernetes.io/projected/7cda8da0-db77-49cd-b85f-06335137c116-kube-api-access-pl7xb\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.933210 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-run-httpd\") pod \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") "
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.933299 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-scripts\") pod \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") "
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.933361 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-config-data\") pod \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") "
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.933599 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-log-httpd\") pod \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") "
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.933854 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-sg-core-conf-yaml\") pod \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") "
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.933969 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-combined-ca-bundle\") pod \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") "
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.934052 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67hwj\" (UniqueName: \"kubernetes.io/projected/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-kube-api-access-67hwj\") pod \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") "
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.934121 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-ceilometer-tls-certs\") pod \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\" (UID: \"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e\") "
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.934661 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" (UID: "2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.935360 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:12 crc kubenswrapper[4755]: I0317 01:37:12.935844 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" (UID: "2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.003749 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-scripts" (OuterVolumeSpecName: "scripts") pod "2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" (UID: "2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.038097 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.038341 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.043608 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-kube-api-access-67hwj" (OuterVolumeSpecName: "kube-api-access-67hwj") pod "2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" (UID: "2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e"). InnerVolumeSpecName "kube-api-access-67hwj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.152382 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67hwj\" (UniqueName: \"kubernetes.io/projected/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-kube-api-access-67hwj\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.161258 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-config" (OuterVolumeSpecName: "config") pod "7cda8da0-db77-49cd-b85f-06335137c116" (UID: "7cda8da0-db77-49cd-b85f-06335137c116"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.254997 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-config\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.415700 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e","Type":"ContainerDied","Data":"aa492a567e9c8341e9bf5c53c076ca5aa8d27a85fd0e1f73fc1f187b6c92400f"}
Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.415748 4755 scope.go:117] "RemoveContainer" containerID="d3b98116772836c60e1e08a9360ec8f9b82ffe2202f53e426b293edf4ae83dc9"
Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.415884 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.428777 4755 generic.go:334] "Generic (PLEG): container finished" podID="3a9d2fca-c4bb-4822-b6cc-78c30de14b99" containerID="8c565b99dfa52270938fcbab36b28ed8c634145d0d1a43d1250f3059aaf578f3" exitCode=137
Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.428818 4755 generic.go:334] "Generic (PLEG): container finished" podID="3a9d2fca-c4bb-4822-b6cc-78c30de14b99" containerID="f8b84d883c2a5eea4285f7ebf29e74441a694c34589be269db2058965460d972" exitCode=137
Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.428898 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587c6f496f-f97jz" event={"ID":"3a9d2fca-c4bb-4822-b6cc-78c30de14b99","Type":"ContainerDied","Data":"8c565b99dfa52270938fcbab36b28ed8c634145d0d1a43d1250f3059aaf578f3"}
Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.428923 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587c6f496f-f97jz"
event={"ID":"3a9d2fca-c4bb-4822-b6cc-78c30de14b99","Type":"ContainerDied","Data":"f8b84d883c2a5eea4285f7ebf29e74441a694c34589be269db2058965460d972"} Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.430670 4755 generic.go:334] "Generic (PLEG): container finished" podID="56666739-d2a6-4842-8d9c-27ad101c9253" containerID="8e2429255ffe8d98eaa405e5ff5c7a007e00d8a35c002653e45b7da036a09622" exitCode=137 Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.430686 4755 generic.go:334] "Generic (PLEG): container finished" podID="56666739-d2a6-4842-8d9c-27ad101c9253" containerID="f670af933b31158e9096ecca49bd596686ce2a1c9788be58ac319e50c1603f02" exitCode=137 Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.430716 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-656df98fd5-c4xrz" event={"ID":"56666739-d2a6-4842-8d9c-27ad101c9253","Type":"ContainerDied","Data":"8e2429255ffe8d98eaa405e5ff5c7a007e00d8a35c002653e45b7da036a09622"} Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.430732 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-656df98fd5-c4xrz" event={"ID":"56666739-d2a6-4842-8d9c-27ad101c9253","Type":"ContainerDied","Data":"f670af933b31158e9096ecca49bd596686ce2a1c9788be58ac319e50c1603f02"} Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.452289 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768b698657-5qzjn" event={"ID":"7cda8da0-db77-49cd-b85f-06335137c116","Type":"ContainerDied","Data":"360ed184e931f4d6d67464c2d9b551e2f1fd1630946317d4406ed75f5df77b74"} Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.452311 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-768b698657-5qzjn" Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.464427 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"763b47f1-98b0-4ebc-970c-adfcac1aee29","Type":"ContainerStarted","Data":"b32e9d658b6686990955f5ae9fc7158f8a004ed8a26a64f2596ff4fc7c01872b"} Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.466222 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.964981 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7cda8da0-db77-49cd-b85f-06335137c116" (UID: "7cda8da0-db77-49cd-b85f-06335137c116"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.984348 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7cda8da0-db77-49cd-b85f-06335137c116" (UID: "7cda8da0-db77-49cd-b85f-06335137c116"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.987825 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "7cda8da0-db77-49cd-b85f-06335137c116" (UID: "7cda8da0-db77-49cd-b85f-06335137c116"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:37:13 crc kubenswrapper[4755]: I0317 01:37:13.990930 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7cda8da0-db77-49cd-b85f-06335137c116" (UID: "7cda8da0-db77-49cd-b85f-06335137c116"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.002491 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7cda8da0-db77-49cd-b85f-06335137c116" (UID: "7cda8da0-db77-49cd-b85f-06335137c116"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.017240 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.017272 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.017282 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.017292 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.017302 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cda8da0-db77-49cd-b85f-06335137c116-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.028169 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" (UID: "2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.119900 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.267397 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" (UID: "2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.277174 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" (UID: "2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.306832 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-config-data" (OuterVolumeSpecName: "config-data") pod "2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" (UID: "2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.325830 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.325860 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.325869 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.385308 4755 scope.go:117] "RemoveContainer" containerID="224e14dbc7eec156efe79008bd2fa069313ff6fe8b20055b99bf0fdaad290fea" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.408849 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.426285 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.426847 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-config-data\") pod \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.426922 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56666739-d2a6-4842-8d9c-27ad101c9253-logs\") pod \"56666739-d2a6-4842-8d9c-27ad101c9253\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.427035 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56666739-d2a6-4842-8d9c-27ad101c9253-config-data\") pod \"56666739-d2a6-4842-8d9c-27ad101c9253\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.427066 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56666739-d2a6-4842-8d9c-27ad101c9253-horizon-secret-key\") pod \"56666739-d2a6-4842-8d9c-27ad101c9253\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.427120 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pffk6\" (UniqueName: \"kubernetes.io/projected/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-kube-api-access-pffk6\") pod \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.427181 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-horizon-secret-key\") pod \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.427240 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-scripts\") pod \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.427262 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56666739-d2a6-4842-8d9c-27ad101c9253-scripts\") pod \"56666739-d2a6-4842-8d9c-27ad101c9253\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.427401 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6987\" (UniqueName: \"kubernetes.io/projected/56666739-d2a6-4842-8d9c-27ad101c9253-kube-api-access-n6987\") pod \"56666739-d2a6-4842-8d9c-27ad101c9253\" (UID: \"56666739-d2a6-4842-8d9c-27ad101c9253\") " Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.427420 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-logs\") pod \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\" (UID: \"3a9d2fca-c4bb-4822-b6cc-78c30de14b99\") " Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.428323 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-logs" (OuterVolumeSpecName: "logs") pod "3a9d2fca-c4bb-4822-b6cc-78c30de14b99" (UID: "3a9d2fca-c4bb-4822-b6cc-78c30de14b99"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.436229 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56666739-d2a6-4842-8d9c-27ad101c9253-logs" (OuterVolumeSpecName: "logs") pod "56666739-d2a6-4842-8d9c-27ad101c9253" (UID: "56666739-d2a6-4842-8d9c-27ad101c9253"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.476757 4755 scope.go:117] "RemoveContainer" containerID="70b650ccdb95eef8fd44e4bfec6fbf5cfa3b969796c45e8ded6e9ee8fa0309c3" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.485023 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=7.485000258 podStartE2EDuration="7.485000258s" podCreationTimestamp="2026-03-17 01:37:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:37:13.488170331 +0000 UTC m=+4508.247622624" watchObservedRunningTime="2026-03-17 01:37:14.485000258 +0000 UTC m=+4509.244452541" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.489349 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-kube-api-access-pffk6" (OuterVolumeSpecName: "kube-api-access-pffk6") pod "3a9d2fca-c4bb-4822-b6cc-78c30de14b99" (UID: "3a9d2fca-c4bb-4822-b6cc-78c30de14b99"). InnerVolumeSpecName "kube-api-access-pffk6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.489464 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3a9d2fca-c4bb-4822-b6cc-78c30de14b99" (UID: "3a9d2fca-c4bb-4822-b6cc-78c30de14b99"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.492941 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56666739-d2a6-4842-8d9c-27ad101c9253-config-data" (OuterVolumeSpecName: "config-data") pod "56666739-d2a6-4842-8d9c-27ad101c9253" (UID: "56666739-d2a6-4842-8d9c-27ad101c9253"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.497711 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56666739-d2a6-4842-8d9c-27ad101c9253-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "56666739-d2a6-4842-8d9c-27ad101c9253" (UID: "56666739-d2a6-4842-8d9c-27ad101c9253"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.500635 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56666739-d2a6-4842-8d9c-27ad101c9253-kube-api-access-n6987" (OuterVolumeSpecName: "kube-api-access-n6987") pod "56666739-d2a6-4842-8d9c-27ad101c9253" (UID: "56666739-d2a6-4842-8d9c-27ad101c9253"). InnerVolumeSpecName "kube-api-access-n6987". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.530674 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pffk6\" (UniqueName: \"kubernetes.io/projected/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-kube-api-access-pffk6\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.530929 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.530996 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6987\" (UniqueName: \"kubernetes.io/projected/56666739-d2a6-4842-8d9c-27ad101c9253-kube-api-access-n6987\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.531052 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.531115 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56666739-d2a6-4842-8d9c-27ad101c9253-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.531177 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56666739-d2a6-4842-8d9c-27ad101c9253-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.531237 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56666739-d2a6-4842-8d9c-27ad101c9253-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.582790 4755 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-scripts" (OuterVolumeSpecName: "scripts") pod "3a9d2fca-c4bb-4822-b6cc-78c30de14b99" (UID: "3a9d2fca-c4bb-4822-b6cc-78c30de14b99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.583123 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587c6f496f-f97jz" event={"ID":"3a9d2fca-c4bb-4822-b6cc-78c30de14b99","Type":"ContainerDied","Data":"bcc9ce421f395834bc05391d104b9d40ca7221c0f53095cf9d3bf00c9c4e8464"} Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.583243 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-587c6f496f-f97jz" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.599636 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.604531 4755 scope.go:117] "RemoveContainer" containerID="a0609fe4dc099aa0969597a5f395df67ab85042e2479a4b5f9cb65e614573d13" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.604707 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-656df98fd5-c4xrz" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.604847 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-656df98fd5-c4xrz" event={"ID":"56666739-d2a6-4842-8d9c-27ad101c9253","Type":"ContainerDied","Data":"3f720d9061a778cbfe2f9d387eeec900fffe8823647c44394ce30cc3e3ad70b2"} Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.618520 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56666739-d2a6-4842-8d9c-27ad101c9253-scripts" (OuterVolumeSpecName: "scripts") pod "56666739-d2a6-4842-8d9c-27ad101c9253" (UID: "56666739-d2a6-4842-8d9c-27ad101c9253"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.619881 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.620678 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ef43e859-ce33-425b-9a27-1a26d692e3f0","Type":"ContainerStarted","Data":"bb173aa6069a33cbb40a3819af62c745a5830bb5b5fa9685ef94d526df303224"} Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.627825 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-config-data" (OuterVolumeSpecName: "config-data") pod "3a9d2fca-c4bb-4822-b6cc-78c30de14b99" (UID: "3a9d2fca-c4bb-4822-b6cc-78c30de14b99"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.633753 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.633951 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a9d2fca-c4bb-4822-b6cc-78c30de14b99-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.634044 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56666739-d2a6-4842-8d9c-27ad101c9253-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.639765 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-768b698657-5qzjn"] Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.650370 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:37:14 crc kubenswrapper[4755]: E0317 01:37:14.651097 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56666739-d2a6-4842-8d9c-27ad101c9253" containerName="horizon" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.651162 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="56666739-d2a6-4842-8d9c-27ad101c9253" containerName="horizon" Mar 17 01:37:14 crc kubenswrapper[4755]: E0317 01:37:14.651249 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cda8da0-db77-49cd-b85f-06335137c116" containerName="dnsmasq-dns" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.651301 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cda8da0-db77-49cd-b85f-06335137c116" containerName="dnsmasq-dns" Mar 17 01:37:14 crc kubenswrapper[4755]: E0317 01:37:14.651359 4755 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3a9d2fca-c4bb-4822-b6cc-78c30de14b99" containerName="horizon" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.651418 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9d2fca-c4bb-4822-b6cc-78c30de14b99" containerName="horizon" Mar 17 01:37:14 crc kubenswrapper[4755]: E0317 01:37:14.651490 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56666739-d2a6-4842-8d9c-27ad101c9253" containerName="horizon-log" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.651540 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="56666739-d2a6-4842-8d9c-27ad101c9253" containerName="horizon-log" Mar 17 01:37:14 crc kubenswrapper[4755]: E0317 01:37:14.651606 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerName="ceilometer-central-agent" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.651660 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerName="ceilometer-central-agent" Mar 17 01:37:14 crc kubenswrapper[4755]: E0317 01:37:14.651715 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cda8da0-db77-49cd-b85f-06335137c116" containerName="init" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.651762 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cda8da0-db77-49cd-b85f-06335137c116" containerName="init" Mar 17 01:37:14 crc kubenswrapper[4755]: E0317 01:37:14.651826 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerName="ceilometer-notification-agent" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.651879 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerName="ceilometer-notification-agent" Mar 17 01:37:14 crc kubenswrapper[4755]: E0317 01:37:14.651934 4755 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerName="proxy-httpd" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.651988 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerName="proxy-httpd" Mar 17 01:37:14 crc kubenswrapper[4755]: E0317 01:37:14.652040 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9d2fca-c4bb-4822-b6cc-78c30de14b99" containerName="horizon-log" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.652088 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9d2fca-c4bb-4822-b6cc-78c30de14b99" containerName="horizon-log" Mar 17 01:37:14 crc kubenswrapper[4755]: E0317 01:37:14.652153 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerName="sg-core" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.652201 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerName="sg-core" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.652485 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a9d2fca-c4bb-4822-b6cc-78c30de14b99" containerName="horizon-log" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.652606 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerName="proxy-httpd" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.652667 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerName="ceilometer-notification-agent" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.652733 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerName="sg-core" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.652789 4755 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="56666739-d2a6-4842-8d9c-27ad101c9253" containerName="horizon-log" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.652848 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="56666739-d2a6-4842-8d9c-27ad101c9253" containerName="horizon" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.652903 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" containerName="ceilometer-central-agent" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.652963 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cda8da0-db77-49cd-b85f-06335137c116" containerName="dnsmasq-dns" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.653021 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a9d2fca-c4bb-4822-b6cc-78c30de14b99" containerName="horizon" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.656566 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.657915 4755 scope.go:117] "RemoveContainer" containerID="b7e8f96ed0a426482364cb2ffb55613f5d9662c6ba6e9a1b512dbf7566ce288c" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.660735 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-768b698657-5qzjn"] Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.661032 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.661207 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.661237 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.680825 4755 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/ceilometer-0"] Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.687476 4755 scope.go:117] "RemoveContainer" containerID="409f2265631c6dcf4bc9166a1cbe58e6f5c0ea6017166c23defbb991feecdd6f" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.715617 4755 scope.go:117] "RemoveContainer" containerID="8c565b99dfa52270938fcbab36b28ed8c634145d0d1a43d1250f3059aaf578f3" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.737025 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.737079 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.737111 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c6a8530-167d-4e94-973c-dfeda5f3bec1-run-httpd\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.737134 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.737204 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-config-data\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.737233 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c6a8530-167d-4e94-973c-dfeda5f3bec1-log-httpd\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.737289 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hxw2\" (UniqueName: \"kubernetes.io/projected/5c6a8530-167d-4e94-973c-dfeda5f3bec1-kube-api-access-9hxw2\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.737368 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-scripts\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.839134 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hxw2\" (UniqueName: \"kubernetes.io/projected/5c6a8530-167d-4e94-973c-dfeda5f3bec1-kube-api-access-9hxw2\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.839215 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-scripts\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.839272 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.839297 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.839321 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c6a8530-167d-4e94-973c-dfeda5f3bec1-run-httpd\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.839342 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.839402 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-config-data\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: 
I0317 01:37:14.839429 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c6a8530-167d-4e94-973c-dfeda5f3bec1-log-httpd\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.839867 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c6a8530-167d-4e94-973c-dfeda5f3bec1-log-httpd\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.840955 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c6a8530-167d-4e94-973c-dfeda5f3bec1-run-httpd\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.845069 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.845367 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.845631 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-scripts\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " 
pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.847036 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.849396 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-config-data\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.863060 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hxw2\" (UniqueName: \"kubernetes.io/projected/5c6a8530-167d-4e94-973c-dfeda5f3bec1-kube-api-access-9hxw2\") pod \"ceilometer-0\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") " pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.910071 4755 scope.go:117] "RemoveContainer" containerID="f8b84d883c2a5eea4285f7ebf29e74441a694c34589be269db2058965460d972" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.948507 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-587c6f496f-f97jz"] Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.971318 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-587c6f496f-f97jz"] Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.976667 4755 scope.go:117] "RemoveContainer" containerID="8e2429255ffe8d98eaa405e5ff5c7a007e00d8a35c002653e45b7da036a09622" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.991539 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 01:37:14 crc kubenswrapper[4755]: I0317 01:37:14.995313 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-656df98fd5-c4xrz"] Mar 17 01:37:15 crc kubenswrapper[4755]: I0317 01:37:15.005062 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-656df98fd5-c4xrz"] Mar 17 01:37:15 crc kubenswrapper[4755]: I0317 01:37:15.655418 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ef43e859-ce33-425b-9a27-1a26d692e3f0","Type":"ContainerStarted","Data":"495aa7169366e270908a5d0c4a6d859318a550523be422c8555daa69b58b0cb0"} Mar 17 01:37:15 crc kubenswrapper[4755]: I0317 01:37:15.684279 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=5.11659835 podStartE2EDuration="14.684258715s" podCreationTimestamp="2026-03-17 01:37:01 +0000 UTC" firstStartedPulling="2026-03-17 01:37:02.583023976 +0000 UTC m=+4497.342476259" lastFinishedPulling="2026-03-17 01:37:12.150684321 +0000 UTC m=+4506.910136624" observedRunningTime="2026-03-17 01:37:15.677106671 +0000 UTC m=+4510.436558974" watchObservedRunningTime="2026-03-17 01:37:15.684258715 +0000 UTC m=+4510.443710998" Mar 17 01:37:15 crc kubenswrapper[4755]: I0317 01:37:15.939523 4755 scope.go:117] "RemoveContainer" containerID="f670af933b31158e9096ecca49bd596686ce2a1c9788be58ac319e50c1603f02" Mar 17 01:37:16 crc kubenswrapper[4755]: I0317 01:37:16.269769 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e" path="/var/lib/kubelet/pods/2f5fa9ba-d7d7-4b30-a3a8-97a41ddc9c1e/volumes" Mar 17 01:37:16 crc kubenswrapper[4755]: I0317 01:37:16.271375 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a9d2fca-c4bb-4822-b6cc-78c30de14b99" path="/var/lib/kubelet/pods/3a9d2fca-c4bb-4822-b6cc-78c30de14b99/volumes" Mar 17 01:37:16 crc 
kubenswrapper[4755]: I0317 01:37:16.273279 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56666739-d2a6-4842-8d9c-27ad101c9253" path="/var/lib/kubelet/pods/56666739-d2a6-4842-8d9c-27ad101c9253/volumes" Mar 17 01:37:16 crc kubenswrapper[4755]: I0317 01:37:16.274299 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cda8da0-db77-49cd-b85f-06335137c116" path="/var/lib/kubelet/pods/7cda8da0-db77-49cd-b85f-06335137c116/volumes" Mar 17 01:37:16 crc kubenswrapper[4755]: I0317 01:37:16.529695 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:37:16 crc kubenswrapper[4755]: I0317 01:37:16.671520 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c6a8530-167d-4e94-973c-dfeda5f3bec1","Type":"ContainerStarted","Data":"b71f2c57f1664a0d6fb66dd41966995483e55878a4cb6a92a06f8f283b914e7a"} Mar 17 01:37:17 crc kubenswrapper[4755]: I0317 01:37:17.509876 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:37:17 crc kubenswrapper[4755]: I0317 01:37:17.683528 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c6a8530-167d-4e94-973c-dfeda5f3bec1","Type":"ContainerStarted","Data":"5584c1e5cd41f1995548bb9018be470ebd3052a3c0a2d172d6ced86284dea626"} Mar 17 01:37:17 crc kubenswrapper[4755]: I0317 01:37:17.683573 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c6a8530-167d-4e94-973c-dfeda5f3bec1","Type":"ContainerStarted","Data":"e3a9b44ec454cdfa80134fd1b3d6a9add7ba1013db62487fb167ab79d9962745"} Mar 17 01:37:18 crc kubenswrapper[4755]: I0317 01:37:18.698079 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c6a8530-167d-4e94-973c-dfeda5f3bec1","Type":"ContainerStarted","Data":"173e91a4252aa8b88b69077c12e6e79362a5b8618182b081d935b7a3143b3831"} Mar 17 01:37:21 crc 
kubenswrapper[4755]: I0317 01:37:21.699107 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 17 01:37:21 crc kubenswrapper[4755]: I0317 01:37:21.743424 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c6a8530-167d-4e94-973c-dfeda5f3bec1","Type":"ContainerStarted","Data":"c8520e956d33d74b0a6ef9bb15bd8940f50402baf04091ab71684215013c53b4"} Mar 17 01:37:21 crc kubenswrapper[4755]: I0317 01:37:21.743602 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerName="ceilometer-central-agent" containerID="cri-o://e3a9b44ec454cdfa80134fd1b3d6a9add7ba1013db62487fb167ab79d9962745" gracePeriod=30 Mar 17 01:37:21 crc kubenswrapper[4755]: I0317 01:37:21.743667 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 01:37:21 crc kubenswrapper[4755]: I0317 01:37:21.744024 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerName="proxy-httpd" containerID="cri-o://c8520e956d33d74b0a6ef9bb15bd8940f50402baf04091ab71684215013c53b4" gracePeriod=30 Mar 17 01:37:21 crc kubenswrapper[4755]: I0317 01:37:21.744076 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerName="sg-core" containerID="cri-o://173e91a4252aa8b88b69077c12e6e79362a5b8618182b081d935b7a3143b3831" gracePeriod=30 Mar 17 01:37:21 crc kubenswrapper[4755]: I0317 01:37:21.744128 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerName="ceilometer-notification-agent" 
containerID="cri-o://5584c1e5cd41f1995548bb9018be470ebd3052a3c0a2d172d6ced86284dea626" gracePeriod=30 Mar 17 01:37:21 crc kubenswrapper[4755]: I0317 01:37:21.791518 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.757223917 podStartE2EDuration="7.791497526s" podCreationTimestamp="2026-03-17 01:37:14 +0000 UTC" firstStartedPulling="2026-03-17 01:37:16.51772461 +0000 UTC m=+4511.277176893" lastFinishedPulling="2026-03-17 01:37:20.551998199 +0000 UTC m=+4515.311450502" observedRunningTime="2026-03-17 01:37:21.773942421 +0000 UTC m=+4516.533394734" watchObservedRunningTime="2026-03-17 01:37:21.791497526 +0000 UTC m=+4516.550949809" Mar 17 01:37:22 crc kubenswrapper[4755]: I0317 01:37:22.768192 4755 generic.go:334] "Generic (PLEG): container finished" podID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerID="c8520e956d33d74b0a6ef9bb15bd8940f50402baf04091ab71684215013c53b4" exitCode=0 Mar 17 01:37:22 crc kubenswrapper[4755]: I0317 01:37:22.768884 4755 generic.go:334] "Generic (PLEG): container finished" podID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerID="173e91a4252aa8b88b69077c12e6e79362a5b8618182b081d935b7a3143b3831" exitCode=2 Mar 17 01:37:22 crc kubenswrapper[4755]: I0317 01:37:22.768894 4755 generic.go:334] "Generic (PLEG): container finished" podID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerID="5584c1e5cd41f1995548bb9018be470ebd3052a3c0a2d172d6ced86284dea626" exitCode=0 Mar 17 01:37:22 crc kubenswrapper[4755]: I0317 01:37:22.768891 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c6a8530-167d-4e94-973c-dfeda5f3bec1","Type":"ContainerDied","Data":"c8520e956d33d74b0a6ef9bb15bd8940f50402baf04091ab71684215013c53b4"} Mar 17 01:37:22 crc kubenswrapper[4755]: I0317 01:37:22.768965 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5c6a8530-167d-4e94-973c-dfeda5f3bec1","Type":"ContainerDied","Data":"173e91a4252aa8b88b69077c12e6e79362a5b8618182b081d935b7a3143b3831"} Mar 17 01:37:22 crc kubenswrapper[4755]: I0317 01:37:22.768980 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c6a8530-167d-4e94-973c-dfeda5f3bec1","Type":"ContainerDied","Data":"5584c1e5cd41f1995548bb9018be470ebd3052a3c0a2d172d6ced86284dea626"} Mar 17 01:37:23 crc kubenswrapper[4755]: I0317 01:37:23.208923 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 17 01:37:23 crc kubenswrapper[4755]: I0317 01:37:23.256810 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Mar 17 01:37:23 crc kubenswrapper[4755]: I0317 01:37:23.777162 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="312ab442-b334-4cdf-813e-a63285071076" containerName="manila-scheduler" containerID="cri-o://31032f99175eb53671c2233c05b06a1b3fa9f241f7ce877736b812c47751c444" gracePeriod=30 Mar 17 01:37:23 crc kubenswrapper[4755]: I0317 01:37:23.777857 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="312ab442-b334-4cdf-813e-a63285071076" containerName="probe" containerID="cri-o://060c5ba9293c47d7ff816e592e33bcd983a5e3aedceebfa91556cbf4ea438692" gracePeriod=30 Mar 17 01:37:24 crc kubenswrapper[4755]: I0317 01:37:24.789164 4755 generic.go:334] "Generic (PLEG): container finished" podID="312ab442-b334-4cdf-813e-a63285071076" containerID="060c5ba9293c47d7ff816e592e33bcd983a5e3aedceebfa91556cbf4ea438692" exitCode=0 Mar 17 01:37:24 crc kubenswrapper[4755]: I0317 01:37:24.789593 4755 generic.go:334] "Generic (PLEG): container finished" podID="312ab442-b334-4cdf-813e-a63285071076" containerID="31032f99175eb53671c2233c05b06a1b3fa9f241f7ce877736b812c47751c444" exitCode=0 Mar 17 01:37:24 crc 
kubenswrapper[4755]: I0317 01:37:24.789612 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"312ab442-b334-4cdf-813e-a63285071076","Type":"ContainerDied","Data":"060c5ba9293c47d7ff816e592e33bcd983a5e3aedceebfa91556cbf4ea438692"} Mar 17 01:37:24 crc kubenswrapper[4755]: I0317 01:37:24.789635 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"312ab442-b334-4cdf-813e-a63285071076","Type":"ContainerDied","Data":"31032f99175eb53671c2233c05b06a1b3fa9f241f7ce877736b812c47751c444"} Mar 17 01:37:24 crc kubenswrapper[4755]: I0317 01:37:24.980368 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.084293 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-config-data\") pod \"312ab442-b334-4cdf-813e-a63285071076\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.084338 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-config-data-custom\") pod \"312ab442-b334-4cdf-813e-a63285071076\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.084385 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq9sq\" (UniqueName: \"kubernetes.io/projected/312ab442-b334-4cdf-813e-a63285071076-kube-api-access-jq9sq\") pod \"312ab442-b334-4cdf-813e-a63285071076\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.084407 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312ab442-b334-4cdf-813e-a63285071076-etc-machine-id\") pod \"312ab442-b334-4cdf-813e-a63285071076\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.084550 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-combined-ca-bundle\") pod \"312ab442-b334-4cdf-813e-a63285071076\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.084575 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-scripts\") pod \"312ab442-b334-4cdf-813e-a63285071076\" (UID: \"312ab442-b334-4cdf-813e-a63285071076\") " Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.084849 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/312ab442-b334-4cdf-813e-a63285071076-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "312ab442-b334-4cdf-813e-a63285071076" (UID: "312ab442-b334-4cdf-813e-a63285071076"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.085515 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312ab442-b334-4cdf-813e-a63285071076-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.090538 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "312ab442-b334-4cdf-813e-a63285071076" (UID: "312ab442-b334-4cdf-813e-a63285071076"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.098446 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/312ab442-b334-4cdf-813e-a63285071076-kube-api-access-jq9sq" (OuterVolumeSpecName: "kube-api-access-jq9sq") pod "312ab442-b334-4cdf-813e-a63285071076" (UID: "312ab442-b334-4cdf-813e-a63285071076"). InnerVolumeSpecName "kube-api-access-jq9sq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.118739 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-scripts" (OuterVolumeSpecName: "scripts") pod "312ab442-b334-4cdf-813e-a63285071076" (UID: "312ab442-b334-4cdf-813e-a63285071076"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.155453 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "312ab442-b334-4cdf-813e-a63285071076" (UID: "312ab442-b334-4cdf-813e-a63285071076"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.187403 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.187435 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq9sq\" (UniqueName: \"kubernetes.io/projected/312ab442-b334-4cdf-813e-a63285071076-kube-api-access-jq9sq\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.187445 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.187466 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.252703 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-config-data" (OuterVolumeSpecName: "config-data") pod "312ab442-b334-4cdf-813e-a63285071076" (UID: "312ab442-b334-4cdf-813e-a63285071076"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.290433 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312ab442-b334-4cdf-813e-a63285071076-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.804817 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"312ab442-b334-4cdf-813e-a63285071076","Type":"ContainerDied","Data":"d20a84328f8ac6fc97be7330e894fcdad9c747f3d4c1ac5196fa9933c8ac972c"} Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.805127 4755 scope.go:117] "RemoveContainer" containerID="060c5ba9293c47d7ff816e592e33bcd983a5e3aedceebfa91556cbf4ea438692" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.805245 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.842676 4755 scope.go:117] "RemoveContainer" containerID="31032f99175eb53671c2233c05b06a1b3fa9f241f7ce877736b812c47751c444" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.856400 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.869492 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.881728 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 17 01:37:25 crc kubenswrapper[4755]: E0317 01:37:25.882334 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312ab442-b334-4cdf-813e-a63285071076" containerName="probe" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.882356 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="312ab442-b334-4cdf-813e-a63285071076" containerName="probe" Mar 17 
01:37:25 crc kubenswrapper[4755]: E0317 01:37:25.882394 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312ab442-b334-4cdf-813e-a63285071076" containerName="manila-scheduler" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.882404 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="312ab442-b334-4cdf-813e-a63285071076" containerName="manila-scheduler" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.882690 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="312ab442-b334-4cdf-813e-a63285071076" containerName="manila-scheduler" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.882724 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="312ab442-b334-4cdf-813e-a63285071076" containerName="probe" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.884383 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.887514 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 17 01:37:25 crc kubenswrapper[4755]: I0317 01:37:25.893383 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.005395 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsx22\" (UniqueName: \"kubernetes.io/projected/725e1c02-2eca-44c3-8147-8976b9742412-kube-api-access-dsx22\") pod \"manila-scheduler-0\" (UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0" Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.005820 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/725e1c02-2eca-44c3-8147-8976b9742412-etc-machine-id\") pod \"manila-scheduler-0\" 
(UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.005875 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/725e1c02-2eca-44c3-8147-8976b9742412-scripts\") pod \"manila-scheduler-0\" (UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.006314 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/725e1c02-2eca-44c3-8147-8976b9742412-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.006390 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/725e1c02-2eca-44c3-8147-8976b9742412-config-data\") pod \"manila-scheduler-0\" (UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.006428 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/725e1c02-2eca-44c3-8147-8976b9742412-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.108840 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/725e1c02-2eca-44c3-8147-8976b9742412-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.108927 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsx22\" (UniqueName: \"kubernetes.io/projected/725e1c02-2eca-44c3-8147-8976b9742412-kube-api-access-dsx22\") pod \"manila-scheduler-0\" (UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.109084 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/725e1c02-2eca-44c3-8147-8976b9742412-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.109111 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/725e1c02-2eca-44c3-8147-8976b9742412-scripts\") pod \"manila-scheduler-0\" (UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.109268 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/725e1c02-2eca-44c3-8147-8976b9742412-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.109302 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/725e1c02-2eca-44c3-8147-8976b9742412-config-data\") pod \"manila-scheduler-0\" (UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.109548 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/725e1c02-2eca-44c3-8147-8976b9742412-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.114142 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/725e1c02-2eca-44c3-8147-8976b9742412-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.114227 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/725e1c02-2eca-44c3-8147-8976b9742412-config-data\") pod \"manila-scheduler-0\" (UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.114413 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/725e1c02-2eca-44c3-8147-8976b9742412-scripts\") pod \"manila-scheduler-0\" (UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.116670 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/725e1c02-2eca-44c3-8147-8976b9742412-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.130834 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsx22\" (UniqueName: \"kubernetes.io/projected/725e1c02-2eca-44c3-8147-8976b9742412-kube-api-access-dsx22\") pod \"manila-scheduler-0\" (UID: \"725e1c02-2eca-44c3-8147-8976b9742412\") " pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.211676 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.270031 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="312ab442-b334-4cdf-813e-a63285071076" path="/var/lib/kubelet/pods/312ab442-b334-4cdf-813e-a63285071076/volumes"
Mar 17 01:37:26 crc kubenswrapper[4755]: I0317 01:37:26.808708 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.510169 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.688880 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-combined-ca-bundle\") pod \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") "
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.688974 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c6a8530-167d-4e94-973c-dfeda5f3bec1-log-httpd\") pod \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") "
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.689068 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-sg-core-conf-yaml\") pod \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") "
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.689178 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-ceilometer-tls-certs\") pod \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") "
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.689220 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c6a8530-167d-4e94-973c-dfeda5f3bec1-run-httpd\") pod \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") "
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.689249 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-scripts\") pod \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") "
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.689291 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-config-data\") pod \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") "
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.689340 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hxw2\" (UniqueName: \"kubernetes.io/projected/5c6a8530-167d-4e94-973c-dfeda5f3bec1-kube-api-access-9hxw2\") pod \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\" (UID: \"5c6a8530-167d-4e94-973c-dfeda5f3bec1\") "
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.689432 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c6a8530-167d-4e94-973c-dfeda5f3bec1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5c6a8530-167d-4e94-973c-dfeda5f3bec1" (UID: "5c6a8530-167d-4e94-973c-dfeda5f3bec1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.689942 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c6a8530-167d-4e94-973c-dfeda5f3bec1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5c6a8530-167d-4e94-973c-dfeda5f3bec1" (UID: "5c6a8530-167d-4e94-973c-dfeda5f3bec1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.690131 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c6a8530-167d-4e94-973c-dfeda5f3bec1-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.690150 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c6a8530-167d-4e94-973c-dfeda5f3bec1-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.694991 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c6a8530-167d-4e94-973c-dfeda5f3bec1-kube-api-access-9hxw2" (OuterVolumeSpecName: "kube-api-access-9hxw2") pod "5c6a8530-167d-4e94-973c-dfeda5f3bec1" (UID: "5c6a8530-167d-4e94-973c-dfeda5f3bec1"). InnerVolumeSpecName "kube-api-access-9hxw2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.695629 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-scripts" (OuterVolumeSpecName: "scripts") pod "5c6a8530-167d-4e94-973c-dfeda5f3bec1" (UID: "5c6a8530-167d-4e94-973c-dfeda5f3bec1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.760465 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5c6a8530-167d-4e94-973c-dfeda5f3bec1" (UID: "5c6a8530-167d-4e94-973c-dfeda5f3bec1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.773040 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5c6a8530-167d-4e94-973c-dfeda5f3bec1" (UID: "5c6a8530-167d-4e94-973c-dfeda5f3bec1"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.791957 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.791987 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hxw2\" (UniqueName: \"kubernetes.io/projected/5c6a8530-167d-4e94-973c-dfeda5f3bec1-kube-api-access-9hxw2\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.792000 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.792008 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.809703 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c6a8530-167d-4e94-973c-dfeda5f3bec1" (UID: "5c6a8530-167d-4e94-973c-dfeda5f3bec1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.824672 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-config-data" (OuterVolumeSpecName: "config-data") pod "5c6a8530-167d-4e94-973c-dfeda5f3bec1" (UID: "5c6a8530-167d-4e94-973c-dfeda5f3bec1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.844387 4755 generic.go:334] "Generic (PLEG): container finished" podID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerID="e3a9b44ec454cdfa80134fd1b3d6a9add7ba1013db62487fb167ab79d9962745" exitCode=0
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.844462 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c6a8530-167d-4e94-973c-dfeda5f3bec1","Type":"ContainerDied","Data":"e3a9b44ec454cdfa80134fd1b3d6a9add7ba1013db62487fb167ab79d9962745"}
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.844492 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c6a8530-167d-4e94-973c-dfeda5f3bec1","Type":"ContainerDied","Data":"b71f2c57f1664a0d6fb66dd41966995483e55878a4cb6a92a06f8f283b914e7a"}
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.844508 4755 scope.go:117] "RemoveContainer" containerID="c8520e956d33d74b0a6ef9bb15bd8940f50402baf04091ab71684215013c53b4"
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.844636 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.849009 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"725e1c02-2eca-44c3-8147-8976b9742412","Type":"ContainerStarted","Data":"04a29831db24b886f48465763529d770e2c62f7185d8525f4145db185cb0514c"}
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.849045 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"725e1c02-2eca-44c3-8147-8976b9742412","Type":"ContainerStarted","Data":"9ce23aa89b75ada4db84df7698adbb69cbd5a6f2d5bdff771b9c521ae0d73407"}
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.894591 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-config-data\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.895447 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c6a8530-167d-4e94-973c-dfeda5f3bec1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 17 01:37:27 crc kubenswrapper[4755]: I0317 01:37:27.967551 4755 scope.go:117] "RemoveContainer" containerID="173e91a4252aa8b88b69077c12e6e79362a5b8618182b081d935b7a3143b3831"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.005938 4755 scope.go:117] "RemoveContainer" containerID="5584c1e5cd41f1995548bb9018be470ebd3052a3c0a2d172d6ced86284dea626"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.010617 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.038588 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.048656 4755 scope.go:117] "RemoveContainer" containerID="e3a9b44ec454cdfa80134fd1b3d6a9add7ba1013db62487fb167ab79d9962745"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.058349 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 17 01:37:28 crc kubenswrapper[4755]: E0317 01:37:28.058868 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerName="sg-core"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.058890 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerName="sg-core"
Mar 17 01:37:28 crc kubenswrapper[4755]: E0317 01:37:28.058910 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerName="proxy-httpd"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.058917 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerName="proxy-httpd"
Mar 17 01:37:28 crc kubenswrapper[4755]: E0317 01:37:28.058952 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerName="ceilometer-notification-agent"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.058959 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerName="ceilometer-notification-agent"
Mar 17 01:37:28 crc kubenswrapper[4755]: E0317 01:37:28.058973 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerName="ceilometer-central-agent"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.058978 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerName="ceilometer-central-agent"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.059163 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerName="ceilometer-central-agent"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.059179 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerName="ceilometer-notification-agent"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.059198 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerName="sg-core"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.059210 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" containerName="proxy-httpd"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.061282 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.066566 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.066614 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.066710 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.066819 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.094037 4755 scope.go:117] "RemoveContainer" containerID="c8520e956d33d74b0a6ef9bb15bd8940f50402baf04091ab71684215013c53b4"
Mar 17 01:37:28 crc kubenswrapper[4755]: E0317 01:37:28.096071 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8520e956d33d74b0a6ef9bb15bd8940f50402baf04091ab71684215013c53b4\": container with ID starting with c8520e956d33d74b0a6ef9bb15bd8940f50402baf04091ab71684215013c53b4 not found: ID does not exist" containerID="c8520e956d33d74b0a6ef9bb15bd8940f50402baf04091ab71684215013c53b4"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.096114 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8520e956d33d74b0a6ef9bb15bd8940f50402baf04091ab71684215013c53b4"} err="failed to get container status \"c8520e956d33d74b0a6ef9bb15bd8940f50402baf04091ab71684215013c53b4\": rpc error: code = NotFound desc = could not find container \"c8520e956d33d74b0a6ef9bb15bd8940f50402baf04091ab71684215013c53b4\": container with ID starting with c8520e956d33d74b0a6ef9bb15bd8940f50402baf04091ab71684215013c53b4 not found: ID does not exist"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.096141 4755 scope.go:117] "RemoveContainer" containerID="173e91a4252aa8b88b69077c12e6e79362a5b8618182b081d935b7a3143b3831"
Mar 17 01:37:28 crc kubenswrapper[4755]: E0317 01:37:28.097498 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"173e91a4252aa8b88b69077c12e6e79362a5b8618182b081d935b7a3143b3831\": container with ID starting with 173e91a4252aa8b88b69077c12e6e79362a5b8618182b081d935b7a3143b3831 not found: ID does not exist" containerID="173e91a4252aa8b88b69077c12e6e79362a5b8618182b081d935b7a3143b3831"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.097528 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"173e91a4252aa8b88b69077c12e6e79362a5b8618182b081d935b7a3143b3831"} err="failed to get container status \"173e91a4252aa8b88b69077c12e6e79362a5b8618182b081d935b7a3143b3831\": rpc error: code = NotFound desc = could not find container \"173e91a4252aa8b88b69077c12e6e79362a5b8618182b081d935b7a3143b3831\": container with ID starting with 173e91a4252aa8b88b69077c12e6e79362a5b8618182b081d935b7a3143b3831 not found: ID does not exist"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.097549 4755 scope.go:117] "RemoveContainer" containerID="5584c1e5cd41f1995548bb9018be470ebd3052a3c0a2d172d6ced86284dea626"
Mar 17 01:37:28 crc kubenswrapper[4755]: E0317 01:37:28.098395 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5584c1e5cd41f1995548bb9018be470ebd3052a3c0a2d172d6ced86284dea626\": container with ID starting with 5584c1e5cd41f1995548bb9018be470ebd3052a3c0a2d172d6ced86284dea626 not found: ID does not exist" containerID="5584c1e5cd41f1995548bb9018be470ebd3052a3c0a2d172d6ced86284dea626"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.098461 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5584c1e5cd41f1995548bb9018be470ebd3052a3c0a2d172d6ced86284dea626"} err="failed to get container status \"5584c1e5cd41f1995548bb9018be470ebd3052a3c0a2d172d6ced86284dea626\": rpc error: code = NotFound desc = could not find container \"5584c1e5cd41f1995548bb9018be470ebd3052a3c0a2d172d6ced86284dea626\": container with ID starting with 5584c1e5cd41f1995548bb9018be470ebd3052a3c0a2d172d6ced86284dea626 not found: ID does not exist"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.098493 4755 scope.go:117] "RemoveContainer" containerID="e3a9b44ec454cdfa80134fd1b3d6a9add7ba1013db62487fb167ab79d9962745"
Mar 17 01:37:28 crc kubenswrapper[4755]: E0317 01:37:28.099206 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3a9b44ec454cdfa80134fd1b3d6a9add7ba1013db62487fb167ab79d9962745\": container with ID starting with e3a9b44ec454cdfa80134fd1b3d6a9add7ba1013db62487fb167ab79d9962745 not found: ID does not exist" containerID="e3a9b44ec454cdfa80134fd1b3d6a9add7ba1013db62487fb167ab79d9962745"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.099241 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3a9b44ec454cdfa80134fd1b3d6a9add7ba1013db62487fb167ab79d9962745"} err="failed to get container status \"e3a9b44ec454cdfa80134fd1b3d6a9add7ba1013db62487fb167ab79d9962745\": rpc error: code = NotFound desc = could not find container \"e3a9b44ec454cdfa80134fd1b3d6a9add7ba1013db62487fb167ab79d9962745\": container with ID starting with e3a9b44ec454cdfa80134fd1b3d6a9add7ba1013db62487fb167ab79d9962745 not found: ID does not exist"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.201969 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.202051 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-scripts\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.202134 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-run-httpd\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.202187 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn5fc\" (UniqueName: \"kubernetes.io/projected/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-kube-api-access-fn5fc\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.202223 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.202261 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-config-data\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.202324 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.202357 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-log-httpd\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.260099 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c6a8530-167d-4e94-973c-dfeda5f3bec1" path="/var/lib/kubelet/pods/5c6a8530-167d-4e94-973c-dfeda5f3bec1/volumes"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.304410 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.304530 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-scripts\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.304637 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-run-httpd\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.304703 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn5fc\" (UniqueName: \"kubernetes.io/projected/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-kube-api-access-fn5fc\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.304752 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.304798 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-config-data\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.304842 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.304884 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-log-httpd\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.305100 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-run-httpd\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.305409 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-log-httpd\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.310625 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-config-data\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.311341 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-scripts\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.313502 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.323014 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn5fc\" (UniqueName: \"kubernetes.io/projected/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-kube-api-access-fn5fc\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.326665 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.332268 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2948ff39-a68c-4ef2-a7d7-8eb126261ff9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2948ff39-a68c-4ef2-a7d7-8eb126261ff9\") " pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.394028 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.664917 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.665220 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.665550 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.666383 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.666439 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" gracePeriod=600
Mar 17 01:37:28 crc kubenswrapper[4755]: E0317 01:37:28.791780 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.862790 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" exitCode=0
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.862844 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f"}
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.862872 4755 scope.go:117] "RemoveContainer" containerID="b64780cce3beec691d26c9f9a13a74c3cd86654dcb8f831472c1e2735f352af6"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.863620 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f"
Mar 17 01:37:28 crc kubenswrapper[4755]: E0317 01:37:28.863942 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d"
Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.877109 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0"
event={"ID":"725e1c02-2eca-44c3-8147-8976b9742412","Type":"ContainerStarted","Data":"0106c78ad324208b60c719132be6ee477726e54893c62fba377a81a0d4e74e4f"} Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.931992 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 01:37:28 crc kubenswrapper[4755]: I0317 01:37:28.972710 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.972691092 podStartE2EDuration="3.972691092s" podCreationTimestamp="2026-03-17 01:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:37:28.925567037 +0000 UTC m=+4523.685019320" watchObservedRunningTime="2026-03-17 01:37:28.972691092 +0000 UTC m=+4523.732143375" Mar 17 01:37:29 crc kubenswrapper[4755]: I0317 01:37:29.352947 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Mar 17 01:37:29 crc kubenswrapper[4755]: I0317 01:37:29.921971 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2948ff39-a68c-4ef2-a7d7-8eb126261ff9","Type":"ContainerStarted","Data":"160739ef1e98f961888391fd2cbf785c82c4db46ffcc78334f879a12612f2b15"} Mar 17 01:37:29 crc kubenswrapper[4755]: I0317 01:37:29.922257 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2948ff39-a68c-4ef2-a7d7-8eb126261ff9","Type":"ContainerStarted","Data":"dfa8a1965717b35deb8f5c6835e27c085a074116c5baff00b262e8e16b03fa8a"} Mar 17 01:37:30 crc kubenswrapper[4755]: I0317 01:37:30.941964 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2948ff39-a68c-4ef2-a7d7-8eb126261ff9","Type":"ContainerStarted","Data":"a2ce3bcb892d35d7371d79f504a6e36e96acd7780c660f558c330ca2329d6c90"} Mar 17 01:37:30 crc kubenswrapper[4755]: I0317 01:37:30.942550 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2948ff39-a68c-4ef2-a7d7-8eb126261ff9","Type":"ContainerStarted","Data":"ba68902e4bb3d0cc338ce022dce512e06223067484961b1306a6c1edd088ffba"} Mar 17 01:37:32 crc kubenswrapper[4755]: I0317 01:37:32.967634 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2948ff39-a68c-4ef2-a7d7-8eb126261ff9","Type":"ContainerStarted","Data":"c1b85db5cf95b6ce3b82ac9a6ea1d97a696d301df315dcf29de643541cb8a363"} Mar 17 01:37:32 crc kubenswrapper[4755]: I0317 01:37:32.968467 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 01:37:33 crc kubenswrapper[4755]: I0317 01:37:33.163130 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 17 01:37:33 crc kubenswrapper[4755]: I0317 01:37:33.186050 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.469618135 podStartE2EDuration="6.18602949s" podCreationTimestamp="2026-03-17 01:37:27 +0000 UTC" firstStartedPulling="2026-03-17 01:37:28.893057236 +0000 UTC m=+4523.652509549" lastFinishedPulling="2026-03-17 01:37:32.609468621 +0000 UTC m=+4527.368920904" observedRunningTime="2026-03-17 01:37:32.990840596 +0000 UTC m=+4527.750292879" watchObservedRunningTime="2026-03-17 01:37:33.18602949 +0000 UTC m=+4527.945481773" Mar 17 01:37:33 crc kubenswrapper[4755]: I0317 01:37:33.205173 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Mar 17 01:37:33 crc kubenswrapper[4755]: I0317 01:37:33.978712 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="ef43e859-ce33-425b-9a27-1a26d692e3f0" containerName="manila-share" containerID="cri-o://bb173aa6069a33cbb40a3819af62c745a5830bb5b5fa9685ef94d526df303224" gracePeriod=30 Mar 17 01:37:33 
crc kubenswrapper[4755]: I0317 01:37:33.978799 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="ef43e859-ce33-425b-9a27-1a26d692e3f0" containerName="probe" containerID="cri-o://495aa7169366e270908a5d0c4a6d859318a550523be422c8555daa69b58b0cb0" gracePeriod=30 Mar 17 01:37:34 crc kubenswrapper[4755]: I0317 01:37:34.996196 4755 generic.go:334] "Generic (PLEG): container finished" podID="ef43e859-ce33-425b-9a27-1a26d692e3f0" containerID="495aa7169366e270908a5d0c4a6d859318a550523be422c8555daa69b58b0cb0" exitCode=0 Mar 17 01:37:34 crc kubenswrapper[4755]: I0317 01:37:34.996574 4755 generic.go:334] "Generic (PLEG): container finished" podID="ef43e859-ce33-425b-9a27-1a26d692e3f0" containerID="bb173aa6069a33cbb40a3819af62c745a5830bb5b5fa9685ef94d526df303224" exitCode=1 Mar 17 01:37:34 crc kubenswrapper[4755]: I0317 01:37:34.996295 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ef43e859-ce33-425b-9a27-1a26d692e3f0","Type":"ContainerDied","Data":"495aa7169366e270908a5d0c4a6d859318a550523be422c8555daa69b58b0cb0"} Mar 17 01:37:34 crc kubenswrapper[4755]: I0317 01:37:34.996634 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ef43e859-ce33-425b-9a27-1a26d692e3f0","Type":"ContainerDied","Data":"bb173aa6069a33cbb40a3819af62c745a5830bb5b5fa9685ef94d526df303224"} Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.568005 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.698701 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-combined-ca-bundle\") pod \"ef43e859-ce33-425b-9a27-1a26d692e3f0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.698804 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ef43e859-ce33-425b-9a27-1a26d692e3f0-var-lib-manila\") pod \"ef43e859-ce33-425b-9a27-1a26d692e3f0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.698869 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ef43e859-ce33-425b-9a27-1a26d692e3f0-ceph\") pod \"ef43e859-ce33-425b-9a27-1a26d692e3f0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.698885 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-config-data-custom\") pod \"ef43e859-ce33-425b-9a27-1a26d692e3f0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.698905 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-scripts\") pod \"ef43e859-ce33-425b-9a27-1a26d692e3f0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.698929 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/ef43e859-ce33-425b-9a27-1a26d692e3f0-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "ef43e859-ce33-425b-9a27-1a26d692e3f0" (UID: "ef43e859-ce33-425b-9a27-1a26d692e3f0"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.698976 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef43e859-ce33-425b-9a27-1a26d692e3f0-etc-machine-id\") pod \"ef43e859-ce33-425b-9a27-1a26d692e3f0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.699000 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-config-data\") pod \"ef43e859-ce33-425b-9a27-1a26d692e3f0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.699037 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmzsg\" (UniqueName: \"kubernetes.io/projected/ef43e859-ce33-425b-9a27-1a26d692e3f0-kube-api-access-cmzsg\") pod \"ef43e859-ce33-425b-9a27-1a26d692e3f0\" (UID: \"ef43e859-ce33-425b-9a27-1a26d692e3f0\") " Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.699587 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef43e859-ce33-425b-9a27-1a26d692e3f0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ef43e859-ce33-425b-9a27-1a26d692e3f0" (UID: "ef43e859-ce33-425b-9a27-1a26d692e3f0"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.700182 4755 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ef43e859-ce33-425b-9a27-1a26d692e3f0-var-lib-manila\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.706779 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef43e859-ce33-425b-9a27-1a26d692e3f0-kube-api-access-cmzsg" (OuterVolumeSpecName: "kube-api-access-cmzsg") pod "ef43e859-ce33-425b-9a27-1a26d692e3f0" (UID: "ef43e859-ce33-425b-9a27-1a26d692e3f0"). InnerVolumeSpecName "kube-api-access-cmzsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.708791 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-scripts" (OuterVolumeSpecName: "scripts") pod "ef43e859-ce33-425b-9a27-1a26d692e3f0" (UID: "ef43e859-ce33-425b-9a27-1a26d692e3f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.708905 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef43e859-ce33-425b-9a27-1a26d692e3f0-ceph" (OuterVolumeSpecName: "ceph") pod "ef43e859-ce33-425b-9a27-1a26d692e3f0" (UID: "ef43e859-ce33-425b-9a27-1a26d692e3f0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.718189 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ef43e859-ce33-425b-9a27-1a26d692e3f0" (UID: "ef43e859-ce33-425b-9a27-1a26d692e3f0"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.772611 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef43e859-ce33-425b-9a27-1a26d692e3f0" (UID: "ef43e859-ce33-425b-9a27-1a26d692e3f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.804361 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.804408 4755 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ef43e859-ce33-425b-9a27-1a26d692e3f0-ceph\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.804419 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.804431 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.804460 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ef43e859-ce33-425b-9a27-1a26d692e3f0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.804470 4755 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-cmzsg\" (UniqueName: \"kubernetes.io/projected/ef43e859-ce33-425b-9a27-1a26d692e3f0-kube-api-access-cmzsg\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.833594 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-config-data" (OuterVolumeSpecName: "config-data") pod "ef43e859-ce33-425b-9a27-1a26d692e3f0" (UID: "ef43e859-ce33-425b-9a27-1a26d692e3f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:35 crc kubenswrapper[4755]: I0317 01:37:35.907183 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef43e859-ce33-425b-9a27-1a26d692e3f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.006509 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ef43e859-ce33-425b-9a27-1a26d692e3f0","Type":"ContainerDied","Data":"37c42c270e89710e5f2c52f89f66647f19e0e5b6f6c40c6493b434219c1f4416"} Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.006563 4755 scope.go:117] "RemoveContainer" containerID="495aa7169366e270908a5d0c4a6d859318a550523be422c8555daa69b58b0cb0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.006576 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.037729 4755 scope.go:117] "RemoveContainer" containerID="bb173aa6069a33cbb40a3819af62c745a5830bb5b5fa9685ef94d526df303224" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.058501 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.079243 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.113573 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 17 01:37:36 crc kubenswrapper[4755]: E0317 01:37:36.113999 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef43e859-ce33-425b-9a27-1a26d692e3f0" containerName="probe" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.114015 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef43e859-ce33-425b-9a27-1a26d692e3f0" containerName="probe" Mar 17 01:37:36 crc kubenswrapper[4755]: E0317 01:37:36.114052 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef43e859-ce33-425b-9a27-1a26d692e3f0" containerName="manila-share" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.114060 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef43e859-ce33-425b-9a27-1a26d692e3f0" containerName="manila-share" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.114278 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef43e859-ce33-425b-9a27-1a26d692e3f0" containerName="manila-share" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.114300 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef43e859-ce33-425b-9a27-1a26d692e3f0" containerName="probe" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.115816 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.120948 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.142340 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.212227 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.212820 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788939c2-92b3-482c-8271-08204a569e10-config-data\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.212857 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/788939c2-92b3-482c-8271-08204a569e10-scripts\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.212877 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxvvv\" (UniqueName: \"kubernetes.io/projected/788939c2-92b3-482c-8271-08204a569e10-kube-api-access-nxvvv\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.212930 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/788939c2-92b3-482c-8271-08204a569e10-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.212987 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/788939c2-92b3-482c-8271-08204a569e10-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.213019 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/788939c2-92b3-482c-8271-08204a569e10-ceph\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.213061 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788939c2-92b3-482c-8271-08204a569e10-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.213260 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/788939c2-92b3-482c-8271-08204a569e10-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.259365 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef43e859-ce33-425b-9a27-1a26d692e3f0" 
path="/var/lib/kubelet/pods/ef43e859-ce33-425b-9a27-1a26d692e3f0/volumes" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.315365 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788939c2-92b3-482c-8271-08204a569e10-config-data\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.315703 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/788939c2-92b3-482c-8271-08204a569e10-scripts\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.315804 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxvvv\" (UniqueName: \"kubernetes.io/projected/788939c2-92b3-482c-8271-08204a569e10-kube-api-access-nxvvv\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.315909 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/788939c2-92b3-482c-8271-08204a569e10-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.316049 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/788939c2-92b3-482c-8271-08204a569e10-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 
01:37:36.316150 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/788939c2-92b3-482c-8271-08204a569e10-ceph\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.316244 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788939c2-92b3-482c-8271-08204a569e10-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.316389 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/788939c2-92b3-482c-8271-08204a569e10-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.316740 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/788939c2-92b3-482c-8271-08204a569e10-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.316840 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/788939c2-92b3-482c-8271-08204a569e10-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.319996 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/788939c2-92b3-482c-8271-08204a569e10-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.320013 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/788939c2-92b3-482c-8271-08204a569e10-scripts\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.320811 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/788939c2-92b3-482c-8271-08204a569e10-ceph\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.320850 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788939c2-92b3-482c-8271-08204a569e10-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.326507 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788939c2-92b3-482c-8271-08204a569e10-config-data\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.332466 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxvvv\" (UniqueName: \"kubernetes.io/projected/788939c2-92b3-482c-8271-08204a569e10-kube-api-access-nxvvv\") pod \"manila-share-share1-0\" (UID: \"788939c2-92b3-482c-8271-08204a569e10\") " 
pod="openstack/manila-share-share1-0" Mar 17 01:37:36 crc kubenswrapper[4755]: I0317 01:37:36.456372 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 17 01:37:37 crc kubenswrapper[4755]: I0317 01:37:37.001644 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.039782 4755 generic.go:334] "Generic (PLEG): container finished" podID="dae88496-2d38-4e87-bf99-c371e4af8c35" containerID="4a701b4e9a067233672effaac585e80e05c9b1c0d3dd8d44dda4a23be5c25fa0" exitCode=137 Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.039997 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f6c96f776-ndhg5" event={"ID":"dae88496-2d38-4e87-bf99-c371e4af8c35","Type":"ContainerDied","Data":"4a701b4e9a067233672effaac585e80e05c9b1c0d3dd8d44dda4a23be5c25fa0"} Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.051896 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"788939c2-92b3-482c-8271-08204a569e10","Type":"ContainerStarted","Data":"ea6a8ee76588e5b758936eb83e9fe82f2b4d18ab8189506bf7511362532fb95e"} Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.051938 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"788939c2-92b3-482c-8271-08204a569e10","Type":"ContainerStarted","Data":"ee581de15423468d51cf27e6054ecd59d1d1f258ac812ad5849e8044dbd7422c"} Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.051948 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"788939c2-92b3-482c-8271-08204a569e10","Type":"ContainerStarted","Data":"7f502357236dbfd33debd3f99948ebf32070fbde082af6ef81d22012ba1e5a76"} Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.519476 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.554726 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=2.554706147 podStartE2EDuration="2.554706147s" podCreationTimestamp="2026-03-17 01:37:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 01:37:38.087450666 +0000 UTC m=+4532.846902949" watchObservedRunningTime="2026-03-17 01:37:38.554706147 +0000 UTC m=+4533.314158430" Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.690304 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dae88496-2d38-4e87-bf99-c371e4af8c35-logs\") pod \"dae88496-2d38-4e87-bf99-c371e4af8c35\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.690463 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dae88496-2d38-4e87-bf99-c371e4af8c35-config-data\") pod \"dae88496-2d38-4e87-bf99-c371e4af8c35\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.690540 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-combined-ca-bundle\") pod \"dae88496-2d38-4e87-bf99-c371e4af8c35\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.690564 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-horizon-tls-certs\") pod \"dae88496-2d38-4e87-bf99-c371e4af8c35\" (UID: 
\"dae88496-2d38-4e87-bf99-c371e4af8c35\") " Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.690686 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dae88496-2d38-4e87-bf99-c371e4af8c35-scripts\") pod \"dae88496-2d38-4e87-bf99-c371e4af8c35\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.690871 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-horizon-secret-key\") pod \"dae88496-2d38-4e87-bf99-c371e4af8c35\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.690936 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kchxx\" (UniqueName: \"kubernetes.io/projected/dae88496-2d38-4e87-bf99-c371e4af8c35-kube-api-access-kchxx\") pod \"dae88496-2d38-4e87-bf99-c371e4af8c35\" (UID: \"dae88496-2d38-4e87-bf99-c371e4af8c35\") " Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.690977 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dae88496-2d38-4e87-bf99-c371e4af8c35-logs" (OuterVolumeSpecName: "logs") pod "dae88496-2d38-4e87-bf99-c371e4af8c35" (UID: "dae88496-2d38-4e87-bf99-c371e4af8c35"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.691886 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dae88496-2d38-4e87-bf99-c371e4af8c35-logs\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.700336 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dae88496-2d38-4e87-bf99-c371e4af8c35" (UID: "dae88496-2d38-4e87-bf99-c371e4af8c35"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.714904 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dae88496-2d38-4e87-bf99-c371e4af8c35-kube-api-access-kchxx" (OuterVolumeSpecName: "kube-api-access-kchxx") pod "dae88496-2d38-4e87-bf99-c371e4af8c35" (UID: "dae88496-2d38-4e87-bf99-c371e4af8c35"). InnerVolumeSpecName "kube-api-access-kchxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.732638 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dae88496-2d38-4e87-bf99-c371e4af8c35" (UID: "dae88496-2d38-4e87-bf99-c371e4af8c35"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.738612 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dae88496-2d38-4e87-bf99-c371e4af8c35-config-data" (OuterVolumeSpecName: "config-data") pod "dae88496-2d38-4e87-bf99-c371e4af8c35" (UID: "dae88496-2d38-4e87-bf99-c371e4af8c35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.749131 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dae88496-2d38-4e87-bf99-c371e4af8c35-scripts" (OuterVolumeSpecName: "scripts") pod "dae88496-2d38-4e87-bf99-c371e4af8c35" (UID: "dae88496-2d38-4e87-bf99-c371e4af8c35"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.794131 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "dae88496-2d38-4e87-bf99-c371e4af8c35" (UID: "dae88496-2d38-4e87-bf99-c371e4af8c35"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.794467 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dae88496-2d38-4e87-bf99-c371e4af8c35-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.794500 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.794513 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kchxx\" (UniqueName: \"kubernetes.io/projected/dae88496-2d38-4e87-bf99-c371e4af8c35-kube-api-access-kchxx\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.794523 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dae88496-2d38-4e87-bf99-c371e4af8c35-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.794531 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:38 crc kubenswrapper[4755]: I0317 01:37:38.794542 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dae88496-2d38-4e87-bf99-c371e4af8c35-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 01:37:39 crc kubenswrapper[4755]: I0317 01:37:39.064293 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7f6c96f776-ndhg5" Mar 17 01:37:39 crc kubenswrapper[4755]: I0317 01:37:39.064424 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f6c96f776-ndhg5" event={"ID":"dae88496-2d38-4e87-bf99-c371e4af8c35","Type":"ContainerDied","Data":"9b1e77e4948524b68dab667f5bf20f948bb0969db26ab9a367c7697dc35dce44"} Mar 17 01:37:39 crc kubenswrapper[4755]: I0317 01:37:39.064475 4755 scope.go:117] "RemoveContainer" containerID="f7567997308408dbb2139de2595e00ba6ae6735c9cda2535a531582128a9487a" Mar 17 01:37:39 crc kubenswrapper[4755]: I0317 01:37:39.119189 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f6c96f776-ndhg5"] Mar 17 01:37:39 crc kubenswrapper[4755]: I0317 01:37:39.127574 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7f6c96f776-ndhg5"] Mar 17 01:37:39 crc kubenswrapper[4755]: I0317 01:37:39.307255 4755 scope.go:117] "RemoveContainer" containerID="4a701b4e9a067233672effaac585e80e05c9b1c0d3dd8d44dda4a23be5c25fa0" Mar 17 01:37:40 crc kubenswrapper[4755]: I0317 01:37:40.266801 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dae88496-2d38-4e87-bf99-c371e4af8c35" path="/var/lib/kubelet/pods/dae88496-2d38-4e87-bf99-c371e4af8c35/volumes" Mar 17 01:37:43 crc kubenswrapper[4755]: I0317 01:37:43.248084 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:37:43 crc kubenswrapper[4755]: E0317 01:37:43.248768 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:37:46 crc 
kubenswrapper[4755]: I0317 01:37:46.457496 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 17 01:37:47 crc kubenswrapper[4755]: I0317 01:37:47.655893 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 17 01:37:57 crc kubenswrapper[4755]: I0317 01:37:57.248608 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:37:57 crc kubenswrapper[4755]: E0317 01:37:57.249580 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:37:57 crc kubenswrapper[4755]: I0317 01:37:57.964093 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 17 01:37:58 crc kubenswrapper[4755]: I0317 01:37:58.418911 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 17 01:38:00 crc kubenswrapper[4755]: I0317 01:38:00.182838 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561858-kpcw7"] Mar 17 01:38:00 crc kubenswrapper[4755]: E0317 01:38:00.183449 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae88496-2d38-4e87-bf99-c371e4af8c35" containerName="horizon" Mar 17 01:38:00 crc kubenswrapper[4755]: I0317 01:38:00.183461 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae88496-2d38-4e87-bf99-c371e4af8c35" containerName="horizon" Mar 17 01:38:00 crc kubenswrapper[4755]: E0317 01:38:00.183479 4755 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="dae88496-2d38-4e87-bf99-c371e4af8c35" containerName="horizon-log" Mar 17 01:38:00 crc kubenswrapper[4755]: I0317 01:38:00.183485 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae88496-2d38-4e87-bf99-c371e4af8c35" containerName="horizon-log" Mar 17 01:38:00 crc kubenswrapper[4755]: I0317 01:38:00.183700 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="dae88496-2d38-4e87-bf99-c371e4af8c35" containerName="horizon-log" Mar 17 01:38:00 crc kubenswrapper[4755]: I0317 01:38:00.183721 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="dae88496-2d38-4e87-bf99-c371e4af8c35" containerName="horizon" Mar 17 01:38:00 crc kubenswrapper[4755]: I0317 01:38:00.184392 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561858-kpcw7" Mar 17 01:38:00 crc kubenswrapper[4755]: I0317 01:38:00.192270 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:38:00 crc kubenswrapper[4755]: I0317 01:38:00.192489 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:38:00 crc kubenswrapper[4755]: I0317 01:38:00.192542 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:38:00 crc kubenswrapper[4755]: I0317 01:38:00.205793 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561858-kpcw7"] Mar 17 01:38:00 crc kubenswrapper[4755]: I0317 01:38:00.264656 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppc8z\" (UniqueName: \"kubernetes.io/projected/17903a6c-1b58-4b2d-ab77-3c5ae172c26b-kube-api-access-ppc8z\") pod \"auto-csr-approver-29561858-kpcw7\" (UID: \"17903a6c-1b58-4b2d-ab77-3c5ae172c26b\") " 
pod="openshift-infra/auto-csr-approver-29561858-kpcw7" Mar 17 01:38:00 crc kubenswrapper[4755]: I0317 01:38:00.366760 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppc8z\" (UniqueName: \"kubernetes.io/projected/17903a6c-1b58-4b2d-ab77-3c5ae172c26b-kube-api-access-ppc8z\") pod \"auto-csr-approver-29561858-kpcw7\" (UID: \"17903a6c-1b58-4b2d-ab77-3c5ae172c26b\") " pod="openshift-infra/auto-csr-approver-29561858-kpcw7" Mar 17 01:38:00 crc kubenswrapper[4755]: I0317 01:38:00.387999 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppc8z\" (UniqueName: \"kubernetes.io/projected/17903a6c-1b58-4b2d-ab77-3c5ae172c26b-kube-api-access-ppc8z\") pod \"auto-csr-approver-29561858-kpcw7\" (UID: \"17903a6c-1b58-4b2d-ab77-3c5ae172c26b\") " pod="openshift-infra/auto-csr-approver-29561858-kpcw7" Mar 17 01:38:00 crc kubenswrapper[4755]: I0317 01:38:00.560894 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561858-kpcw7" Mar 17 01:38:01 crc kubenswrapper[4755]: I0317 01:38:01.062216 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561858-kpcw7"] Mar 17 01:38:01 crc kubenswrapper[4755]: W0317 01:38:01.063657 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17903a6c_1b58_4b2d_ab77_3c5ae172c26b.slice/crio-5afb8e605cb154e78c20769205fcad6f3a1e769c3739357eae503c54c1b662a9 WatchSource:0}: Error finding container 5afb8e605cb154e78c20769205fcad6f3a1e769c3739357eae503c54c1b662a9: Status 404 returned error can't find the container with id 5afb8e605cb154e78c20769205fcad6f3a1e769c3739357eae503c54c1b662a9 Mar 17 01:38:01 crc kubenswrapper[4755]: I0317 01:38:01.363532 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561858-kpcw7" 
event={"ID":"17903a6c-1b58-4b2d-ab77-3c5ae172c26b","Type":"ContainerStarted","Data":"5afb8e605cb154e78c20769205fcad6f3a1e769c3739357eae503c54c1b662a9"} Mar 17 01:38:03 crc kubenswrapper[4755]: I0317 01:38:03.388847 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561858-kpcw7" event={"ID":"17903a6c-1b58-4b2d-ab77-3c5ae172c26b","Type":"ContainerStarted","Data":"60c3933a90556e3307c8a1f1acd4d7820347b7e8e19e3aafba2ba80840fdb46b"} Mar 17 01:38:03 crc kubenswrapper[4755]: I0317 01:38:03.421901 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561858-kpcw7" podStartSLOduration=2.105497007 podStartE2EDuration="3.421867845s" podCreationTimestamp="2026-03-17 01:38:00 +0000 UTC" firstStartedPulling="2026-03-17 01:38:01.065249914 +0000 UTC m=+4555.824702207" lastFinishedPulling="2026-03-17 01:38:02.381620732 +0000 UTC m=+4557.141073045" observedRunningTime="2026-03-17 01:38:03.407382733 +0000 UTC m=+4558.166835026" watchObservedRunningTime="2026-03-17 01:38:03.421867845 +0000 UTC m=+4558.181320168" Mar 17 01:38:04 crc kubenswrapper[4755]: E0317 01:38:04.233689 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17903a6c_1b58_4b2d_ab77_3c5ae172c26b.slice/crio-60c3933a90556e3307c8a1f1acd4d7820347b7e8e19e3aafba2ba80840fdb46b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17903a6c_1b58_4b2d_ab77_3c5ae172c26b.slice/crio-conmon-60c3933a90556e3307c8a1f1acd4d7820347b7e8e19e3aafba2ba80840fdb46b.scope\": RecentStats: unable to find data in memory cache]" Mar 17 01:38:04 crc kubenswrapper[4755]: I0317 01:38:04.402403 4755 generic.go:334] "Generic (PLEG): container finished" podID="17903a6c-1b58-4b2d-ab77-3c5ae172c26b" 
containerID="60c3933a90556e3307c8a1f1acd4d7820347b7e8e19e3aafba2ba80840fdb46b" exitCode=0 Mar 17 01:38:04 crc kubenswrapper[4755]: I0317 01:38:04.402463 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561858-kpcw7" event={"ID":"17903a6c-1b58-4b2d-ab77-3c5ae172c26b","Type":"ContainerDied","Data":"60c3933a90556e3307c8a1f1acd4d7820347b7e8e19e3aafba2ba80840fdb46b"} Mar 17 01:38:05 crc kubenswrapper[4755]: I0317 01:38:05.890210 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561858-kpcw7" Mar 17 01:38:06 crc kubenswrapper[4755]: I0317 01:38:06.004355 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppc8z\" (UniqueName: \"kubernetes.io/projected/17903a6c-1b58-4b2d-ab77-3c5ae172c26b-kube-api-access-ppc8z\") pod \"17903a6c-1b58-4b2d-ab77-3c5ae172c26b\" (UID: \"17903a6c-1b58-4b2d-ab77-3c5ae172c26b\") " Mar 17 01:38:06 crc kubenswrapper[4755]: I0317 01:38:06.013710 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17903a6c-1b58-4b2d-ab77-3c5ae172c26b-kube-api-access-ppc8z" (OuterVolumeSpecName: "kube-api-access-ppc8z") pod "17903a6c-1b58-4b2d-ab77-3c5ae172c26b" (UID: "17903a6c-1b58-4b2d-ab77-3c5ae172c26b"). InnerVolumeSpecName "kube-api-access-ppc8z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:38:06 crc kubenswrapper[4755]: I0317 01:38:06.107367 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppc8z\" (UniqueName: \"kubernetes.io/projected/17903a6c-1b58-4b2d-ab77-3c5ae172c26b-kube-api-access-ppc8z\") on node \"crc\" DevicePath \"\"" Mar 17 01:38:06 crc kubenswrapper[4755]: I0317 01:38:06.424357 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561858-kpcw7" event={"ID":"17903a6c-1b58-4b2d-ab77-3c5ae172c26b","Type":"ContainerDied","Data":"5afb8e605cb154e78c20769205fcad6f3a1e769c3739357eae503c54c1b662a9"} Mar 17 01:38:06 crc kubenswrapper[4755]: I0317 01:38:06.424881 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5afb8e605cb154e78c20769205fcad6f3a1e769c3739357eae503c54c1b662a9" Mar 17 01:38:06 crc kubenswrapper[4755]: I0317 01:38:06.424952 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561858-kpcw7" Mar 17 01:38:06 crc kubenswrapper[4755]: I0317 01:38:06.533988 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561852-gblns"] Mar 17 01:38:06 crc kubenswrapper[4755]: I0317 01:38:06.534228 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561852-gblns"] Mar 17 01:38:08 crc kubenswrapper[4755]: I0317 01:38:08.267974 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83be5c50-751a-4035-b4ed-5c146f257516" path="/var/lib/kubelet/pods/83be5c50-751a-4035-b4ed-5c146f257516/volumes" Mar 17 01:38:10 crc kubenswrapper[4755]: I0317 01:38:10.253757 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:38:10 crc kubenswrapper[4755]: E0317 01:38:10.254197 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:38:23 crc kubenswrapper[4755]: I0317 01:38:23.248411 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:38:23 crc kubenswrapper[4755]: E0317 01:38:23.249224 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:38:37 crc kubenswrapper[4755]: I0317 01:38:37.248677 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:38:37 crc kubenswrapper[4755]: E0317 01:38:37.249736 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:38:49 crc kubenswrapper[4755]: I0317 01:38:49.249475 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:38:49 crc kubenswrapper[4755]: E0317 01:38:49.250911 4755 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:39:01 crc kubenswrapper[4755]: I0317 01:39:01.248613 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:39:01 crc kubenswrapper[4755]: E0317 01:39:01.249327 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:39:04 crc kubenswrapper[4755]: I0317 01:39:04.717587 4755 scope.go:117] "RemoveContainer" containerID="f4144484dabfd460b690cf4a3b17e7061ae53d51f87fd9d48e284331f7764a81" Mar 17 01:39:15 crc kubenswrapper[4755]: I0317 01:39:15.248866 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:39:15 crc kubenswrapper[4755]: E0317 01:39:15.249813 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:39:22 crc kubenswrapper[4755]: E0317 01:39:22.847084 4755 upgradeaware.go:427] 
Error proxying data from client to backend: readfrom tcp 38.102.83.32:35560->38.102.83.32:36119: write tcp 38.102.83.32:35560->38.102.83.32:36119: write: broken pipe Mar 17 01:39:30 crc kubenswrapper[4755]: I0317 01:39:30.248869 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:39:30 crc kubenswrapper[4755]: E0317 01:39:30.250062 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:39:42 crc kubenswrapper[4755]: I0317 01:39:42.252603 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:39:42 crc kubenswrapper[4755]: E0317 01:39:42.253604 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:39:53 crc kubenswrapper[4755]: I0317 01:39:53.248019 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:39:53 crc kubenswrapper[4755]: E0317 01:39:53.250316 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:40:00 crc kubenswrapper[4755]: I0317 01:40:00.151821 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561860-5mppd"] Mar 17 01:40:00 crc kubenswrapper[4755]: E0317 01:40:00.152949 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17903a6c-1b58-4b2d-ab77-3c5ae172c26b" containerName="oc" Mar 17 01:40:00 crc kubenswrapper[4755]: I0317 01:40:00.152965 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="17903a6c-1b58-4b2d-ab77-3c5ae172c26b" containerName="oc" Mar 17 01:40:00 crc kubenswrapper[4755]: I0317 01:40:00.153226 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="17903a6c-1b58-4b2d-ab77-3c5ae172c26b" containerName="oc" Mar 17 01:40:00 crc kubenswrapper[4755]: I0317 01:40:00.154198 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561860-5mppd" Mar 17 01:40:00 crc kubenswrapper[4755]: I0317 01:40:00.155965 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:40:00 crc kubenswrapper[4755]: I0317 01:40:00.156361 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:40:00 crc kubenswrapper[4755]: I0317 01:40:00.156578 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:40:00 crc kubenswrapper[4755]: I0317 01:40:00.207539 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561860-5mppd"] Mar 17 01:40:00 crc kubenswrapper[4755]: I0317 01:40:00.255331 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvcpl\" (UniqueName: \"kubernetes.io/projected/c907864f-5243-4f38-bc4d-7fa90f2b0c90-kube-api-access-wvcpl\") pod \"auto-csr-approver-29561860-5mppd\" (UID: \"c907864f-5243-4f38-bc4d-7fa90f2b0c90\") " pod="openshift-infra/auto-csr-approver-29561860-5mppd" Mar 17 01:40:00 crc kubenswrapper[4755]: I0317 01:40:00.358415 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvcpl\" (UniqueName: \"kubernetes.io/projected/c907864f-5243-4f38-bc4d-7fa90f2b0c90-kube-api-access-wvcpl\") pod \"auto-csr-approver-29561860-5mppd\" (UID: \"c907864f-5243-4f38-bc4d-7fa90f2b0c90\") " pod="openshift-infra/auto-csr-approver-29561860-5mppd" Mar 17 01:40:00 crc kubenswrapper[4755]: I0317 01:40:00.377592 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvcpl\" (UniqueName: \"kubernetes.io/projected/c907864f-5243-4f38-bc4d-7fa90f2b0c90-kube-api-access-wvcpl\") pod \"auto-csr-approver-29561860-5mppd\" (UID: \"c907864f-5243-4f38-bc4d-7fa90f2b0c90\") " 
pod="openshift-infra/auto-csr-approver-29561860-5mppd" Mar 17 01:40:00 crc kubenswrapper[4755]: I0317 01:40:00.474621 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561860-5mppd" Mar 17 01:40:01 crc kubenswrapper[4755]: I0317 01:40:01.037283 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561860-5mppd"] Mar 17 01:40:02 crc kubenswrapper[4755]: I0317 01:40:02.034965 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561860-5mppd" event={"ID":"c907864f-5243-4f38-bc4d-7fa90f2b0c90","Type":"ContainerStarted","Data":"d583efa483f6121c0c9936abaa1706dd998d3933faf0dc80146dac18bfea3067"} Mar 17 01:40:03 crc kubenswrapper[4755]: I0317 01:40:03.054590 4755 generic.go:334] "Generic (PLEG): container finished" podID="c907864f-5243-4f38-bc4d-7fa90f2b0c90" containerID="aedacdaa5ff458545bbe19845a59d46313dfca168f42d86e17501f239826d500" exitCode=0 Mar 17 01:40:03 crc kubenswrapper[4755]: I0317 01:40:03.054682 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561860-5mppd" event={"ID":"c907864f-5243-4f38-bc4d-7fa90f2b0c90","Type":"ContainerDied","Data":"aedacdaa5ff458545bbe19845a59d46313dfca168f42d86e17501f239826d500"} Mar 17 01:40:04 crc kubenswrapper[4755]: I0317 01:40:04.586161 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561860-5mppd" Mar 17 01:40:04 crc kubenswrapper[4755]: I0317 01:40:04.663031 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvcpl\" (UniqueName: \"kubernetes.io/projected/c907864f-5243-4f38-bc4d-7fa90f2b0c90-kube-api-access-wvcpl\") pod \"c907864f-5243-4f38-bc4d-7fa90f2b0c90\" (UID: \"c907864f-5243-4f38-bc4d-7fa90f2b0c90\") " Mar 17 01:40:04 crc kubenswrapper[4755]: I0317 01:40:04.679753 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c907864f-5243-4f38-bc4d-7fa90f2b0c90-kube-api-access-wvcpl" (OuterVolumeSpecName: "kube-api-access-wvcpl") pod "c907864f-5243-4f38-bc4d-7fa90f2b0c90" (UID: "c907864f-5243-4f38-bc4d-7fa90f2b0c90"). InnerVolumeSpecName "kube-api-access-wvcpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:40:04 crc kubenswrapper[4755]: I0317 01:40:04.767192 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvcpl\" (UniqueName: \"kubernetes.io/projected/c907864f-5243-4f38-bc4d-7fa90f2b0c90-kube-api-access-wvcpl\") on node \"crc\" DevicePath \"\"" Mar 17 01:40:05 crc kubenswrapper[4755]: I0317 01:40:05.097071 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561860-5mppd" event={"ID":"c907864f-5243-4f38-bc4d-7fa90f2b0c90","Type":"ContainerDied","Data":"d583efa483f6121c0c9936abaa1706dd998d3933faf0dc80146dac18bfea3067"} Mar 17 01:40:05 crc kubenswrapper[4755]: I0317 01:40:05.097619 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d583efa483f6121c0c9936abaa1706dd998d3933faf0dc80146dac18bfea3067" Mar 17 01:40:05 crc kubenswrapper[4755]: I0317 01:40:05.097668 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561860-5mppd" Mar 17 01:40:05 crc kubenswrapper[4755]: I0317 01:40:05.690059 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561854-7mc2k"] Mar 17 01:40:05 crc kubenswrapper[4755]: I0317 01:40:05.703797 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561854-7mc2k"] Mar 17 01:40:06 crc kubenswrapper[4755]: I0317 01:40:06.274690 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ee0ca2d-e10a-4a3f-8732-efc1a21240c2" path="/var/lib/kubelet/pods/8ee0ca2d-e10a-4a3f-8732-efc1a21240c2/volumes" Mar 17 01:40:08 crc kubenswrapper[4755]: I0317 01:40:08.248474 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:40:08 crc kubenswrapper[4755]: E0317 01:40:08.249227 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:40:19 crc kubenswrapper[4755]: I0317 01:40:19.248806 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:40:19 crc kubenswrapper[4755]: E0317 01:40:19.249922 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" 
podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:40:33 crc kubenswrapper[4755]: I0317 01:40:33.249683 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:40:33 crc kubenswrapper[4755]: E0317 01:40:33.251126 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:40:47 crc kubenswrapper[4755]: I0317 01:40:47.248597 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:40:47 crc kubenswrapper[4755]: E0317 01:40:47.249745 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:41:02 crc kubenswrapper[4755]: I0317 01:41:02.248939 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:41:02 crc kubenswrapper[4755]: E0317 01:41:02.249971 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:41:04 crc kubenswrapper[4755]: I0317 01:41:04.832134 4755 scope.go:117] "RemoveContainer" containerID="dc443d4a35bd5f9a12b3efec3e4e927fc3944b1a434ecce8ed556b20e890f38f" Mar 17 01:41:16 crc kubenswrapper[4755]: I0317 01:41:16.255589 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:41:16 crc kubenswrapper[4755]: E0317 01:41:16.256270 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:41:29 crc kubenswrapper[4755]: I0317 01:41:29.248912 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:41:29 crc kubenswrapper[4755]: E0317 01:41:29.250019 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:41:43 crc kubenswrapper[4755]: I0317 01:41:43.248195 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:41:43 crc kubenswrapper[4755]: E0317 01:41:43.248974 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:41:54 crc kubenswrapper[4755]: I0317 01:41:54.249578 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:41:54 crc kubenswrapper[4755]: E0317 01:41:54.250850 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:42:00 crc kubenswrapper[4755]: I0317 01:42:00.153768 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561862-rljpb"] Mar 17 01:42:00 crc kubenswrapper[4755]: E0317 01:42:00.154745 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c907864f-5243-4f38-bc4d-7fa90f2b0c90" containerName="oc" Mar 17 01:42:00 crc kubenswrapper[4755]: I0317 01:42:00.154762 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c907864f-5243-4f38-bc4d-7fa90f2b0c90" containerName="oc" Mar 17 01:42:00 crc kubenswrapper[4755]: I0317 01:42:00.155024 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c907864f-5243-4f38-bc4d-7fa90f2b0c90" containerName="oc" Mar 17 01:42:00 crc kubenswrapper[4755]: I0317 01:42:00.156083 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561862-rljpb" Mar 17 01:42:00 crc kubenswrapper[4755]: I0317 01:42:00.159209 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:42:00 crc kubenswrapper[4755]: I0317 01:42:00.168418 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561862-rljpb"] Mar 17 01:42:00 crc kubenswrapper[4755]: I0317 01:42:00.170102 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:42:00 crc kubenswrapper[4755]: I0317 01:42:00.170136 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:42:00 crc kubenswrapper[4755]: I0317 01:42:00.232839 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl46f\" (UniqueName: \"kubernetes.io/projected/a4746f29-64b1-47a5-8ead-17cb5d4f55b0-kube-api-access-fl46f\") pod \"auto-csr-approver-29561862-rljpb\" (UID: \"a4746f29-64b1-47a5-8ead-17cb5d4f55b0\") " pod="openshift-infra/auto-csr-approver-29561862-rljpb" Mar 17 01:42:00 crc kubenswrapper[4755]: I0317 01:42:00.335896 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl46f\" (UniqueName: \"kubernetes.io/projected/a4746f29-64b1-47a5-8ead-17cb5d4f55b0-kube-api-access-fl46f\") pod \"auto-csr-approver-29561862-rljpb\" (UID: \"a4746f29-64b1-47a5-8ead-17cb5d4f55b0\") " pod="openshift-infra/auto-csr-approver-29561862-rljpb" Mar 17 01:42:00 crc kubenswrapper[4755]: I0317 01:42:00.360555 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl46f\" (UniqueName: \"kubernetes.io/projected/a4746f29-64b1-47a5-8ead-17cb5d4f55b0-kube-api-access-fl46f\") pod \"auto-csr-approver-29561862-rljpb\" (UID: \"a4746f29-64b1-47a5-8ead-17cb5d4f55b0\") " 
pod="openshift-infra/auto-csr-approver-29561862-rljpb" Mar 17 01:42:00 crc kubenswrapper[4755]: I0317 01:42:00.507173 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561862-rljpb" Mar 17 01:42:00 crc kubenswrapper[4755]: I0317 01:42:00.992376 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:42:01 crc kubenswrapper[4755]: I0317 01:42:01.001215 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561862-rljpb"] Mar 17 01:42:01 crc kubenswrapper[4755]: I0317 01:42:01.672761 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561862-rljpb" event={"ID":"a4746f29-64b1-47a5-8ead-17cb5d4f55b0","Type":"ContainerStarted","Data":"0570020fa47edd9db9a49b3358ab01354c14562989dcfa4da09ac9d90a14caad"} Mar 17 01:42:03 crc kubenswrapper[4755]: I0317 01:42:03.700135 4755 generic.go:334] "Generic (PLEG): container finished" podID="a4746f29-64b1-47a5-8ead-17cb5d4f55b0" containerID="633e64f1153eecc7ab733975566a7c12163c8e37ccff3827f0fa43e7d2984338" exitCode=0 Mar 17 01:42:03 crc kubenswrapper[4755]: I0317 01:42:03.700208 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561862-rljpb" event={"ID":"a4746f29-64b1-47a5-8ead-17cb5d4f55b0","Type":"ContainerDied","Data":"633e64f1153eecc7ab733975566a7c12163c8e37ccff3827f0fa43e7d2984338"} Mar 17 01:42:05 crc kubenswrapper[4755]: I0317 01:42:05.229690 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561862-rljpb" Mar 17 01:42:05 crc kubenswrapper[4755]: I0317 01:42:05.376258 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl46f\" (UniqueName: \"kubernetes.io/projected/a4746f29-64b1-47a5-8ead-17cb5d4f55b0-kube-api-access-fl46f\") pod \"a4746f29-64b1-47a5-8ead-17cb5d4f55b0\" (UID: \"a4746f29-64b1-47a5-8ead-17cb5d4f55b0\") " Mar 17 01:42:05 crc kubenswrapper[4755]: I0317 01:42:05.382843 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4746f29-64b1-47a5-8ead-17cb5d4f55b0-kube-api-access-fl46f" (OuterVolumeSpecName: "kube-api-access-fl46f") pod "a4746f29-64b1-47a5-8ead-17cb5d4f55b0" (UID: "a4746f29-64b1-47a5-8ead-17cb5d4f55b0"). InnerVolumeSpecName "kube-api-access-fl46f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:42:05 crc kubenswrapper[4755]: I0317 01:42:05.480867 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl46f\" (UniqueName: \"kubernetes.io/projected/a4746f29-64b1-47a5-8ead-17cb5d4f55b0-kube-api-access-fl46f\") on node \"crc\" DevicePath \"\"" Mar 17 01:42:05 crc kubenswrapper[4755]: I0317 01:42:05.724992 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561862-rljpb" event={"ID":"a4746f29-64b1-47a5-8ead-17cb5d4f55b0","Type":"ContainerDied","Data":"0570020fa47edd9db9a49b3358ab01354c14562989dcfa4da09ac9d90a14caad"} Mar 17 01:42:05 crc kubenswrapper[4755]: I0317 01:42:05.725043 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0570020fa47edd9db9a49b3358ab01354c14562989dcfa4da09ac9d90a14caad" Mar 17 01:42:05 crc kubenswrapper[4755]: I0317 01:42:05.725079 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561862-rljpb" Mar 17 01:42:06 crc kubenswrapper[4755]: I0317 01:42:06.259663 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:42:06 crc kubenswrapper[4755]: E0317 01:42:06.260367 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:42:06 crc kubenswrapper[4755]: I0317 01:42:06.320506 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561856-5w7ms"] Mar 17 01:42:06 crc kubenswrapper[4755]: I0317 01:42:06.334344 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561856-5w7ms"] Mar 17 01:42:08 crc kubenswrapper[4755]: I0317 01:42:08.259933 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f097fb2-0d82-4452-9e6a-46cd03d069bb" path="/var/lib/kubelet/pods/4f097fb2-0d82-4452-9e6a-46cd03d069bb/volumes" Mar 17 01:42:19 crc kubenswrapper[4755]: I0317 01:42:19.248337 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:42:19 crc kubenswrapper[4755]: E0317 01:42:19.249280 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" 
podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:42:31 crc kubenswrapper[4755]: I0317 01:42:31.249755 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:42:32 crc kubenswrapper[4755]: I0317 01:42:32.031579 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"ab7b0ae9369930a3ae4164ad4094be94b9e9ad2c85e4fb29456fd88028ab29ce"} Mar 17 01:42:59 crc kubenswrapper[4755]: E0317 01:42:59.511932 4755 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.32:55188->38.102.83.32:36119: read tcp 38.102.83.32:55188->38.102.83.32:36119: read: connection reset by peer Mar 17 01:43:01 crc kubenswrapper[4755]: I0317 01:43:01.935970 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nvnqj"] Mar 17 01:43:01 crc kubenswrapper[4755]: E0317 01:43:01.937021 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4746f29-64b1-47a5-8ead-17cb5d4f55b0" containerName="oc" Mar 17 01:43:01 crc kubenswrapper[4755]: I0317 01:43:01.937043 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4746f29-64b1-47a5-8ead-17cb5d4f55b0" containerName="oc" Mar 17 01:43:01 crc kubenswrapper[4755]: I0317 01:43:01.937523 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4746f29-64b1-47a5-8ead-17cb5d4f55b0" containerName="oc" Mar 17 01:43:01 crc kubenswrapper[4755]: I0317 01:43:01.940368 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nvnqj" Mar 17 01:43:01 crc kubenswrapper[4755]: I0317 01:43:01.952715 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nvnqj"] Mar 17 01:43:02 crc kubenswrapper[4755]: I0317 01:43:02.088627 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5027c68-db54-4811-8d3a-a12d44ae919d-utilities\") pod \"certified-operators-nvnqj\" (UID: \"d5027c68-db54-4811-8d3a-a12d44ae919d\") " pod="openshift-marketplace/certified-operators-nvnqj" Mar 17 01:43:02 crc kubenswrapper[4755]: I0317 01:43:02.088719 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6j9s\" (UniqueName: \"kubernetes.io/projected/d5027c68-db54-4811-8d3a-a12d44ae919d-kube-api-access-s6j9s\") pod \"certified-operators-nvnqj\" (UID: \"d5027c68-db54-4811-8d3a-a12d44ae919d\") " pod="openshift-marketplace/certified-operators-nvnqj" Mar 17 01:43:02 crc kubenswrapper[4755]: I0317 01:43:02.088924 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5027c68-db54-4811-8d3a-a12d44ae919d-catalog-content\") pod \"certified-operators-nvnqj\" (UID: \"d5027c68-db54-4811-8d3a-a12d44ae919d\") " pod="openshift-marketplace/certified-operators-nvnqj" Mar 17 01:43:02 crc kubenswrapper[4755]: I0317 01:43:02.191525 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5027c68-db54-4811-8d3a-a12d44ae919d-catalog-content\") pod \"certified-operators-nvnqj\" (UID: \"d5027c68-db54-4811-8d3a-a12d44ae919d\") " pod="openshift-marketplace/certified-operators-nvnqj" Mar 17 01:43:02 crc kubenswrapper[4755]: I0317 01:43:02.191775 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5027c68-db54-4811-8d3a-a12d44ae919d-utilities\") pod \"certified-operators-nvnqj\" (UID: \"d5027c68-db54-4811-8d3a-a12d44ae919d\") " pod="openshift-marketplace/certified-operators-nvnqj" Mar 17 01:43:02 crc kubenswrapper[4755]: I0317 01:43:02.191828 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6j9s\" (UniqueName: \"kubernetes.io/projected/d5027c68-db54-4811-8d3a-a12d44ae919d-kube-api-access-s6j9s\") pod \"certified-operators-nvnqj\" (UID: \"d5027c68-db54-4811-8d3a-a12d44ae919d\") " pod="openshift-marketplace/certified-operators-nvnqj" Mar 17 01:43:02 crc kubenswrapper[4755]: I0317 01:43:02.192139 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5027c68-db54-4811-8d3a-a12d44ae919d-catalog-content\") pod \"certified-operators-nvnqj\" (UID: \"d5027c68-db54-4811-8d3a-a12d44ae919d\") " pod="openshift-marketplace/certified-operators-nvnqj" Mar 17 01:43:02 crc kubenswrapper[4755]: I0317 01:43:02.192139 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5027c68-db54-4811-8d3a-a12d44ae919d-utilities\") pod \"certified-operators-nvnqj\" (UID: \"d5027c68-db54-4811-8d3a-a12d44ae919d\") " pod="openshift-marketplace/certified-operators-nvnqj" Mar 17 01:43:02 crc kubenswrapper[4755]: I0317 01:43:02.214776 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6j9s\" (UniqueName: \"kubernetes.io/projected/d5027c68-db54-4811-8d3a-a12d44ae919d-kube-api-access-s6j9s\") pod \"certified-operators-nvnqj\" (UID: \"d5027c68-db54-4811-8d3a-a12d44ae919d\") " pod="openshift-marketplace/certified-operators-nvnqj" Mar 17 01:43:02 crc kubenswrapper[4755]: I0317 01:43:02.282416 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nvnqj" Mar 17 01:43:02 crc kubenswrapper[4755]: I0317 01:43:02.812205 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nvnqj"] Mar 17 01:43:03 crc kubenswrapper[4755]: E0317 01:43:03.373768 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5027c68_db54_4811_8d3a_a12d44ae919d.slice/crio-conmon-54f3a6f2bd81fdfe9765c614caa4b330c840f2ac5b09a217feaa5147d9116de2.scope\": RecentStats: unable to find data in memory cache]" Mar 17 01:43:03 crc kubenswrapper[4755]: I0317 01:43:03.431601 4755 generic.go:334] "Generic (PLEG): container finished" podID="d5027c68-db54-4811-8d3a-a12d44ae919d" containerID="54f3a6f2bd81fdfe9765c614caa4b330c840f2ac5b09a217feaa5147d9116de2" exitCode=0 Mar 17 01:43:03 crc kubenswrapper[4755]: I0317 01:43:03.431642 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvnqj" event={"ID":"d5027c68-db54-4811-8d3a-a12d44ae919d","Type":"ContainerDied","Data":"54f3a6f2bd81fdfe9765c614caa4b330c840f2ac5b09a217feaa5147d9116de2"} Mar 17 01:43:03 crc kubenswrapper[4755]: I0317 01:43:03.431665 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvnqj" event={"ID":"d5027c68-db54-4811-8d3a-a12d44ae919d","Type":"ContainerStarted","Data":"f2842d991a6caa16c1da5e1f8636734fac384be66a2c47b3c2bc2c7be56c5152"} Mar 17 01:43:05 crc kubenswrapper[4755]: I0317 01:43:05.010595 4755 scope.go:117] "RemoveContainer" containerID="93feb718bc2e208adff5beef43647bd2ea7467fbf7df708c8a41691fda5fa6ff" Mar 17 01:43:05 crc kubenswrapper[4755]: I0317 01:43:05.053712 4755 scope.go:117] "RemoveContainer" containerID="b74ccbcfe4adc39d55afad1bcb90a8c76c16fefdf742c912be5a554ae56ccfc3" Mar 17 01:43:05 crc kubenswrapper[4755]: I0317 01:43:05.110529 4755 
scope.go:117] "RemoveContainer" containerID="1d6814c4d8fa6b73686c0468199839bd46390b5bd665417f155c9489484a4c96" Mar 17 01:43:05 crc kubenswrapper[4755]: I0317 01:43:05.460672 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvnqj" event={"ID":"d5027c68-db54-4811-8d3a-a12d44ae919d","Type":"ContainerStarted","Data":"57b1d88f69f29449ddbccc2f933e242afe4fe0718a53961a154884e57dcf50b7"} Mar 17 01:43:07 crc kubenswrapper[4755]: I0317 01:43:07.132404 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x22ff"] Mar 17 01:43:07 crc kubenswrapper[4755]: I0317 01:43:07.135024 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x22ff" Mar 17 01:43:07 crc kubenswrapper[4755]: I0317 01:43:07.142069 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x22ff"] Mar 17 01:43:07 crc kubenswrapper[4755]: I0317 01:43:07.319076 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khdjs\" (UniqueName: \"kubernetes.io/projected/762d7367-3674-4afb-96f9-abe4ce8aadd9-kube-api-access-khdjs\") pod \"redhat-marketplace-x22ff\" (UID: \"762d7367-3674-4afb-96f9-abe4ce8aadd9\") " pod="openshift-marketplace/redhat-marketplace-x22ff" Mar 17 01:43:07 crc kubenswrapper[4755]: I0317 01:43:07.319157 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/762d7367-3674-4afb-96f9-abe4ce8aadd9-utilities\") pod \"redhat-marketplace-x22ff\" (UID: \"762d7367-3674-4afb-96f9-abe4ce8aadd9\") " pod="openshift-marketplace/redhat-marketplace-x22ff" Mar 17 01:43:07 crc kubenswrapper[4755]: I0317 01:43:07.319520 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/762d7367-3674-4afb-96f9-abe4ce8aadd9-catalog-content\") pod \"redhat-marketplace-x22ff\" (UID: \"762d7367-3674-4afb-96f9-abe4ce8aadd9\") " pod="openshift-marketplace/redhat-marketplace-x22ff" Mar 17 01:43:07 crc kubenswrapper[4755]: I0317 01:43:07.421191 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khdjs\" (UniqueName: \"kubernetes.io/projected/762d7367-3674-4afb-96f9-abe4ce8aadd9-kube-api-access-khdjs\") pod \"redhat-marketplace-x22ff\" (UID: \"762d7367-3674-4afb-96f9-abe4ce8aadd9\") " pod="openshift-marketplace/redhat-marketplace-x22ff" Mar 17 01:43:07 crc kubenswrapper[4755]: I0317 01:43:07.421411 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/762d7367-3674-4afb-96f9-abe4ce8aadd9-utilities\") pod \"redhat-marketplace-x22ff\" (UID: \"762d7367-3674-4afb-96f9-abe4ce8aadd9\") " pod="openshift-marketplace/redhat-marketplace-x22ff" Mar 17 01:43:07 crc kubenswrapper[4755]: I0317 01:43:07.421495 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/762d7367-3674-4afb-96f9-abe4ce8aadd9-catalog-content\") pod \"redhat-marketplace-x22ff\" (UID: \"762d7367-3674-4afb-96f9-abe4ce8aadd9\") " pod="openshift-marketplace/redhat-marketplace-x22ff" Mar 17 01:43:07 crc kubenswrapper[4755]: I0317 01:43:07.422057 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/762d7367-3674-4afb-96f9-abe4ce8aadd9-utilities\") pod \"redhat-marketplace-x22ff\" (UID: \"762d7367-3674-4afb-96f9-abe4ce8aadd9\") " pod="openshift-marketplace/redhat-marketplace-x22ff" Mar 17 01:43:07 crc kubenswrapper[4755]: I0317 01:43:07.422282 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/762d7367-3674-4afb-96f9-abe4ce8aadd9-catalog-content\") pod \"redhat-marketplace-x22ff\" (UID: \"762d7367-3674-4afb-96f9-abe4ce8aadd9\") " pod="openshift-marketplace/redhat-marketplace-x22ff" Mar 17 01:43:07 crc kubenswrapper[4755]: I0317 01:43:07.443903 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khdjs\" (UniqueName: \"kubernetes.io/projected/762d7367-3674-4afb-96f9-abe4ce8aadd9-kube-api-access-khdjs\") pod \"redhat-marketplace-x22ff\" (UID: \"762d7367-3674-4afb-96f9-abe4ce8aadd9\") " pod="openshift-marketplace/redhat-marketplace-x22ff" Mar 17 01:43:07 crc kubenswrapper[4755]: I0317 01:43:07.475240 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x22ff" Mar 17 01:43:07 crc kubenswrapper[4755]: I0317 01:43:07.480131 4755 generic.go:334] "Generic (PLEG): container finished" podID="d5027c68-db54-4811-8d3a-a12d44ae919d" containerID="57b1d88f69f29449ddbccc2f933e242afe4fe0718a53961a154884e57dcf50b7" exitCode=0 Mar 17 01:43:07 crc kubenswrapper[4755]: I0317 01:43:07.480250 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvnqj" event={"ID":"d5027c68-db54-4811-8d3a-a12d44ae919d","Type":"ContainerDied","Data":"57b1d88f69f29449ddbccc2f933e242afe4fe0718a53961a154884e57dcf50b7"} Mar 17 01:43:08 crc kubenswrapper[4755]: I0317 01:43:08.143511 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x22ff"] Mar 17 01:43:08 crc kubenswrapper[4755]: W0317 01:43:08.166633 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod762d7367_3674_4afb_96f9_abe4ce8aadd9.slice/crio-283bf20c6f47378070cbb8bd1ff5d1bb2cda7fc3d535d0faac56f2d15ffef25b WatchSource:0}: Error finding container 283bf20c6f47378070cbb8bd1ff5d1bb2cda7fc3d535d0faac56f2d15ffef25b: Status 404 returned error 
can't find the container with id 283bf20c6f47378070cbb8bd1ff5d1bb2cda7fc3d535d0faac56f2d15ffef25b Mar 17 01:43:08 crc kubenswrapper[4755]: I0317 01:43:08.497761 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvnqj" event={"ID":"d5027c68-db54-4811-8d3a-a12d44ae919d","Type":"ContainerStarted","Data":"3fa2d6ca1de51d9b4a86114ea6e2f9dcea81febe86ac94a19256f9fa45e3cc83"} Mar 17 01:43:08 crc kubenswrapper[4755]: I0317 01:43:08.501077 4755 generic.go:334] "Generic (PLEG): container finished" podID="762d7367-3674-4afb-96f9-abe4ce8aadd9" containerID="0f9ff33e69e2c23437d7ec670a41ea784011a4349653cae533a82ee251c48b7b" exitCode=0 Mar 17 01:43:08 crc kubenswrapper[4755]: I0317 01:43:08.501113 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x22ff" event={"ID":"762d7367-3674-4afb-96f9-abe4ce8aadd9","Type":"ContainerDied","Data":"0f9ff33e69e2c23437d7ec670a41ea784011a4349653cae533a82ee251c48b7b"} Mar 17 01:43:08 crc kubenswrapper[4755]: I0317 01:43:08.501133 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x22ff" event={"ID":"762d7367-3674-4afb-96f9-abe4ce8aadd9","Type":"ContainerStarted","Data":"283bf20c6f47378070cbb8bd1ff5d1bb2cda7fc3d535d0faac56f2d15ffef25b"} Mar 17 01:43:08 crc kubenswrapper[4755]: I0317 01:43:08.518739 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nvnqj" podStartSLOduration=2.984136371 podStartE2EDuration="7.518720133s" podCreationTimestamp="2026-03-17 01:43:01 +0000 UTC" firstStartedPulling="2026-03-17 01:43:03.435096549 +0000 UTC m=+4858.194548832" lastFinishedPulling="2026-03-17 01:43:07.969680311 +0000 UTC m=+4862.729132594" observedRunningTime="2026-03-17 01:43:08.513208723 +0000 UTC m=+4863.272661016" watchObservedRunningTime="2026-03-17 01:43:08.518720133 +0000 UTC m=+4863.278172416" Mar 17 01:43:09 crc kubenswrapper[4755]: 
I0317 01:43:09.513559 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x22ff" event={"ID":"762d7367-3674-4afb-96f9-abe4ce8aadd9","Type":"ContainerStarted","Data":"2259113a2fca41f7b9a12210688378fba0aa06551efb486012eafbcfaeb21e4f"}
Mar 17 01:43:11 crc kubenswrapper[4755]: I0317 01:43:11.544981 4755 generic.go:334] "Generic (PLEG): container finished" podID="762d7367-3674-4afb-96f9-abe4ce8aadd9" containerID="2259113a2fca41f7b9a12210688378fba0aa06551efb486012eafbcfaeb21e4f" exitCode=0
Mar 17 01:43:11 crc kubenswrapper[4755]: I0317 01:43:11.545055 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x22ff" event={"ID":"762d7367-3674-4afb-96f9-abe4ce8aadd9","Type":"ContainerDied","Data":"2259113a2fca41f7b9a12210688378fba0aa06551efb486012eafbcfaeb21e4f"}
Mar 17 01:43:12 crc kubenswrapper[4755]: I0317 01:43:12.282899 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nvnqj"
Mar 17 01:43:12 crc kubenswrapper[4755]: I0317 01:43:12.282980 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nvnqj"
Mar 17 01:43:13 crc kubenswrapper[4755]: I0317 01:43:13.442149 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-nvnqj" podUID="d5027c68-db54-4811-8d3a-a12d44ae919d" containerName="registry-server" probeResult="failure" output=<
Mar 17 01:43:13 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s
Mar 17 01:43:13 crc kubenswrapper[4755]: >
Mar 17 01:43:13 crc kubenswrapper[4755]: I0317 01:43:13.566243 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x22ff" event={"ID":"762d7367-3674-4afb-96f9-abe4ce8aadd9","Type":"ContainerStarted","Data":"700f8f95f3aef1b953335929cd0cf7458242d791fe82be33ff7847d30ea0bc6c"}
Mar 17 01:43:13 crc kubenswrapper[4755]: I0317 01:43:13.597754 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x22ff" podStartSLOduration=3.140315945 podStartE2EDuration="6.597734251s" podCreationTimestamp="2026-03-17 01:43:07 +0000 UTC" firstStartedPulling="2026-03-17 01:43:08.502585815 +0000 UTC m=+4863.262038108" lastFinishedPulling="2026-03-17 01:43:11.960004121 +0000 UTC m=+4866.719456414" observedRunningTime="2026-03-17 01:43:13.592433239 +0000 UTC m=+4868.351885532" watchObservedRunningTime="2026-03-17 01:43:13.597734251 +0000 UTC m=+4868.357186544"
Mar 17 01:43:17 crc kubenswrapper[4755]: I0317 01:43:17.476596 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x22ff"
Mar 17 01:43:17 crc kubenswrapper[4755]: I0317 01:43:17.477120 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x22ff"
Mar 17 01:43:17 crc kubenswrapper[4755]: I0317 01:43:17.550536 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x22ff"
Mar 17 01:43:17 crc kubenswrapper[4755]: I0317 01:43:17.691361 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x22ff"
Mar 17 01:43:18 crc kubenswrapper[4755]: I0317 01:43:18.931833 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x22ff"]
Mar 17 01:43:19 crc kubenswrapper[4755]: I0317 01:43:19.646156 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x22ff" podUID="762d7367-3674-4afb-96f9-abe4ce8aadd9" containerName="registry-server" containerID="cri-o://700f8f95f3aef1b953335929cd0cf7458242d791fe82be33ff7847d30ea0bc6c" gracePeriod=2
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.277999 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x22ff"
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.317191 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/762d7367-3674-4afb-96f9-abe4ce8aadd9-catalog-content\") pod \"762d7367-3674-4afb-96f9-abe4ce8aadd9\" (UID: \"762d7367-3674-4afb-96f9-abe4ce8aadd9\") "
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.317277 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khdjs\" (UniqueName: \"kubernetes.io/projected/762d7367-3674-4afb-96f9-abe4ce8aadd9-kube-api-access-khdjs\") pod \"762d7367-3674-4afb-96f9-abe4ce8aadd9\" (UID: \"762d7367-3674-4afb-96f9-abe4ce8aadd9\") "
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.317314 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/762d7367-3674-4afb-96f9-abe4ce8aadd9-utilities\") pod \"762d7367-3674-4afb-96f9-abe4ce8aadd9\" (UID: \"762d7367-3674-4afb-96f9-abe4ce8aadd9\") "
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.327278 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/762d7367-3674-4afb-96f9-abe4ce8aadd9-utilities" (OuterVolumeSpecName: "utilities") pod "762d7367-3674-4afb-96f9-abe4ce8aadd9" (UID: "762d7367-3674-4afb-96f9-abe4ce8aadd9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.329983 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/762d7367-3674-4afb-96f9-abe4ce8aadd9-kube-api-access-khdjs" (OuterVolumeSpecName: "kube-api-access-khdjs") pod "762d7367-3674-4afb-96f9-abe4ce8aadd9" (UID: "762d7367-3674-4afb-96f9-abe4ce8aadd9"). InnerVolumeSpecName "kube-api-access-khdjs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.362283 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/762d7367-3674-4afb-96f9-abe4ce8aadd9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "762d7367-3674-4afb-96f9-abe4ce8aadd9" (UID: "762d7367-3674-4afb-96f9-abe4ce8aadd9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.420712 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/762d7367-3674-4afb-96f9-abe4ce8aadd9-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.420749 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khdjs\" (UniqueName: \"kubernetes.io/projected/762d7367-3674-4afb-96f9-abe4ce8aadd9-kube-api-access-khdjs\") on node \"crc\" DevicePath \"\""
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.420763 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/762d7367-3674-4afb-96f9-abe4ce8aadd9-utilities\") on node \"crc\" DevicePath \"\""
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.664843 4755 generic.go:334] "Generic (PLEG): container finished" podID="762d7367-3674-4afb-96f9-abe4ce8aadd9" containerID="700f8f95f3aef1b953335929cd0cf7458242d791fe82be33ff7847d30ea0bc6c" exitCode=0
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.664902 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x22ff" event={"ID":"762d7367-3674-4afb-96f9-abe4ce8aadd9","Type":"ContainerDied","Data":"700f8f95f3aef1b953335929cd0cf7458242d791fe82be33ff7847d30ea0bc6c"}
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.664943 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x22ff" event={"ID":"762d7367-3674-4afb-96f9-abe4ce8aadd9","Type":"ContainerDied","Data":"283bf20c6f47378070cbb8bd1ff5d1bb2cda7fc3d535d0faac56f2d15ffef25b"}
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.664974 4755 scope.go:117] "RemoveContainer" containerID="700f8f95f3aef1b953335929cd0cf7458242d791fe82be33ff7847d30ea0bc6c"
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.665176 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x22ff"
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.707717 4755 scope.go:117] "RemoveContainer" containerID="2259113a2fca41f7b9a12210688378fba0aa06551efb486012eafbcfaeb21e4f"
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.712868 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x22ff"]
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.728894 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x22ff"]
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.897746 4755 scope.go:117] "RemoveContainer" containerID="0f9ff33e69e2c23437d7ec670a41ea784011a4349653cae533a82ee251c48b7b"
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.932953 4755 scope.go:117] "RemoveContainer" containerID="700f8f95f3aef1b953335929cd0cf7458242d791fe82be33ff7847d30ea0bc6c"
Mar 17 01:43:20 crc kubenswrapper[4755]: E0317 01:43:20.934535 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"700f8f95f3aef1b953335929cd0cf7458242d791fe82be33ff7847d30ea0bc6c\": container with ID starting with 700f8f95f3aef1b953335929cd0cf7458242d791fe82be33ff7847d30ea0bc6c not found: ID does not exist" containerID="700f8f95f3aef1b953335929cd0cf7458242d791fe82be33ff7847d30ea0bc6c"
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.934603 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"700f8f95f3aef1b953335929cd0cf7458242d791fe82be33ff7847d30ea0bc6c"} err="failed to get container status \"700f8f95f3aef1b953335929cd0cf7458242d791fe82be33ff7847d30ea0bc6c\": rpc error: code = NotFound desc = could not find container \"700f8f95f3aef1b953335929cd0cf7458242d791fe82be33ff7847d30ea0bc6c\": container with ID starting with 700f8f95f3aef1b953335929cd0cf7458242d791fe82be33ff7847d30ea0bc6c not found: ID does not exist"
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.934645 4755 scope.go:117] "RemoveContainer" containerID="2259113a2fca41f7b9a12210688378fba0aa06551efb486012eafbcfaeb21e4f"
Mar 17 01:43:20 crc kubenswrapper[4755]: E0317 01:43:20.935856 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2259113a2fca41f7b9a12210688378fba0aa06551efb486012eafbcfaeb21e4f\": container with ID starting with 2259113a2fca41f7b9a12210688378fba0aa06551efb486012eafbcfaeb21e4f not found: ID does not exist" containerID="2259113a2fca41f7b9a12210688378fba0aa06551efb486012eafbcfaeb21e4f"
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.935889 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2259113a2fca41f7b9a12210688378fba0aa06551efb486012eafbcfaeb21e4f"} err="failed to get container status \"2259113a2fca41f7b9a12210688378fba0aa06551efb486012eafbcfaeb21e4f\": rpc error: code = NotFound desc = could not find container \"2259113a2fca41f7b9a12210688378fba0aa06551efb486012eafbcfaeb21e4f\": container with ID starting with 2259113a2fca41f7b9a12210688378fba0aa06551efb486012eafbcfaeb21e4f not found: ID does not exist"
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.935909 4755 scope.go:117] "RemoveContainer" containerID="0f9ff33e69e2c23437d7ec670a41ea784011a4349653cae533a82ee251c48b7b"
Mar 17 01:43:20 crc kubenswrapper[4755]: E0317 01:43:20.936362 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f9ff33e69e2c23437d7ec670a41ea784011a4349653cae533a82ee251c48b7b\": container with ID starting with 0f9ff33e69e2c23437d7ec670a41ea784011a4349653cae533a82ee251c48b7b not found: ID does not exist" containerID="0f9ff33e69e2c23437d7ec670a41ea784011a4349653cae533a82ee251c48b7b"
Mar 17 01:43:20 crc kubenswrapper[4755]: I0317 01:43:20.936551 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f9ff33e69e2c23437d7ec670a41ea784011a4349653cae533a82ee251c48b7b"} err="failed to get container status \"0f9ff33e69e2c23437d7ec670a41ea784011a4349653cae533a82ee251c48b7b\": rpc error: code = NotFound desc = could not find container \"0f9ff33e69e2c23437d7ec670a41ea784011a4349653cae533a82ee251c48b7b\": container with ID starting with 0f9ff33e69e2c23437d7ec670a41ea784011a4349653cae533a82ee251c48b7b not found: ID does not exist"
Mar 17 01:43:22 crc kubenswrapper[4755]: I0317 01:43:22.264959 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="762d7367-3674-4afb-96f9-abe4ce8aadd9" path="/var/lib/kubelet/pods/762d7367-3674-4afb-96f9-abe4ce8aadd9/volumes"
Mar 17 01:43:22 crc kubenswrapper[4755]: I0317 01:43:22.345269 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nvnqj"
Mar 17 01:43:22 crc kubenswrapper[4755]: I0317 01:43:22.414778 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nvnqj"
Mar 17 01:43:23 crc kubenswrapper[4755]: I0317 01:43:23.319129 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nvnqj"]
Mar 17 01:43:23 crc kubenswrapper[4755]: I0317 01:43:23.705471 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nvnqj" podUID="d5027c68-db54-4811-8d3a-a12d44ae919d" containerName="registry-server" containerID="cri-o://3fa2d6ca1de51d9b4a86114ea6e2f9dcea81febe86ac94a19256f9fa45e3cc83" gracePeriod=2
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.249141 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nvnqj"
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.332321 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6j9s\" (UniqueName: \"kubernetes.io/projected/d5027c68-db54-4811-8d3a-a12d44ae919d-kube-api-access-s6j9s\") pod \"d5027c68-db54-4811-8d3a-a12d44ae919d\" (UID: \"d5027c68-db54-4811-8d3a-a12d44ae919d\") "
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.332519 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5027c68-db54-4811-8d3a-a12d44ae919d-utilities\") pod \"d5027c68-db54-4811-8d3a-a12d44ae919d\" (UID: \"d5027c68-db54-4811-8d3a-a12d44ae919d\") "
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.334798 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5027c68-db54-4811-8d3a-a12d44ae919d-utilities" (OuterVolumeSpecName: "utilities") pod "d5027c68-db54-4811-8d3a-a12d44ae919d" (UID: "d5027c68-db54-4811-8d3a-a12d44ae919d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.352936 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5027c68-db54-4811-8d3a-a12d44ae919d-kube-api-access-s6j9s" (OuterVolumeSpecName: "kube-api-access-s6j9s") pod "d5027c68-db54-4811-8d3a-a12d44ae919d" (UID: "d5027c68-db54-4811-8d3a-a12d44ae919d"). InnerVolumeSpecName "kube-api-access-s6j9s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.434294 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5027c68-db54-4811-8d3a-a12d44ae919d-catalog-content\") pod \"d5027c68-db54-4811-8d3a-a12d44ae919d\" (UID: \"d5027c68-db54-4811-8d3a-a12d44ae919d\") "
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.435410 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6j9s\" (UniqueName: \"kubernetes.io/projected/d5027c68-db54-4811-8d3a-a12d44ae919d-kube-api-access-s6j9s\") on node \"crc\" DevicePath \"\""
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.435450 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5027c68-db54-4811-8d3a-a12d44ae919d-utilities\") on node \"crc\" DevicePath \"\""
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.509052 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5027c68-db54-4811-8d3a-a12d44ae919d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5027c68-db54-4811-8d3a-a12d44ae919d" (UID: "d5027c68-db54-4811-8d3a-a12d44ae919d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.537997 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5027c68-db54-4811-8d3a-a12d44ae919d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.718716 4755 generic.go:334] "Generic (PLEG): container finished" podID="d5027c68-db54-4811-8d3a-a12d44ae919d" containerID="3fa2d6ca1de51d9b4a86114ea6e2f9dcea81febe86ac94a19256f9fa45e3cc83" exitCode=0
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.718759 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvnqj" event={"ID":"d5027c68-db54-4811-8d3a-a12d44ae919d","Type":"ContainerDied","Data":"3fa2d6ca1de51d9b4a86114ea6e2f9dcea81febe86ac94a19256f9fa45e3cc83"}
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.718790 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvnqj" event={"ID":"d5027c68-db54-4811-8d3a-a12d44ae919d","Type":"ContainerDied","Data":"f2842d991a6caa16c1da5e1f8636734fac384be66a2c47b3c2bc2c7be56c5152"}
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.718807 4755 scope.go:117] "RemoveContainer" containerID="3fa2d6ca1de51d9b4a86114ea6e2f9dcea81febe86ac94a19256f9fa45e3cc83"
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.718828 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nvnqj"
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.748962 4755 scope.go:117] "RemoveContainer" containerID="57b1d88f69f29449ddbccc2f933e242afe4fe0718a53961a154884e57dcf50b7"
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.761498 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nvnqj"]
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.774072 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nvnqj"]
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.779981 4755 scope.go:117] "RemoveContainer" containerID="54f3a6f2bd81fdfe9765c614caa4b330c840f2ac5b09a217feaa5147d9116de2"
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.830833 4755 scope.go:117] "RemoveContainer" containerID="3fa2d6ca1de51d9b4a86114ea6e2f9dcea81febe86ac94a19256f9fa45e3cc83"
Mar 17 01:43:24 crc kubenswrapper[4755]: E0317 01:43:24.831338 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fa2d6ca1de51d9b4a86114ea6e2f9dcea81febe86ac94a19256f9fa45e3cc83\": container with ID starting with 3fa2d6ca1de51d9b4a86114ea6e2f9dcea81febe86ac94a19256f9fa45e3cc83 not found: ID does not exist" containerID="3fa2d6ca1de51d9b4a86114ea6e2f9dcea81febe86ac94a19256f9fa45e3cc83"
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.831383 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fa2d6ca1de51d9b4a86114ea6e2f9dcea81febe86ac94a19256f9fa45e3cc83"} err="failed to get container status \"3fa2d6ca1de51d9b4a86114ea6e2f9dcea81febe86ac94a19256f9fa45e3cc83\": rpc error: code = NotFound desc = could not find container \"3fa2d6ca1de51d9b4a86114ea6e2f9dcea81febe86ac94a19256f9fa45e3cc83\": container with ID starting with 3fa2d6ca1de51d9b4a86114ea6e2f9dcea81febe86ac94a19256f9fa45e3cc83 not found: ID does not exist"
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.831409 4755 scope.go:117] "RemoveContainer" containerID="57b1d88f69f29449ddbccc2f933e242afe4fe0718a53961a154884e57dcf50b7"
Mar 17 01:43:24 crc kubenswrapper[4755]: E0317 01:43:24.831774 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b1d88f69f29449ddbccc2f933e242afe4fe0718a53961a154884e57dcf50b7\": container with ID starting with 57b1d88f69f29449ddbccc2f933e242afe4fe0718a53961a154884e57dcf50b7 not found: ID does not exist" containerID="57b1d88f69f29449ddbccc2f933e242afe4fe0718a53961a154884e57dcf50b7"
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.831812 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b1d88f69f29449ddbccc2f933e242afe4fe0718a53961a154884e57dcf50b7"} err="failed to get container status \"57b1d88f69f29449ddbccc2f933e242afe4fe0718a53961a154884e57dcf50b7\": rpc error: code = NotFound desc = could not find container \"57b1d88f69f29449ddbccc2f933e242afe4fe0718a53961a154884e57dcf50b7\": container with ID starting with 57b1d88f69f29449ddbccc2f933e242afe4fe0718a53961a154884e57dcf50b7 not found: ID does not exist"
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.831844 4755 scope.go:117] "RemoveContainer" containerID="54f3a6f2bd81fdfe9765c614caa4b330c840f2ac5b09a217feaa5147d9116de2"
Mar 17 01:43:24 crc kubenswrapper[4755]: E0317 01:43:24.832075 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54f3a6f2bd81fdfe9765c614caa4b330c840f2ac5b09a217feaa5147d9116de2\": container with ID starting with 54f3a6f2bd81fdfe9765c614caa4b330c840f2ac5b09a217feaa5147d9116de2 not found: ID does not exist" containerID="54f3a6f2bd81fdfe9765c614caa4b330c840f2ac5b09a217feaa5147d9116de2"
Mar 17 01:43:24 crc kubenswrapper[4755]: I0317 01:43:24.832096 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54f3a6f2bd81fdfe9765c614caa4b330c840f2ac5b09a217feaa5147d9116de2"} err="failed to get container status \"54f3a6f2bd81fdfe9765c614caa4b330c840f2ac5b09a217feaa5147d9116de2\": rpc error: code = NotFound desc = could not find container \"54f3a6f2bd81fdfe9765c614caa4b330c840f2ac5b09a217feaa5147d9116de2\": container with ID starting with 54f3a6f2bd81fdfe9765c614caa4b330c840f2ac5b09a217feaa5147d9116de2 not found: ID does not exist"
Mar 17 01:43:26 crc kubenswrapper[4755]: I0317 01:43:26.270652 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5027c68-db54-4811-8d3a-a12d44ae919d" path="/var/lib/kubelet/pods/d5027c68-db54-4811-8d3a-a12d44ae919d/volumes"
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.172476 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561864-z4jdz"]
Mar 17 01:44:00 crc kubenswrapper[4755]: E0317 01:44:00.173457 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5027c68-db54-4811-8d3a-a12d44ae919d" containerName="extract-content"
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.173473 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5027c68-db54-4811-8d3a-a12d44ae919d" containerName="extract-content"
Mar 17 01:44:00 crc kubenswrapper[4755]: E0317 01:44:00.173496 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762d7367-3674-4afb-96f9-abe4ce8aadd9" containerName="registry-server"
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.173504 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="762d7367-3674-4afb-96f9-abe4ce8aadd9" containerName="registry-server"
Mar 17 01:44:00 crc kubenswrapper[4755]: E0317 01:44:00.173526 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762d7367-3674-4afb-96f9-abe4ce8aadd9" containerName="extract-utilities"
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.173535 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="762d7367-3674-4afb-96f9-abe4ce8aadd9" containerName="extract-utilities"
Mar 17 01:44:00 crc kubenswrapper[4755]: E0317 01:44:00.173559 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5027c68-db54-4811-8d3a-a12d44ae919d" containerName="extract-utilities"
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.173567 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5027c68-db54-4811-8d3a-a12d44ae919d" containerName="extract-utilities"
Mar 17 01:44:00 crc kubenswrapper[4755]: E0317 01:44:00.173576 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5027c68-db54-4811-8d3a-a12d44ae919d" containerName="registry-server"
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.173584 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5027c68-db54-4811-8d3a-a12d44ae919d" containerName="registry-server"
Mar 17 01:44:00 crc kubenswrapper[4755]: E0317 01:44:00.173599 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762d7367-3674-4afb-96f9-abe4ce8aadd9" containerName="extract-content"
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.173607 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="762d7367-3674-4afb-96f9-abe4ce8aadd9" containerName="extract-content"
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.173892 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5027c68-db54-4811-8d3a-a12d44ae919d" containerName="registry-server"
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.173932 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="762d7367-3674-4afb-96f9-abe4ce8aadd9" containerName="registry-server"
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.174890 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561864-z4jdz"
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.178855 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.183856 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb"
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.183974 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.188314 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561864-z4jdz"]
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.224614 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-486rq\" (UniqueName: \"kubernetes.io/projected/02bbdb7c-d390-4c1a-bd25-169285b5ae31-kube-api-access-486rq\") pod \"auto-csr-approver-29561864-z4jdz\" (UID: \"02bbdb7c-d390-4c1a-bd25-169285b5ae31\") " pod="openshift-infra/auto-csr-approver-29561864-z4jdz"
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.327600 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-486rq\" (UniqueName: \"kubernetes.io/projected/02bbdb7c-d390-4c1a-bd25-169285b5ae31-kube-api-access-486rq\") pod \"auto-csr-approver-29561864-z4jdz\" (UID: \"02bbdb7c-d390-4c1a-bd25-169285b5ae31\") " pod="openshift-infra/auto-csr-approver-29561864-z4jdz"
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.360365 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-486rq\" (UniqueName: \"kubernetes.io/projected/02bbdb7c-d390-4c1a-bd25-169285b5ae31-kube-api-access-486rq\") pod \"auto-csr-approver-29561864-z4jdz\" (UID: \"02bbdb7c-d390-4c1a-bd25-169285b5ae31\") " pod="openshift-infra/auto-csr-approver-29561864-z4jdz"
Mar 17 01:44:00 crc kubenswrapper[4755]: I0317 01:44:00.540612 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561864-z4jdz"
Mar 17 01:44:01 crc kubenswrapper[4755]: I0317 01:44:01.073265 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561864-z4jdz"]
Mar 17 01:44:01 crc kubenswrapper[4755]: W0317 01:44:01.076077 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02bbdb7c_d390_4c1a_bd25_169285b5ae31.slice/crio-1f2ac1b7de37f2c0ce61d81113a5d16fcf285b0bde8ea3753edd605efd41a293 WatchSource:0}: Error finding container 1f2ac1b7de37f2c0ce61d81113a5d16fcf285b0bde8ea3753edd605efd41a293: Status 404 returned error can't find the container with id 1f2ac1b7de37f2c0ce61d81113a5d16fcf285b0bde8ea3753edd605efd41a293
Mar 17 01:44:01 crc kubenswrapper[4755]: I0317 01:44:01.187275 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561864-z4jdz" event={"ID":"02bbdb7c-d390-4c1a-bd25-169285b5ae31","Type":"ContainerStarted","Data":"1f2ac1b7de37f2c0ce61d81113a5d16fcf285b0bde8ea3753edd605efd41a293"}
Mar 17 01:44:03 crc kubenswrapper[4755]: I0317 01:44:03.225615 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561864-z4jdz" event={"ID":"02bbdb7c-d390-4c1a-bd25-169285b5ae31","Type":"ContainerStarted","Data":"3d942e7889522bf5e7b5a7a4ada3c4fc7d1406bbc29737bb9a096b6955d31c2f"}
Mar 17 01:44:03 crc kubenswrapper[4755]: I0317 01:44:03.263122 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561864-z4jdz" podStartSLOduration=2.344730843 podStartE2EDuration="3.263075875s" podCreationTimestamp="2026-03-17 01:44:00 +0000 UTC" firstStartedPulling="2026-03-17 01:44:01.083432684 +0000 UTC m=+4915.842884997" lastFinishedPulling="2026-03-17 01:44:02.001777706 +0000 UTC m=+4916.761230029" observedRunningTime="2026-03-17 01:44:03.240700638 +0000 UTC m=+4918.000152961" watchObservedRunningTime="2026-03-17 01:44:03.263075875 +0000 UTC m=+4918.022528198"
Mar 17 01:44:04 crc kubenswrapper[4755]: I0317 01:44:04.246672 4755 generic.go:334] "Generic (PLEG): container finished" podID="02bbdb7c-d390-4c1a-bd25-169285b5ae31" containerID="3d942e7889522bf5e7b5a7a4ada3c4fc7d1406bbc29737bb9a096b6955d31c2f" exitCode=0
Mar 17 01:44:04 crc kubenswrapper[4755]: I0317 01:44:04.246805 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561864-z4jdz" event={"ID":"02bbdb7c-d390-4c1a-bd25-169285b5ae31","Type":"ContainerDied","Data":"3d942e7889522bf5e7b5a7a4ada3c4fc7d1406bbc29737bb9a096b6955d31c2f"}
Mar 17 01:44:06 crc kubenswrapper[4755]: I0317 01:44:06.291510 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561864-z4jdz" event={"ID":"02bbdb7c-d390-4c1a-bd25-169285b5ae31","Type":"ContainerDied","Data":"1f2ac1b7de37f2c0ce61d81113a5d16fcf285b0bde8ea3753edd605efd41a293"}
Mar 17 01:44:06 crc kubenswrapper[4755]: I0317 01:44:06.292494 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f2ac1b7de37f2c0ce61d81113a5d16fcf285b0bde8ea3753edd605efd41a293"
Mar 17 01:44:06 crc kubenswrapper[4755]: I0317 01:44:06.406525 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561864-z4jdz"
Mar 17 01:44:06 crc kubenswrapper[4755]: I0317 01:44:06.512812 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-486rq\" (UniqueName: \"kubernetes.io/projected/02bbdb7c-d390-4c1a-bd25-169285b5ae31-kube-api-access-486rq\") pod \"02bbdb7c-d390-4c1a-bd25-169285b5ae31\" (UID: \"02bbdb7c-d390-4c1a-bd25-169285b5ae31\") "
Mar 17 01:44:06 crc kubenswrapper[4755]: I0317 01:44:06.523921 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02bbdb7c-d390-4c1a-bd25-169285b5ae31-kube-api-access-486rq" (OuterVolumeSpecName: "kube-api-access-486rq") pod "02bbdb7c-d390-4c1a-bd25-169285b5ae31" (UID: "02bbdb7c-d390-4c1a-bd25-169285b5ae31"). InnerVolumeSpecName "kube-api-access-486rq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 01:44:06 crc kubenswrapper[4755]: I0317 01:44:06.618137 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-486rq\" (UniqueName: \"kubernetes.io/projected/02bbdb7c-d390-4c1a-bd25-169285b5ae31-kube-api-access-486rq\") on node \"crc\" DevicePath \"\""
Mar 17 01:44:07 crc kubenswrapper[4755]: I0317 01:44:07.305526 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561864-z4jdz"
Mar 17 01:44:07 crc kubenswrapper[4755]: I0317 01:44:07.515705 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561858-kpcw7"]
Mar 17 01:44:07 crc kubenswrapper[4755]: I0317 01:44:07.527700 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561858-kpcw7"]
Mar 17 01:44:08 crc kubenswrapper[4755]: I0317 01:44:08.274786 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17903a6c-1b58-4b2d-ab77-3c5ae172c26b" path="/var/lib/kubelet/pods/17903a6c-1b58-4b2d-ab77-3c5ae172c26b/volumes"
Mar 17 01:44:58 crc kubenswrapper[4755]: I0317 01:44:58.664825 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 01:44:58 crc kubenswrapper[4755]: I0317 01:44:58.665330 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 01:45:00 crc kubenswrapper[4755]: I0317 01:45:00.179875 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59"]
Mar 17 01:45:00 crc kubenswrapper[4755]: E0317 01:45:00.180995 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02bbdb7c-d390-4c1a-bd25-169285b5ae31" containerName="oc"
Mar 17 01:45:00 crc kubenswrapper[4755]: I0317 01:45:00.181018 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="02bbdb7c-d390-4c1a-bd25-169285b5ae31" containerName="oc"
Mar 17 01:45:00 crc kubenswrapper[4755]: I0317 01:45:00.181446 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="02bbdb7c-d390-4c1a-bd25-169285b5ae31" containerName="oc"
Mar 17 01:45:00 crc kubenswrapper[4755]: I0317 01:45:00.182860 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59"
Mar 17 01:45:00 crc kubenswrapper[4755]: I0317 01:45:00.187001 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 17 01:45:00 crc kubenswrapper[4755]: I0317 01:45:00.187288 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 17 01:45:00 crc kubenswrapper[4755]: I0317 01:45:00.196563 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59"]
Mar 17 01:45:00 crc kubenswrapper[4755]: I0317 01:45:00.367836 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5btd\" (UniqueName: \"kubernetes.io/projected/d6a87505-6ac8-497a-bae1-f8f2664446e7-kube-api-access-m5btd\") pod \"collect-profiles-29561865-mqc59\" (UID: \"d6a87505-6ac8-497a-bae1-f8f2664446e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59"
Mar 17 01:45:00 crc kubenswrapper[4755]: I0317 01:45:00.367937 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6a87505-6ac8-497a-bae1-f8f2664446e7-config-volume\") pod \"collect-profiles-29561865-mqc59\" (UID: \"d6a87505-6ac8-497a-bae1-f8f2664446e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59"
Mar 17 01:45:00 crc kubenswrapper[4755]: I0317 01:45:00.367993 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6a87505-6ac8-497a-bae1-f8f2664446e7-secret-volume\") pod \"collect-profiles-29561865-mqc59\" (UID: \"d6a87505-6ac8-497a-bae1-f8f2664446e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59"
Mar 17 01:45:00 crc kubenswrapper[4755]: I0317 01:45:00.470344 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5btd\" (UniqueName: \"kubernetes.io/projected/d6a87505-6ac8-497a-bae1-f8f2664446e7-kube-api-access-m5btd\") pod \"collect-profiles-29561865-mqc59\" (UID: \"d6a87505-6ac8-497a-bae1-f8f2664446e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59"
Mar 17 01:45:00 crc kubenswrapper[4755]: I0317 01:45:00.470466 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6a87505-6ac8-497a-bae1-f8f2664446e7-config-volume\") pod \"collect-profiles-29561865-mqc59\" (UID: \"d6a87505-6ac8-497a-bae1-f8f2664446e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59"
Mar 17 01:45:00 crc kubenswrapper[4755]: I0317 01:45:00.470520 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6a87505-6ac8-497a-bae1-f8f2664446e7-secret-volume\") pod \"collect-profiles-29561865-mqc59\" (UID: \"d6a87505-6ac8-497a-bae1-f8f2664446e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59"
Mar 17 01:45:00 crc kubenswrapper[4755]: I0317 01:45:00.472722 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6a87505-6ac8-497a-bae1-f8f2664446e7-config-volume\") pod \"collect-profiles-29561865-mqc59\" (UID: \"d6a87505-6ac8-497a-bae1-f8f2664446e7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59" Mar 17 01:45:00 crc kubenswrapper[4755]: I0317 01:45:00.481084 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6a87505-6ac8-497a-bae1-f8f2664446e7-secret-volume\") pod \"collect-profiles-29561865-mqc59\" (UID: \"d6a87505-6ac8-497a-bae1-f8f2664446e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59" Mar 17 01:45:00 crc kubenswrapper[4755]: I0317 01:45:00.491508 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5btd\" (UniqueName: \"kubernetes.io/projected/d6a87505-6ac8-497a-bae1-f8f2664446e7-kube-api-access-m5btd\") pod \"collect-profiles-29561865-mqc59\" (UID: \"d6a87505-6ac8-497a-bae1-f8f2664446e7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59" Mar 17 01:45:00 crc kubenswrapper[4755]: I0317 01:45:00.531137 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59" Mar 17 01:45:01 crc kubenswrapper[4755]: I0317 01:45:01.065106 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59"] Mar 17 01:45:02 crc kubenswrapper[4755]: I0317 01:45:02.013723 4755 generic.go:334] "Generic (PLEG): container finished" podID="d6a87505-6ac8-497a-bae1-f8f2664446e7" containerID="675be24cd45c9b39d84cf1f2208e6d4aa7f81eead3f2550b9d32a08f6247f736" exitCode=0 Mar 17 01:45:02 crc kubenswrapper[4755]: I0317 01:45:02.013942 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59" event={"ID":"d6a87505-6ac8-497a-bae1-f8f2664446e7","Type":"ContainerDied","Data":"675be24cd45c9b39d84cf1f2208e6d4aa7f81eead3f2550b9d32a08f6247f736"} Mar 17 01:45:02 crc kubenswrapper[4755]: I0317 01:45:02.015212 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59" event={"ID":"d6a87505-6ac8-497a-bae1-f8f2664446e7","Type":"ContainerStarted","Data":"a41345cfb2a7452eb3e2d1b817cc0c4e1a7b25d9d2721455232b88e8a9112cee"} Mar 17 01:45:03 crc kubenswrapper[4755]: I0317 01:45:03.408413 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59" Mar 17 01:45:03 crc kubenswrapper[4755]: I0317 01:45:03.537325 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6a87505-6ac8-497a-bae1-f8f2664446e7-secret-volume\") pod \"d6a87505-6ac8-497a-bae1-f8f2664446e7\" (UID: \"d6a87505-6ac8-497a-bae1-f8f2664446e7\") " Mar 17 01:45:03 crc kubenswrapper[4755]: I0317 01:45:03.537412 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6a87505-6ac8-497a-bae1-f8f2664446e7-config-volume\") pod \"d6a87505-6ac8-497a-bae1-f8f2664446e7\" (UID: \"d6a87505-6ac8-497a-bae1-f8f2664446e7\") " Mar 17 01:45:03 crc kubenswrapper[4755]: I0317 01:45:03.537536 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5btd\" (UniqueName: \"kubernetes.io/projected/d6a87505-6ac8-497a-bae1-f8f2664446e7-kube-api-access-m5btd\") pod \"d6a87505-6ac8-497a-bae1-f8f2664446e7\" (UID: \"d6a87505-6ac8-497a-bae1-f8f2664446e7\") " Mar 17 01:45:03 crc kubenswrapper[4755]: I0317 01:45:03.538082 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6a87505-6ac8-497a-bae1-f8f2664446e7-config-volume" (OuterVolumeSpecName: "config-volume") pod "d6a87505-6ac8-497a-bae1-f8f2664446e7" (UID: "d6a87505-6ac8-497a-bae1-f8f2664446e7"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 01:45:03 crc kubenswrapper[4755]: I0317 01:45:03.538342 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6a87505-6ac8-497a-bae1-f8f2664446e7-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 01:45:03 crc kubenswrapper[4755]: I0317 01:45:03.544655 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6a87505-6ac8-497a-bae1-f8f2664446e7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d6a87505-6ac8-497a-bae1-f8f2664446e7" (UID: "d6a87505-6ac8-497a-bae1-f8f2664446e7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 01:45:03 crc kubenswrapper[4755]: I0317 01:45:03.545217 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6a87505-6ac8-497a-bae1-f8f2664446e7-kube-api-access-m5btd" (OuterVolumeSpecName: "kube-api-access-m5btd") pod "d6a87505-6ac8-497a-bae1-f8f2664446e7" (UID: "d6a87505-6ac8-497a-bae1-f8f2664446e7"). InnerVolumeSpecName "kube-api-access-m5btd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:45:03 crc kubenswrapper[4755]: I0317 01:45:03.642732 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6a87505-6ac8-497a-bae1-f8f2664446e7-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 01:45:03 crc kubenswrapper[4755]: I0317 01:45:03.642884 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5btd\" (UniqueName: \"kubernetes.io/projected/d6a87505-6ac8-497a-bae1-f8f2664446e7-kube-api-access-m5btd\") on node \"crc\" DevicePath \"\"" Mar 17 01:45:04 crc kubenswrapper[4755]: I0317 01:45:04.040095 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59" event={"ID":"d6a87505-6ac8-497a-bae1-f8f2664446e7","Type":"ContainerDied","Data":"a41345cfb2a7452eb3e2d1b817cc0c4e1a7b25d9d2721455232b88e8a9112cee"} Mar 17 01:45:04 crc kubenswrapper[4755]: I0317 01:45:04.040149 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a41345cfb2a7452eb3e2d1b817cc0c4e1a7b25d9d2721455232b88e8a9112cee" Mar 17 01:45:04 crc kubenswrapper[4755]: I0317 01:45:04.040151 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59" Mar 17 01:45:04 crc kubenswrapper[4755]: I0317 01:45:04.514258 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf"] Mar 17 01:45:04 crc kubenswrapper[4755]: I0317 01:45:04.527621 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561820-4z5nf"] Mar 17 01:45:05 crc kubenswrapper[4755]: I0317 01:45:05.344664 4755 scope.go:117] "RemoveContainer" containerID="60c3933a90556e3307c8a1f1acd4d7820347b7e8e19e3aafba2ba80840fdb46b" Mar 17 01:45:06 crc kubenswrapper[4755]: I0317 01:45:06.274621 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f36aaeb7-f107-43d2-9e43-11378467f808" path="/var/lib/kubelet/pods/f36aaeb7-f107-43d2-9e43-11378467f808/volumes" Mar 17 01:45:28 crc kubenswrapper[4755]: I0317 01:45:28.665603 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:45:28 crc kubenswrapper[4755]: I0317 01:45:28.666524 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:45:58 crc kubenswrapper[4755]: I0317 01:45:58.665375 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Mar 17 01:45:58 crc kubenswrapper[4755]: I0317 01:45:58.665967 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:45:58 crc kubenswrapper[4755]: I0317 01:45:58.666016 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 01:45:58 crc kubenswrapper[4755]: I0317 01:45:58.666976 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab7b0ae9369930a3ae4164ad4094be94b9e9ad2c85e4fb29456fd88028ab29ce"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 01:45:58 crc kubenswrapper[4755]: I0317 01:45:58.667036 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://ab7b0ae9369930a3ae4164ad4094be94b9e9ad2c85e4fb29456fd88028ab29ce" gracePeriod=600 Mar 17 01:45:58 crc kubenswrapper[4755]: E0317 01:45:58.826911 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2de863ac_0be1_45c8_9e03_56aa0fe9a23d.slice/crio-ab7b0ae9369930a3ae4164ad4094be94b9e9ad2c85e4fb29456fd88028ab29ce.scope\": RecentStats: unable to find data in memory cache]" Mar 17 01:45:59 crc kubenswrapper[4755]: I0317 01:45:59.780379 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="ab7b0ae9369930a3ae4164ad4094be94b9e9ad2c85e4fb29456fd88028ab29ce" exitCode=0 Mar 17 01:45:59 crc kubenswrapper[4755]: I0317 01:45:59.780461 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"ab7b0ae9369930a3ae4164ad4094be94b9e9ad2c85e4fb29456fd88028ab29ce"} Mar 17 01:45:59 crc kubenswrapper[4755]: I0317 01:45:59.780938 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e"} Mar 17 01:45:59 crc kubenswrapper[4755]: I0317 01:45:59.780965 4755 scope.go:117] "RemoveContainer" containerID="ed0c20cadf7c2028108ab7f550910eb6fb30dd5f1cde340ade3c0b629650ca5f" Mar 17 01:46:00 crc kubenswrapper[4755]: I0317 01:46:00.155437 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561866-ldb7h"] Mar 17 01:46:00 crc kubenswrapper[4755]: E0317 01:46:00.156016 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a87505-6ac8-497a-bae1-f8f2664446e7" containerName="collect-profiles" Mar 17 01:46:00 crc kubenswrapper[4755]: I0317 01:46:00.156041 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a87505-6ac8-497a-bae1-f8f2664446e7" containerName="collect-profiles" Mar 17 01:46:00 crc kubenswrapper[4755]: I0317 01:46:00.156292 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6a87505-6ac8-497a-bae1-f8f2664446e7" containerName="collect-profiles" Mar 17 01:46:00 crc kubenswrapper[4755]: I0317 01:46:00.157208 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561866-ldb7h" Mar 17 01:46:00 crc kubenswrapper[4755]: I0317 01:46:00.160712 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:46:00 crc kubenswrapper[4755]: I0317 01:46:00.160935 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:46:00 crc kubenswrapper[4755]: I0317 01:46:00.161041 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:46:00 crc kubenswrapper[4755]: I0317 01:46:00.188609 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561866-ldb7h"] Mar 17 01:46:00 crc kubenswrapper[4755]: I0317 01:46:00.287354 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwzjv\" (UniqueName: \"kubernetes.io/projected/62a3566f-07ce-4705-b6db-9ab9f2789ffb-kube-api-access-jwzjv\") pod \"auto-csr-approver-29561866-ldb7h\" (UID: \"62a3566f-07ce-4705-b6db-9ab9f2789ffb\") " pod="openshift-infra/auto-csr-approver-29561866-ldb7h" Mar 17 01:46:00 crc kubenswrapper[4755]: I0317 01:46:00.389973 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwzjv\" (UniqueName: \"kubernetes.io/projected/62a3566f-07ce-4705-b6db-9ab9f2789ffb-kube-api-access-jwzjv\") pod \"auto-csr-approver-29561866-ldb7h\" (UID: \"62a3566f-07ce-4705-b6db-9ab9f2789ffb\") " pod="openshift-infra/auto-csr-approver-29561866-ldb7h" Mar 17 01:46:00 crc kubenswrapper[4755]: I0317 01:46:00.416339 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwzjv\" (UniqueName: \"kubernetes.io/projected/62a3566f-07ce-4705-b6db-9ab9f2789ffb-kube-api-access-jwzjv\") pod \"auto-csr-approver-29561866-ldb7h\" (UID: \"62a3566f-07ce-4705-b6db-9ab9f2789ffb\") " 
pod="openshift-infra/auto-csr-approver-29561866-ldb7h" Mar 17 01:46:00 crc kubenswrapper[4755]: I0317 01:46:00.492297 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561866-ldb7h" Mar 17 01:46:00 crc kubenswrapper[4755]: I0317 01:46:00.978110 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561866-ldb7h"] Mar 17 01:46:01 crc kubenswrapper[4755]: I0317 01:46:01.818779 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561866-ldb7h" event={"ID":"62a3566f-07ce-4705-b6db-9ab9f2789ffb","Type":"ContainerStarted","Data":"8f1439246ff715d9b61c6620b5b538d2d284a4e460ffa3f053a2c8dab3f99ca3"} Mar 17 01:46:02 crc kubenswrapper[4755]: I0317 01:46:02.830127 4755 generic.go:334] "Generic (PLEG): container finished" podID="62a3566f-07ce-4705-b6db-9ab9f2789ffb" containerID="e1a5a3bc2fcc01369c0deebeb6144e3051676157175db9241ba4b5b6155194db" exitCode=0 Mar 17 01:46:02 crc kubenswrapper[4755]: I0317 01:46:02.830174 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561866-ldb7h" event={"ID":"62a3566f-07ce-4705-b6db-9ab9f2789ffb","Type":"ContainerDied","Data":"e1a5a3bc2fcc01369c0deebeb6144e3051676157175db9241ba4b5b6155194db"} Mar 17 01:46:04 crc kubenswrapper[4755]: I0317 01:46:04.273171 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561866-ldb7h" Mar 17 01:46:04 crc kubenswrapper[4755]: I0317 01:46:04.393198 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwzjv\" (UniqueName: \"kubernetes.io/projected/62a3566f-07ce-4705-b6db-9ab9f2789ffb-kube-api-access-jwzjv\") pod \"62a3566f-07ce-4705-b6db-9ab9f2789ffb\" (UID: \"62a3566f-07ce-4705-b6db-9ab9f2789ffb\") " Mar 17 01:46:04 crc kubenswrapper[4755]: I0317 01:46:04.402177 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62a3566f-07ce-4705-b6db-9ab9f2789ffb-kube-api-access-jwzjv" (OuterVolumeSpecName: "kube-api-access-jwzjv") pod "62a3566f-07ce-4705-b6db-9ab9f2789ffb" (UID: "62a3566f-07ce-4705-b6db-9ab9f2789ffb"). InnerVolumeSpecName "kube-api-access-jwzjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:46:04 crc kubenswrapper[4755]: I0317 01:46:04.497847 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwzjv\" (UniqueName: \"kubernetes.io/projected/62a3566f-07ce-4705-b6db-9ab9f2789ffb-kube-api-access-jwzjv\") on node \"crc\" DevicePath \"\"" Mar 17 01:46:04 crc kubenswrapper[4755]: I0317 01:46:04.859508 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561866-ldb7h" event={"ID":"62a3566f-07ce-4705-b6db-9ab9f2789ffb","Type":"ContainerDied","Data":"8f1439246ff715d9b61c6620b5b538d2d284a4e460ffa3f053a2c8dab3f99ca3"} Mar 17 01:46:04 crc kubenswrapper[4755]: I0317 01:46:04.859844 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f1439246ff715d9b61c6620b5b538d2d284a4e460ffa3f053a2c8dab3f99ca3" Mar 17 01:46:04 crc kubenswrapper[4755]: I0317 01:46:04.859549 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561866-ldb7h" Mar 17 01:46:05 crc kubenswrapper[4755]: I0317 01:46:05.362740 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561860-5mppd"] Mar 17 01:46:05 crc kubenswrapper[4755]: I0317 01:46:05.378430 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561860-5mppd"] Mar 17 01:46:05 crc kubenswrapper[4755]: I0317 01:46:05.457991 4755 scope.go:117] "RemoveContainer" containerID="ad9810ca0a1786673aacf6309ef48c0e2ec3687c8e8272f23c3fe3081066f72f" Mar 17 01:46:06 crc kubenswrapper[4755]: I0317 01:46:06.263257 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c907864f-5243-4f38-bc4d-7fa90f2b0c90" path="/var/lib/kubelet/pods/c907864f-5243-4f38-bc4d-7fa90f2b0c90/volumes" Mar 17 01:46:34 crc kubenswrapper[4755]: I0317 01:46:34.087058 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-wd6gn"] Mar 17 01:46:34 crc kubenswrapper[4755]: I0317 01:46:34.103359 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-wd6gn"] Mar 17 01:46:34 crc kubenswrapper[4755]: I0317 01:46:34.122934 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-faeb-account-create-update-45nfl"] Mar 17 01:46:34 crc kubenswrapper[4755]: I0317 01:46:34.138101 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-faeb-account-create-update-45nfl"] Mar 17 01:46:34 crc kubenswrapper[4755]: I0317 01:46:34.260866 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9" path="/var/lib/kubelet/pods/34b74f5d-f565-4d19-b8c5-3f77e5b4eaa9/volumes" Mar 17 01:46:34 crc kubenswrapper[4755]: I0317 01:46:34.262327 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7388243-204f-4d8a-b842-2529524f0568" 
path="/var/lib/kubelet/pods/a7388243-204f-4d8a-b842-2529524f0568/volumes" Mar 17 01:47:01 crc kubenswrapper[4755]: I0317 01:47:01.057277 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-dtxkj"] Mar 17 01:47:01 crc kubenswrapper[4755]: I0317 01:47:01.078929 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-dtxkj"] Mar 17 01:47:02 crc kubenswrapper[4755]: I0317 01:47:02.270550 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb170d1-6e83-49fc-925f-6020490e5da7" path="/var/lib/kubelet/pods/7bb170d1-6e83-49fc-925f-6020490e5da7/volumes" Mar 17 01:47:05 crc kubenswrapper[4755]: I0317 01:47:05.529390 4755 scope.go:117] "RemoveContainer" containerID="aedacdaa5ff458545bbe19845a59d46313dfca168f42d86e17501f239826d500" Mar 17 01:47:05 crc kubenswrapper[4755]: I0317 01:47:05.592659 4755 scope.go:117] "RemoveContainer" containerID="e64456aa893a1a9200cbc8bb4329e0f5b0a256b466916d065f8c9637b7055a8c" Mar 17 01:47:05 crc kubenswrapper[4755]: I0317 01:47:05.662858 4755 scope.go:117] "RemoveContainer" containerID="7e470ef9b637bb108c7817113529c16ebddba08426e47e1a06e43335b2112e45" Mar 17 01:47:05 crc kubenswrapper[4755]: I0317 01:47:05.708202 4755 scope.go:117] "RemoveContainer" containerID="0b7986ed18dc2e729618ad4ce64178a3cdc6195ef52f6981d2a3f708c8f2f30d" Mar 17 01:47:07 crc kubenswrapper[4755]: I0317 01:47:07.166517 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s2x9j"] Mar 17 01:47:07 crc kubenswrapper[4755]: E0317 01:47:07.167952 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a3566f-07ce-4705-b6db-9ab9f2789ffb" containerName="oc" Mar 17 01:47:07 crc kubenswrapper[4755]: I0317 01:47:07.167973 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a3566f-07ce-4705-b6db-9ab9f2789ffb" containerName="oc" Mar 17 01:47:07 crc kubenswrapper[4755]: I0317 01:47:07.168358 4755 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="62a3566f-07ce-4705-b6db-9ab9f2789ffb" containerName="oc" Mar 17 01:47:07 crc kubenswrapper[4755]: I0317 01:47:07.170534 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s2x9j" Mar 17 01:47:07 crc kubenswrapper[4755]: I0317 01:47:07.185106 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s2x9j"] Mar 17 01:47:07 crc kubenswrapper[4755]: I0317 01:47:07.291187 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gthjs\" (UniqueName: \"kubernetes.io/projected/a6c2b2b2-bca1-448f-8104-94e9ae11c178-kube-api-access-gthjs\") pod \"community-operators-s2x9j\" (UID: \"a6c2b2b2-bca1-448f-8104-94e9ae11c178\") " pod="openshift-marketplace/community-operators-s2x9j" Mar 17 01:47:07 crc kubenswrapper[4755]: I0317 01:47:07.291253 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6c2b2b2-bca1-448f-8104-94e9ae11c178-utilities\") pod \"community-operators-s2x9j\" (UID: \"a6c2b2b2-bca1-448f-8104-94e9ae11c178\") " pod="openshift-marketplace/community-operators-s2x9j" Mar 17 01:47:07 crc kubenswrapper[4755]: I0317 01:47:07.291269 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6c2b2b2-bca1-448f-8104-94e9ae11c178-catalog-content\") pod \"community-operators-s2x9j\" (UID: \"a6c2b2b2-bca1-448f-8104-94e9ae11c178\") " pod="openshift-marketplace/community-operators-s2x9j" Mar 17 01:47:07 crc kubenswrapper[4755]: I0317 01:47:07.393847 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gthjs\" (UniqueName: \"kubernetes.io/projected/a6c2b2b2-bca1-448f-8104-94e9ae11c178-kube-api-access-gthjs\") pod 
\"community-operators-s2x9j\" (UID: \"a6c2b2b2-bca1-448f-8104-94e9ae11c178\") " pod="openshift-marketplace/community-operators-s2x9j" Mar 17 01:47:07 crc kubenswrapper[4755]: I0317 01:47:07.393933 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6c2b2b2-bca1-448f-8104-94e9ae11c178-utilities\") pod \"community-operators-s2x9j\" (UID: \"a6c2b2b2-bca1-448f-8104-94e9ae11c178\") " pod="openshift-marketplace/community-operators-s2x9j" Mar 17 01:47:07 crc kubenswrapper[4755]: I0317 01:47:07.393959 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6c2b2b2-bca1-448f-8104-94e9ae11c178-catalog-content\") pod \"community-operators-s2x9j\" (UID: \"a6c2b2b2-bca1-448f-8104-94e9ae11c178\") " pod="openshift-marketplace/community-operators-s2x9j" Mar 17 01:47:07 crc kubenswrapper[4755]: I0317 01:47:07.394615 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6c2b2b2-bca1-448f-8104-94e9ae11c178-catalog-content\") pod \"community-operators-s2x9j\" (UID: \"a6c2b2b2-bca1-448f-8104-94e9ae11c178\") " pod="openshift-marketplace/community-operators-s2x9j" Mar 17 01:47:07 crc kubenswrapper[4755]: I0317 01:47:07.394702 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6c2b2b2-bca1-448f-8104-94e9ae11c178-utilities\") pod \"community-operators-s2x9j\" (UID: \"a6c2b2b2-bca1-448f-8104-94e9ae11c178\") " pod="openshift-marketplace/community-operators-s2x9j" Mar 17 01:47:07 crc kubenswrapper[4755]: I0317 01:47:07.421890 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gthjs\" (UniqueName: \"kubernetes.io/projected/a6c2b2b2-bca1-448f-8104-94e9ae11c178-kube-api-access-gthjs\") pod \"community-operators-s2x9j\" (UID: 
\"a6c2b2b2-bca1-448f-8104-94e9ae11c178\") " pod="openshift-marketplace/community-operators-s2x9j"
Mar 17 01:47:07 crc kubenswrapper[4755]: I0317 01:47:07.516325 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s2x9j"
Mar 17 01:47:08 crc kubenswrapper[4755]: I0317 01:47:08.038184 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s2x9j"]
Mar 17 01:47:08 crc kubenswrapper[4755]: I0317 01:47:08.656380 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x9j" event={"ID":"a6c2b2b2-bca1-448f-8104-94e9ae11c178","Type":"ContainerStarted","Data":"c34e4e5d0aed59fcd0533b55aeb935a59008ea99289f317a491c2d8f0aa12b83"}
Mar 17 01:47:09 crc kubenswrapper[4755]: I0317 01:47:09.680832 4755 generic.go:334] "Generic (PLEG): container finished" podID="a6c2b2b2-bca1-448f-8104-94e9ae11c178" containerID="05baaf0f69772ecaec48204a799494e91ea9e6f99d00aac8bc1c97c6ad05b1e7" exitCode=0
Mar 17 01:47:09 crc kubenswrapper[4755]: I0317 01:47:09.682593 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x9j" event={"ID":"a6c2b2b2-bca1-448f-8104-94e9ae11c178","Type":"ContainerDied","Data":"05baaf0f69772ecaec48204a799494e91ea9e6f99d00aac8bc1c97c6ad05b1e7"}
Mar 17 01:47:09 crc kubenswrapper[4755]: I0317 01:47:09.685830 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 17 01:47:10 crc kubenswrapper[4755]: I0317 01:47:10.692704 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x9j" event={"ID":"a6c2b2b2-bca1-448f-8104-94e9ae11c178","Type":"ContainerStarted","Data":"cbd3d0c4413ca395e333e6bf84b5611e59a1675d2f52a7a3c98152665044094e"}
Mar 17 01:47:12 crc kubenswrapper[4755]: I0317 01:47:12.734035 4755 generic.go:334] "Generic (PLEG): container finished" podID="a6c2b2b2-bca1-448f-8104-94e9ae11c178" containerID="cbd3d0c4413ca395e333e6bf84b5611e59a1675d2f52a7a3c98152665044094e" exitCode=0
Mar 17 01:47:12 crc kubenswrapper[4755]: I0317 01:47:12.734117 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x9j" event={"ID":"a6c2b2b2-bca1-448f-8104-94e9ae11c178","Type":"ContainerDied","Data":"cbd3d0c4413ca395e333e6bf84b5611e59a1675d2f52a7a3c98152665044094e"}
Mar 17 01:47:13 crc kubenswrapper[4755]: I0317 01:47:13.750806 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x9j" event={"ID":"a6c2b2b2-bca1-448f-8104-94e9ae11c178","Type":"ContainerStarted","Data":"941a9276082458e6f6074b2b8d5705b9c46c4da7a0e0785359a7b0b605d713ec"}
Mar 17 01:47:13 crc kubenswrapper[4755]: I0317 01:47:13.785162 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s2x9j" podStartSLOduration=3.306471141 podStartE2EDuration="6.785140223s" podCreationTimestamp="2026-03-17 01:47:07 +0000 UTC" firstStartedPulling="2026-03-17 01:47:09.685284334 +0000 UTC m=+5104.444736657" lastFinishedPulling="2026-03-17 01:47:13.163953456 +0000 UTC m=+5107.923405739" observedRunningTime="2026-03-17 01:47:13.775171483 +0000 UTC m=+5108.534623806" watchObservedRunningTime="2026-03-17 01:47:13.785140223 +0000 UTC m=+5108.544592516"
Mar 17 01:47:17 crc kubenswrapper[4755]: I0317 01:47:17.517094 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s2x9j"
Mar 17 01:47:17 crc kubenswrapper[4755]: I0317 01:47:17.517762 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s2x9j"
Mar 17 01:47:17 crc kubenswrapper[4755]: I0317 01:47:17.603574 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s2x9j"
Mar 17 01:47:27 crc kubenswrapper[4755]: I0317 01:47:27.586150 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s2x9j"
Mar 17 01:47:27 crc kubenswrapper[4755]: I0317 01:47:27.661062 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s2x9j"]
Mar 17 01:47:27 crc kubenswrapper[4755]: I0317 01:47:27.950565 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s2x9j" podUID="a6c2b2b2-bca1-448f-8104-94e9ae11c178" containerName="registry-server" containerID="cri-o://941a9276082458e6f6074b2b8d5705b9c46c4da7a0e0785359a7b0b605d713ec" gracePeriod=2
Mar 17 01:47:28 crc kubenswrapper[4755]: I0317 01:47:28.549324 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s2x9j"
Mar 17 01:47:28 crc kubenswrapper[4755]: I0317 01:47:28.644047 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6c2b2b2-bca1-448f-8104-94e9ae11c178-utilities\") pod \"a6c2b2b2-bca1-448f-8104-94e9ae11c178\" (UID: \"a6c2b2b2-bca1-448f-8104-94e9ae11c178\") "
Mar 17 01:47:28 crc kubenswrapper[4755]: I0317 01:47:28.644328 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6c2b2b2-bca1-448f-8104-94e9ae11c178-catalog-content\") pod \"a6c2b2b2-bca1-448f-8104-94e9ae11c178\" (UID: \"a6c2b2b2-bca1-448f-8104-94e9ae11c178\") "
Mar 17 01:47:28 crc kubenswrapper[4755]: I0317 01:47:28.644624 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gthjs\" (UniqueName: \"kubernetes.io/projected/a6c2b2b2-bca1-448f-8104-94e9ae11c178-kube-api-access-gthjs\") pod \"a6c2b2b2-bca1-448f-8104-94e9ae11c178\" (UID: \"a6c2b2b2-bca1-448f-8104-94e9ae11c178\") "
Mar 17 01:47:28 crc kubenswrapper[4755]: I0317 01:47:28.645581 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6c2b2b2-bca1-448f-8104-94e9ae11c178-utilities" (OuterVolumeSpecName: "utilities") pod "a6c2b2b2-bca1-448f-8104-94e9ae11c178" (UID: "a6c2b2b2-bca1-448f-8104-94e9ae11c178"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 01:47:28 crc kubenswrapper[4755]: I0317 01:47:28.661829 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6c2b2b2-bca1-448f-8104-94e9ae11c178-kube-api-access-gthjs" (OuterVolumeSpecName: "kube-api-access-gthjs") pod "a6c2b2b2-bca1-448f-8104-94e9ae11c178" (UID: "a6c2b2b2-bca1-448f-8104-94e9ae11c178"). InnerVolumeSpecName "kube-api-access-gthjs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 01:47:28 crc kubenswrapper[4755]: I0317 01:47:28.690196 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6c2b2b2-bca1-448f-8104-94e9ae11c178-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6c2b2b2-bca1-448f-8104-94e9ae11c178" (UID: "a6c2b2b2-bca1-448f-8104-94e9ae11c178"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 01:47:28 crc kubenswrapper[4755]: I0317 01:47:28.747810 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gthjs\" (UniqueName: \"kubernetes.io/projected/a6c2b2b2-bca1-448f-8104-94e9ae11c178-kube-api-access-gthjs\") on node \"crc\" DevicePath \"\""
Mar 17 01:47:28 crc kubenswrapper[4755]: I0317 01:47:28.747853 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6c2b2b2-bca1-448f-8104-94e9ae11c178-utilities\") on node \"crc\" DevicePath \"\""
Mar 17 01:47:28 crc kubenswrapper[4755]: I0317 01:47:28.747866 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6c2b2b2-bca1-448f-8104-94e9ae11c178-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 17 01:47:28 crc kubenswrapper[4755]: I0317 01:47:28.974329 4755 generic.go:334] "Generic (PLEG): container finished" podID="a6c2b2b2-bca1-448f-8104-94e9ae11c178" containerID="941a9276082458e6f6074b2b8d5705b9c46c4da7a0e0785359a7b0b605d713ec" exitCode=0
Mar 17 01:47:28 crc kubenswrapper[4755]: I0317 01:47:28.974392 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x9j" event={"ID":"a6c2b2b2-bca1-448f-8104-94e9ae11c178","Type":"ContainerDied","Data":"941a9276082458e6f6074b2b8d5705b9c46c4da7a0e0785359a7b0b605d713ec"}
Mar 17 01:47:28 crc kubenswrapper[4755]: I0317 01:47:28.974466 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s2x9j"
Mar 17 01:47:28 crc kubenswrapper[4755]: I0317 01:47:28.974484 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x9j" event={"ID":"a6c2b2b2-bca1-448f-8104-94e9ae11c178","Type":"ContainerDied","Data":"c34e4e5d0aed59fcd0533b55aeb935a59008ea99289f317a491c2d8f0aa12b83"}
Mar 17 01:47:28 crc kubenswrapper[4755]: I0317 01:47:28.974525 4755 scope.go:117] "RemoveContainer" containerID="941a9276082458e6f6074b2b8d5705b9c46c4da7a0e0785359a7b0b605d713ec"
Mar 17 01:47:29 crc kubenswrapper[4755]: I0317 01:47:29.004313 4755 scope.go:117] "RemoveContainer" containerID="cbd3d0c4413ca395e333e6bf84b5611e59a1675d2f52a7a3c98152665044094e"
Mar 17 01:47:29 crc kubenswrapper[4755]: I0317 01:47:29.029528 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s2x9j"]
Mar 17 01:47:29 crc kubenswrapper[4755]: I0317 01:47:29.041633 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s2x9j"]
Mar 17 01:47:29 crc kubenswrapper[4755]: I0317 01:47:29.059692 4755 scope.go:117] "RemoveContainer" containerID="05baaf0f69772ecaec48204a799494e91ea9e6f99d00aac8bc1c97c6ad05b1e7"
Mar 17 01:47:29 crc kubenswrapper[4755]: I0317 01:47:29.112454 4755 scope.go:117] "RemoveContainer" containerID="941a9276082458e6f6074b2b8d5705b9c46c4da7a0e0785359a7b0b605d713ec"
Mar 17 01:47:29 crc kubenswrapper[4755]: E0317 01:47:29.112945 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"941a9276082458e6f6074b2b8d5705b9c46c4da7a0e0785359a7b0b605d713ec\": container with ID starting with 941a9276082458e6f6074b2b8d5705b9c46c4da7a0e0785359a7b0b605d713ec not found: ID does not exist" containerID="941a9276082458e6f6074b2b8d5705b9c46c4da7a0e0785359a7b0b605d713ec"
Mar 17 01:47:29 crc kubenswrapper[4755]: I0317 01:47:29.113030 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"941a9276082458e6f6074b2b8d5705b9c46c4da7a0e0785359a7b0b605d713ec"} err="failed to get container status \"941a9276082458e6f6074b2b8d5705b9c46c4da7a0e0785359a7b0b605d713ec\": rpc error: code = NotFound desc = could not find container \"941a9276082458e6f6074b2b8d5705b9c46c4da7a0e0785359a7b0b605d713ec\": container with ID starting with 941a9276082458e6f6074b2b8d5705b9c46c4da7a0e0785359a7b0b605d713ec not found: ID does not exist"
Mar 17 01:47:29 crc kubenswrapper[4755]: I0317 01:47:29.113065 4755 scope.go:117] "RemoveContainer" containerID="cbd3d0c4413ca395e333e6bf84b5611e59a1675d2f52a7a3c98152665044094e"
Mar 17 01:47:29 crc kubenswrapper[4755]: E0317 01:47:29.113483 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbd3d0c4413ca395e333e6bf84b5611e59a1675d2f52a7a3c98152665044094e\": container with ID starting with cbd3d0c4413ca395e333e6bf84b5611e59a1675d2f52a7a3c98152665044094e not found: ID does not exist" containerID="cbd3d0c4413ca395e333e6bf84b5611e59a1675d2f52a7a3c98152665044094e"
Mar 17 01:47:29 crc kubenswrapper[4755]: I0317 01:47:29.113523 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbd3d0c4413ca395e333e6bf84b5611e59a1675d2f52a7a3c98152665044094e"} err="failed to get container status \"cbd3d0c4413ca395e333e6bf84b5611e59a1675d2f52a7a3c98152665044094e\": rpc error: code = NotFound desc = could not find container \"cbd3d0c4413ca395e333e6bf84b5611e59a1675d2f52a7a3c98152665044094e\": container with ID starting with cbd3d0c4413ca395e333e6bf84b5611e59a1675d2f52a7a3c98152665044094e not found: ID does not exist"
Mar 17 01:47:29 crc kubenswrapper[4755]: I0317 01:47:29.113553 4755 scope.go:117] "RemoveContainer" containerID="05baaf0f69772ecaec48204a799494e91ea9e6f99d00aac8bc1c97c6ad05b1e7"
Mar 17 01:47:29 crc kubenswrapper[4755]: E0317 01:47:29.113923 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05baaf0f69772ecaec48204a799494e91ea9e6f99d00aac8bc1c97c6ad05b1e7\": container with ID starting with 05baaf0f69772ecaec48204a799494e91ea9e6f99d00aac8bc1c97c6ad05b1e7 not found: ID does not exist" containerID="05baaf0f69772ecaec48204a799494e91ea9e6f99d00aac8bc1c97c6ad05b1e7"
Mar 17 01:47:29 crc kubenswrapper[4755]: I0317 01:47:29.113956 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05baaf0f69772ecaec48204a799494e91ea9e6f99d00aac8bc1c97c6ad05b1e7"} err="failed to get container status \"05baaf0f69772ecaec48204a799494e91ea9e6f99d00aac8bc1c97c6ad05b1e7\": rpc error: code = NotFound desc = could not find container \"05baaf0f69772ecaec48204a799494e91ea9e6f99d00aac8bc1c97c6ad05b1e7\": container with ID starting with 05baaf0f69772ecaec48204a799494e91ea9e6f99d00aac8bc1c97c6ad05b1e7 not found: ID does not exist"
Mar 17 01:47:30 crc kubenswrapper[4755]: I0317 01:47:30.267348 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6c2b2b2-bca1-448f-8104-94e9ae11c178" path="/var/lib/kubelet/pods/a6c2b2b2-bca1-448f-8104-94e9ae11c178/volumes"
Mar 17 01:48:00 crc kubenswrapper[4755]: I0317 01:48:00.151555 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561868-mprkf"]
Mar 17 01:48:00 crc kubenswrapper[4755]: E0317 01:48:00.153727 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6c2b2b2-bca1-448f-8104-94e9ae11c178" containerName="extract-content"
Mar 17 01:48:00 crc kubenswrapper[4755]: I0317 01:48:00.153742 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6c2b2b2-bca1-448f-8104-94e9ae11c178" containerName="extract-content"
Mar 17 01:48:00 crc kubenswrapper[4755]: E0317 01:48:00.153756 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6c2b2b2-bca1-448f-8104-94e9ae11c178" containerName="extract-utilities"
Mar 17 01:48:00 crc kubenswrapper[4755]: I0317 01:48:00.153763 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6c2b2b2-bca1-448f-8104-94e9ae11c178" containerName="extract-utilities"
Mar 17 01:48:00 crc kubenswrapper[4755]: E0317 01:48:00.153788 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6c2b2b2-bca1-448f-8104-94e9ae11c178" containerName="registry-server"
Mar 17 01:48:00 crc kubenswrapper[4755]: I0317 01:48:00.153793 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6c2b2b2-bca1-448f-8104-94e9ae11c178" containerName="registry-server"
Mar 17 01:48:00 crc kubenswrapper[4755]: I0317 01:48:00.153993 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6c2b2b2-bca1-448f-8104-94e9ae11c178" containerName="registry-server"
Mar 17 01:48:00 crc kubenswrapper[4755]: I0317 01:48:00.154762 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561868-mprkf"
Mar 17 01:48:00 crc kubenswrapper[4755]: I0317 01:48:00.156767 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 01:48:00 crc kubenswrapper[4755]: I0317 01:48:00.156803 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb"
Mar 17 01:48:00 crc kubenswrapper[4755]: I0317 01:48:00.159023 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 01:48:00 crc kubenswrapper[4755]: I0317 01:48:00.164508 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561868-mprkf"]
Mar 17 01:48:00 crc kubenswrapper[4755]: I0317 01:48:00.277560 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scq77\" (UniqueName: \"kubernetes.io/projected/247d94b9-dd90-4df1-8e89-3f211bdf29d1-kube-api-access-scq77\") pod \"auto-csr-approver-29561868-mprkf\" (UID: \"247d94b9-dd90-4df1-8e89-3f211bdf29d1\") " pod="openshift-infra/auto-csr-approver-29561868-mprkf"
Mar 17 01:48:00 crc kubenswrapper[4755]: I0317 01:48:00.380518 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scq77\" (UniqueName: \"kubernetes.io/projected/247d94b9-dd90-4df1-8e89-3f211bdf29d1-kube-api-access-scq77\") pod \"auto-csr-approver-29561868-mprkf\" (UID: \"247d94b9-dd90-4df1-8e89-3f211bdf29d1\") " pod="openshift-infra/auto-csr-approver-29561868-mprkf"
Mar 17 01:48:00 crc kubenswrapper[4755]: I0317 01:48:00.656042 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scq77\" (UniqueName: \"kubernetes.io/projected/247d94b9-dd90-4df1-8e89-3f211bdf29d1-kube-api-access-scq77\") pod \"auto-csr-approver-29561868-mprkf\" (UID: \"247d94b9-dd90-4df1-8e89-3f211bdf29d1\") " pod="openshift-infra/auto-csr-approver-29561868-mprkf"
Mar 17 01:48:00 crc kubenswrapper[4755]: I0317 01:48:00.806744 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561868-mprkf"
Mar 17 01:48:01 crc kubenswrapper[4755]: I0317 01:48:01.371660 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561868-mprkf"]
Mar 17 01:48:02 crc kubenswrapper[4755]: I0317 01:48:02.389701 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561868-mprkf" event={"ID":"247d94b9-dd90-4df1-8e89-3f211bdf29d1","Type":"ContainerStarted","Data":"dc8092b253eb26a1f8398908e50b49937a4e38f41767d5720d585eae5d682ac8"}
Mar 17 01:48:04 crc kubenswrapper[4755]: I0317 01:48:04.418427 4755 generic.go:334] "Generic (PLEG): container finished" podID="247d94b9-dd90-4df1-8e89-3f211bdf29d1" containerID="f2483a137e8106127227f3f4fb7631b22e0329ae555df4d2689bb6e6c48c0539" exitCode=0
Mar 17 01:48:04 crc kubenswrapper[4755]: I0317 01:48:04.418737 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561868-mprkf" event={"ID":"247d94b9-dd90-4df1-8e89-3f211bdf29d1","Type":"ContainerDied","Data":"f2483a137e8106127227f3f4fb7631b22e0329ae555df4d2689bb6e6c48c0539"}
Mar 17 01:48:05 crc kubenswrapper[4755]: I0317 01:48:05.975634 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561868-mprkf"
Mar 17 01:48:06 crc kubenswrapper[4755]: I0317 01:48:06.118392 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scq77\" (UniqueName: \"kubernetes.io/projected/247d94b9-dd90-4df1-8e89-3f211bdf29d1-kube-api-access-scq77\") pod \"247d94b9-dd90-4df1-8e89-3f211bdf29d1\" (UID: \"247d94b9-dd90-4df1-8e89-3f211bdf29d1\") "
Mar 17 01:48:06 crc kubenswrapper[4755]: I0317 01:48:06.125917 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/247d94b9-dd90-4df1-8e89-3f211bdf29d1-kube-api-access-scq77" (OuterVolumeSpecName: "kube-api-access-scq77") pod "247d94b9-dd90-4df1-8e89-3f211bdf29d1" (UID: "247d94b9-dd90-4df1-8e89-3f211bdf29d1"). InnerVolumeSpecName "kube-api-access-scq77". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 01:48:06 crc kubenswrapper[4755]: I0317 01:48:06.221190 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scq77\" (UniqueName: \"kubernetes.io/projected/247d94b9-dd90-4df1-8e89-3f211bdf29d1-kube-api-access-scq77\") on node \"crc\" DevicePath \"\""
Mar 17 01:48:06 crc kubenswrapper[4755]: I0317 01:48:06.446295 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561868-mprkf" event={"ID":"247d94b9-dd90-4df1-8e89-3f211bdf29d1","Type":"ContainerDied","Data":"dc8092b253eb26a1f8398908e50b49937a4e38f41767d5720d585eae5d682ac8"}
Mar 17 01:48:06 crc kubenswrapper[4755]: I0317 01:48:06.446335 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc8092b253eb26a1f8398908e50b49937a4e38f41767d5720d585eae5d682ac8"
Mar 17 01:48:06 crc kubenswrapper[4755]: I0317 01:48:06.446693 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561868-mprkf"
Mar 17 01:48:07 crc kubenswrapper[4755]: I0317 01:48:07.058756 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561862-rljpb"]
Mar 17 01:48:07 crc kubenswrapper[4755]: I0317 01:48:07.067862 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561862-rljpb"]
Mar 17 01:48:08 crc kubenswrapper[4755]: I0317 01:48:08.263732 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4746f29-64b1-47a5-8ead-17cb5d4f55b0" path="/var/lib/kubelet/pods/a4746f29-64b1-47a5-8ead-17cb5d4f55b0/volumes"
Mar 17 01:48:28 crc kubenswrapper[4755]: I0317 01:48:28.665041 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 01:48:28 crc kubenswrapper[4755]: I0317 01:48:28.665465 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 01:48:58 crc kubenswrapper[4755]: I0317 01:48:58.664797 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 01:48:58 crc kubenswrapper[4755]: I0317 01:48:58.665429 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 01:49:05 crc kubenswrapper[4755]: I0317 01:49:05.914099 4755 scope.go:117] "RemoveContainer" containerID="633e64f1153eecc7ab733975566a7c12163c8e37ccff3827f0fa43e7d2984338"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.186420 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6jm7b"]
Mar 17 01:49:28 crc kubenswrapper[4755]: E0317 01:49:28.187505 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247d94b9-dd90-4df1-8e89-3f211bdf29d1" containerName="oc"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.187521 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="247d94b9-dd90-4df1-8e89-3f211bdf29d1" containerName="oc"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.187778 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="247d94b9-dd90-4df1-8e89-3f211bdf29d1" containerName="oc"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.189768 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6jm7b"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.214674 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6jm7b"]
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.277427 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-catalog-content\") pod \"redhat-operators-6jm7b\" (UID: \"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e\") " pod="openshift-marketplace/redhat-operators-6jm7b"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.277570 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-utilities\") pod \"redhat-operators-6jm7b\" (UID: \"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e\") " pod="openshift-marketplace/redhat-operators-6jm7b"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.277624 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dwgq\" (UniqueName: \"kubernetes.io/projected/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-kube-api-access-6dwgq\") pod \"redhat-operators-6jm7b\" (UID: \"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e\") " pod="openshift-marketplace/redhat-operators-6jm7b"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.379031 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-catalog-content\") pod \"redhat-operators-6jm7b\" (UID: \"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e\") " pod="openshift-marketplace/redhat-operators-6jm7b"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.379151 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-utilities\") pod \"redhat-operators-6jm7b\" (UID: \"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e\") " pod="openshift-marketplace/redhat-operators-6jm7b"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.379204 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dwgq\" (UniqueName: \"kubernetes.io/projected/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-kube-api-access-6dwgq\") pod \"redhat-operators-6jm7b\" (UID: \"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e\") " pod="openshift-marketplace/redhat-operators-6jm7b"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.379577 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-catalog-content\") pod \"redhat-operators-6jm7b\" (UID: \"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e\") " pod="openshift-marketplace/redhat-operators-6jm7b"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.379852 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-utilities\") pod \"redhat-operators-6jm7b\" (UID: \"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e\") " pod="openshift-marketplace/redhat-operators-6jm7b"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.410469 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dwgq\" (UniqueName: \"kubernetes.io/projected/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-kube-api-access-6dwgq\") pod \"redhat-operators-6jm7b\" (UID: \"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e\") " pod="openshift-marketplace/redhat-operators-6jm7b"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.519301 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6jm7b"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.665725 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.666093 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.666139 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.667484 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 17 01:49:28 crc kubenswrapper[4755]: I0317 01:49:28.667626 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" gracePeriod=600
Mar 17 01:49:28 crc kubenswrapper[4755]: E0317 01:49:28.805243 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d"
Mar 17 01:49:29 crc kubenswrapper[4755]: I0317 01:49:29.018495 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6jm7b"]
Mar 17 01:49:29 crc kubenswrapper[4755]: I0317 01:49:29.738491 4755 generic.go:334] "Generic (PLEG): container finished" podID="0e3786b0-ad93-4f62-b6a2-4b0c2baf039e" containerID="45eb68e8802a470ae5c4c75b89a3b835d0cfd5b8dd4d9279d0d230bde59f853c" exitCode=0
Mar 17 01:49:29 crc kubenswrapper[4755]: I0317 01:49:29.738812 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jm7b" event={"ID":"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e","Type":"ContainerDied","Data":"45eb68e8802a470ae5c4c75b89a3b835d0cfd5b8dd4d9279d0d230bde59f853c"}
Mar 17 01:49:29 crc kubenswrapper[4755]: I0317 01:49:29.738844 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jm7b" event={"ID":"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e","Type":"ContainerStarted","Data":"c48c51aa44688bfa062933bed53f045d4f07eaeb9db6ba1f7e2055cf1b272c3b"}
Mar 17 01:49:29 crc kubenswrapper[4755]: I0317 01:49:29.743801 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" exitCode=0
Mar 17 01:49:29 crc kubenswrapper[4755]: I0317 01:49:29.743849 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e"}
Mar 17 01:49:29 crc kubenswrapper[4755]: I0317 01:49:29.743889 4755 scope.go:117] "RemoveContainer" containerID="ab7b0ae9369930a3ae4164ad4094be94b9e9ad2c85e4fb29456fd88028ab29ce"
Mar 17 01:49:29 crc kubenswrapper[4755]: I0317 01:49:29.744944 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e"
Mar 17 01:49:29 crc kubenswrapper[4755]: E0317 01:49:29.745339 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d"
Mar 17 01:49:30 crc kubenswrapper[4755]: I0317 01:49:30.764751 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jm7b" event={"ID":"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e","Type":"ContainerStarted","Data":"94e753d0c50060ace9739e223fb334d797d121228e4ac390b0d7296279662a95"}
Mar 17 01:49:36 crc kubenswrapper[4755]: I0317 01:49:36.833751 4755 generic.go:334] "Generic (PLEG): container finished" podID="0e3786b0-ad93-4f62-b6a2-4b0c2baf039e" containerID="94e753d0c50060ace9739e223fb334d797d121228e4ac390b0d7296279662a95" exitCode=0
Mar 17 01:49:36 crc kubenswrapper[4755]: I0317 01:49:36.834020 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jm7b" event={"ID":"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e","Type":"ContainerDied","Data":"94e753d0c50060ace9739e223fb334d797d121228e4ac390b0d7296279662a95"}
Mar 17 01:49:37 crc kubenswrapper[4755]: I0317 01:49:37.846332 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jm7b" event={"ID":"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e","Type":"ContainerStarted","Data":"6a8e251308733f25cb27bca7b1755249bdc017fd5a29bc71efde4de205ca94c1"}
Mar 17 01:49:37 crc kubenswrapper[4755]: I0317 01:49:37.882725 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6jm7b" podStartSLOduration=2.353767688 podStartE2EDuration="9.882704346s" podCreationTimestamp="2026-03-17 01:49:28 +0000 UTC" firstStartedPulling="2026-03-17 01:49:29.740646629 +0000 UTC m=+5244.500098922" lastFinishedPulling="2026-03-17 01:49:37.269583267 +0000 UTC m=+5252.029035580" observedRunningTime="2026-03-17 01:49:37.873024773 +0000 UTC m=+5252.632477076" watchObservedRunningTime="2026-03-17 01:49:37.882704346 +0000 UTC m=+5252.642156629"
Mar 17 01:49:38 crc kubenswrapper[4755]: I0317 01:49:38.520142 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6jm7b"
Mar 17 01:49:38 crc kubenswrapper[4755]: I0317 01:49:38.520218 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6jm7b"
Mar 17 01:49:39 crc kubenswrapper[4755]: I0317 01:49:39.591901 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6jm7b" podUID="0e3786b0-ad93-4f62-b6a2-4b0c2baf039e" containerName="registry-server" probeResult="failure" output=<
Mar 17 01:49:39 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s
Mar 17 01:49:39 crc kubenswrapper[4755]: >
Mar 17 01:49:45 crc kubenswrapper[4755]: I0317 01:49:45.248342 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e"
Mar 17 01:49:45 crc kubenswrapper[4755]: E0317 01:49:45.249110 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d"
Mar 17 01:49:49 crc kubenswrapper[4755]: I0317 01:49:49.566767 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6jm7b" podUID="0e3786b0-ad93-4f62-b6a2-4b0c2baf039e" containerName="registry-server" probeResult="failure" output=<
Mar 17 01:49:49 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s
Mar 17 01:49:49 crc kubenswrapper[4755]: >
Mar 17 01:49:56 crc kubenswrapper[4755]: I0317 01:49:56.258099 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e"
Mar 17 01:49:56 crc kubenswrapper[4755]: E0317 01:49:56.259084 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d"
Mar 17 01:49:58 crc kubenswrapper[4755]: I0317 01:49:58.568627 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6jm7b"
Mar 17 01:49:59 crc kubenswrapper[4755]: I0317 01:49:59.319912 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6jm7b"
Mar 17 01:49:59 crc kubenswrapper[4755]: I0317 01:49:59.391723 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6jm7b"]
Mar 17 01:50:00 crc kubenswrapper[4755]: I0317 01:50:00.134797 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6jm7b" podUID="0e3786b0-ad93-4f62-b6a2-4b0c2baf039e" containerName="registry-server" containerID="cri-o://6a8e251308733f25cb27bca7b1755249bdc017fd5a29bc71efde4de205ca94c1" gracePeriod=2
Mar 17 01:50:00 crc kubenswrapper[4755]: I0317 01:50:00.184528 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561870-hkl6h"]
Mar 17 01:50:00 crc kubenswrapper[4755]: I0317 01:50:00.185842 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561870-hkl6h"
Mar 17 01:50:00 crc kubenswrapper[4755]: I0317 01:50:00.192351 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 01:50:00 crc kubenswrapper[4755]: I0317 01:50:00.192728 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb"
Mar 17 01:50:00 crc kubenswrapper[4755]: I0317 01:50:00.194319 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 01:50:00 crc kubenswrapper[4755]: I0317 01:50:00.240984 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561870-hkl6h"]
Mar 17 01:50:00 crc kubenswrapper[4755]: I0317 01:50:00.371782 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbz7n\" (UniqueName: \"kubernetes.io/projected/704de771-367b-49f1-aa0a-8ecd53f733ea-kube-api-access-jbz7n\") pod \"auto-csr-approver-29561870-hkl6h\" (UID: \"704de771-367b-49f1-aa0a-8ecd53f733ea\") " pod="openshift-infra/auto-csr-approver-29561870-hkl6h"
Mar 17 01:50:00 crc kubenswrapper[4755]: I0317 01:50:00.474650 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"kube-api-access-jbz7n\" (UniqueName: \"kubernetes.io/projected/704de771-367b-49f1-aa0a-8ecd53f733ea-kube-api-access-jbz7n\") pod \"auto-csr-approver-29561870-hkl6h\" (UID: \"704de771-367b-49f1-aa0a-8ecd53f733ea\") " pod="openshift-infra/auto-csr-approver-29561870-hkl6h" Mar 17 01:50:00 crc kubenswrapper[4755]: I0317 01:50:00.497067 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbz7n\" (UniqueName: \"kubernetes.io/projected/704de771-367b-49f1-aa0a-8ecd53f733ea-kube-api-access-jbz7n\") pod \"auto-csr-approver-29561870-hkl6h\" (UID: \"704de771-367b-49f1-aa0a-8ecd53f733ea\") " pod="openshift-infra/auto-csr-approver-29561870-hkl6h" Mar 17 01:50:00 crc kubenswrapper[4755]: I0317 01:50:00.687757 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561870-hkl6h" Mar 17 01:50:01 crc kubenswrapper[4755]: I0317 01:50:01.149887 4755 generic.go:334] "Generic (PLEG): container finished" podID="0e3786b0-ad93-4f62-b6a2-4b0c2baf039e" containerID="6a8e251308733f25cb27bca7b1755249bdc017fd5a29bc71efde4de205ca94c1" exitCode=0 Mar 17 01:50:01 crc kubenswrapper[4755]: I0317 01:50:01.149950 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jm7b" event={"ID":"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e","Type":"ContainerDied","Data":"6a8e251308733f25cb27bca7b1755249bdc017fd5a29bc71efde4de205ca94c1"} Mar 17 01:50:01 crc kubenswrapper[4755]: I0317 01:50:01.547182 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6jm7b" Mar 17 01:50:01 crc kubenswrapper[4755]: I0317 01:50:01.700753 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dwgq\" (UniqueName: \"kubernetes.io/projected/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-kube-api-access-6dwgq\") pod \"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e\" (UID: \"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e\") " Mar 17 01:50:01 crc kubenswrapper[4755]: I0317 01:50:01.701428 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-utilities\") pod \"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e\" (UID: \"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e\") " Mar 17 01:50:01 crc kubenswrapper[4755]: I0317 01:50:01.701571 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-catalog-content\") pod \"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e\" (UID: \"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e\") " Mar 17 01:50:01 crc kubenswrapper[4755]: I0317 01:50:01.702854 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-utilities" (OuterVolumeSpecName: "utilities") pod "0e3786b0-ad93-4f62-b6a2-4b0c2baf039e" (UID: "0e3786b0-ad93-4f62-b6a2-4b0c2baf039e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:50:01 crc kubenswrapper[4755]: I0317 01:50:01.706361 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-kube-api-access-6dwgq" (OuterVolumeSpecName: "kube-api-access-6dwgq") pod "0e3786b0-ad93-4f62-b6a2-4b0c2baf039e" (UID: "0e3786b0-ad93-4f62-b6a2-4b0c2baf039e"). InnerVolumeSpecName "kube-api-access-6dwgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:50:01 crc kubenswrapper[4755]: I0317 01:50:01.803679 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dwgq\" (UniqueName: \"kubernetes.io/projected/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-kube-api-access-6dwgq\") on node \"crc\" DevicePath \"\"" Mar 17 01:50:01 crc kubenswrapper[4755]: I0317 01:50:01.803711 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:50:01 crc kubenswrapper[4755]: I0317 01:50:01.837491 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561870-hkl6h"] Mar 17 01:50:01 crc kubenswrapper[4755]: I0317 01:50:01.847068 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e3786b0-ad93-4f62-b6a2-4b0c2baf039e" (UID: "0e3786b0-ad93-4f62-b6a2-4b0c2baf039e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:50:01 crc kubenswrapper[4755]: I0317 01:50:01.906028 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:50:02 crc kubenswrapper[4755]: I0317 01:50:02.170777 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jm7b" event={"ID":"0e3786b0-ad93-4f62-b6a2-4b0c2baf039e","Type":"ContainerDied","Data":"c48c51aa44688bfa062933bed53f045d4f07eaeb9db6ba1f7e2055cf1b272c3b"} Mar 17 01:50:02 crc kubenswrapper[4755]: I0317 01:50:02.171193 4755 scope.go:117] "RemoveContainer" containerID="6a8e251308733f25cb27bca7b1755249bdc017fd5a29bc71efde4de205ca94c1" Mar 17 01:50:02 crc kubenswrapper[4755]: I0317 01:50:02.171477 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6jm7b" Mar 17 01:50:02 crc kubenswrapper[4755]: I0317 01:50:02.182684 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561870-hkl6h" event={"ID":"704de771-367b-49f1-aa0a-8ecd53f733ea","Type":"ContainerStarted","Data":"bb23fd1c2661a6f6ed98dccbfda7d8174eedbe1868ef7063d439e8e856c9efd3"} Mar 17 01:50:02 crc kubenswrapper[4755]: I0317 01:50:02.218898 4755 scope.go:117] "RemoveContainer" containerID="94e753d0c50060ace9739e223fb334d797d121228e4ac390b0d7296279662a95" Mar 17 01:50:02 crc kubenswrapper[4755]: I0317 01:50:02.257691 4755 scope.go:117] "RemoveContainer" containerID="45eb68e8802a470ae5c4c75b89a3b835d0cfd5b8dd4d9279d0d230bde59f853c" Mar 17 01:50:02 crc kubenswrapper[4755]: I0317 01:50:02.269560 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6jm7b"] Mar 17 01:50:02 crc kubenswrapper[4755]: I0317 01:50:02.269614 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-6jm7b"] Mar 17 01:50:04 crc kubenswrapper[4755]: I0317 01:50:04.217580 4755 generic.go:334] "Generic (PLEG): container finished" podID="704de771-367b-49f1-aa0a-8ecd53f733ea" containerID="f1c0db0a41a037daf6dd508dc9b495f4cc1a753008d0eda39c431f3308455f39" exitCode=0 Mar 17 01:50:04 crc kubenswrapper[4755]: I0317 01:50:04.218277 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561870-hkl6h" event={"ID":"704de771-367b-49f1-aa0a-8ecd53f733ea","Type":"ContainerDied","Data":"f1c0db0a41a037daf6dd508dc9b495f4cc1a753008d0eda39c431f3308455f39"} Mar 17 01:50:04 crc kubenswrapper[4755]: I0317 01:50:04.279836 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e3786b0-ad93-4f62-b6a2-4b0c2baf039e" path="/var/lib/kubelet/pods/0e3786b0-ad93-4f62-b6a2-4b0c2baf039e/volumes" Mar 17 01:50:05 crc kubenswrapper[4755]: I0317 01:50:05.791492 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561870-hkl6h" Mar 17 01:50:05 crc kubenswrapper[4755]: I0317 01:50:05.908388 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbz7n\" (UniqueName: \"kubernetes.io/projected/704de771-367b-49f1-aa0a-8ecd53f733ea-kube-api-access-jbz7n\") pod \"704de771-367b-49f1-aa0a-8ecd53f733ea\" (UID: \"704de771-367b-49f1-aa0a-8ecd53f733ea\") " Mar 17 01:50:05 crc kubenswrapper[4755]: I0317 01:50:05.915314 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/704de771-367b-49f1-aa0a-8ecd53f733ea-kube-api-access-jbz7n" (OuterVolumeSpecName: "kube-api-access-jbz7n") pod "704de771-367b-49f1-aa0a-8ecd53f733ea" (UID: "704de771-367b-49f1-aa0a-8ecd53f733ea"). InnerVolumeSpecName "kube-api-access-jbz7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:50:06 crc kubenswrapper[4755]: I0317 01:50:06.012374 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbz7n\" (UniqueName: \"kubernetes.io/projected/704de771-367b-49f1-aa0a-8ecd53f733ea-kube-api-access-jbz7n\") on node \"crc\" DevicePath \"\"" Mar 17 01:50:06 crc kubenswrapper[4755]: I0317 01:50:06.249770 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561870-hkl6h" Mar 17 01:50:06 crc kubenswrapper[4755]: I0317 01:50:06.269064 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561870-hkl6h" event={"ID":"704de771-367b-49f1-aa0a-8ecd53f733ea","Type":"ContainerDied","Data":"bb23fd1c2661a6f6ed98dccbfda7d8174eedbe1868ef7063d439e8e856c9efd3"} Mar 17 01:50:06 crc kubenswrapper[4755]: I0317 01:50:06.269175 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb23fd1c2661a6f6ed98dccbfda7d8174eedbe1868ef7063d439e8e856c9efd3" Mar 17 01:50:06 crc kubenswrapper[4755]: E0317 01:50:06.347901 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod704de771_367b_49f1_aa0a_8ecd53f733ea.slice\": RecentStats: unable to find data in memory cache]" Mar 17 01:50:06 crc kubenswrapper[4755]: I0317 01:50:06.878550 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561864-z4jdz"] Mar 17 01:50:06 crc kubenswrapper[4755]: I0317 01:50:06.926450 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561864-z4jdz"] Mar 17 01:50:08 crc kubenswrapper[4755]: I0317 01:50:08.262401 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02bbdb7c-d390-4c1a-bd25-169285b5ae31" 
path="/var/lib/kubelet/pods/02bbdb7c-d390-4c1a-bd25-169285b5ae31/volumes" Mar 17 01:50:09 crc kubenswrapper[4755]: I0317 01:50:09.248233 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:50:09 crc kubenswrapper[4755]: E0317 01:50:09.248954 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:50:16 crc kubenswrapper[4755]: E0317 01:50:16.643226 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod704de771_367b_49f1_aa0a_8ecd53f733ea.slice\": RecentStats: unable to find data in memory cache]" Mar 17 01:50:24 crc kubenswrapper[4755]: I0317 01:50:24.249278 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:50:24 crc kubenswrapper[4755]: E0317 01:50:24.250587 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:50:27 crc kubenswrapper[4755]: E0317 01:50:27.001101 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod704de771_367b_49f1_aa0a_8ecd53f733ea.slice\": RecentStats: unable to find data in memory cache]" Mar 17 01:50:37 crc kubenswrapper[4755]: E0317 01:50:37.282565 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod704de771_367b_49f1_aa0a_8ecd53f733ea.slice\": RecentStats: unable to find data in memory cache]" Mar 17 01:50:38 crc kubenswrapper[4755]: I0317 01:50:38.248509 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:50:38 crc kubenswrapper[4755]: E0317 01:50:38.249329 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:50:47 crc kubenswrapper[4755]: E0317 01:50:47.549440 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod704de771_367b_49f1_aa0a_8ecd53f733ea.slice\": RecentStats: unable to find data in memory cache]" Mar 17 01:50:53 crc kubenswrapper[4755]: I0317 01:50:53.248227 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:50:53 crc kubenswrapper[4755]: E0317 01:50:53.249149 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:50:57 crc kubenswrapper[4755]: E0317 01:50:57.845423 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod704de771_367b_49f1_aa0a_8ecd53f733ea.slice\": RecentStats: unable to find data in memory cache]" Mar 17 01:51:06 crc kubenswrapper[4755]: I0317 01:51:06.084417 4755 scope.go:117] "RemoveContainer" containerID="3d942e7889522bf5e7b5a7a4ada3c4fc7d1406bbc29737bb9a096b6955d31c2f" Mar 17 01:51:08 crc kubenswrapper[4755]: I0317 01:51:08.248347 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:51:08 crc kubenswrapper[4755]: E0317 01:51:08.249938 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:51:21 crc kubenswrapper[4755]: I0317 01:51:21.248001 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:51:21 crc kubenswrapper[4755]: E0317 01:51:21.249077 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:51:36 crc kubenswrapper[4755]: I0317 01:51:36.270933 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:51:36 crc kubenswrapper[4755]: E0317 01:51:36.273491 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:51:50 crc kubenswrapper[4755]: I0317 01:51:50.249010 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:51:50 crc kubenswrapper[4755]: E0317 01:51:50.251823 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:52:00 crc kubenswrapper[4755]: I0317 01:52:00.154321 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561872-k85xm"] Mar 17 01:52:00 crc kubenswrapper[4755]: E0317 01:52:00.155344 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="704de771-367b-49f1-aa0a-8ecd53f733ea" containerName="oc" Mar 17 01:52:00 crc kubenswrapper[4755]: I0317 01:52:00.155357 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="704de771-367b-49f1-aa0a-8ecd53f733ea" containerName="oc" 
Mar 17 01:52:00 crc kubenswrapper[4755]: E0317 01:52:00.155389 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3786b0-ad93-4f62-b6a2-4b0c2baf039e" containerName="extract-utilities" Mar 17 01:52:00 crc kubenswrapper[4755]: I0317 01:52:00.155400 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3786b0-ad93-4f62-b6a2-4b0c2baf039e" containerName="extract-utilities" Mar 17 01:52:00 crc kubenswrapper[4755]: E0317 01:52:00.155422 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3786b0-ad93-4f62-b6a2-4b0c2baf039e" containerName="extract-content" Mar 17 01:52:00 crc kubenswrapper[4755]: I0317 01:52:00.155432 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3786b0-ad93-4f62-b6a2-4b0c2baf039e" containerName="extract-content" Mar 17 01:52:00 crc kubenswrapper[4755]: E0317 01:52:00.155475 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3786b0-ad93-4f62-b6a2-4b0c2baf039e" containerName="registry-server" Mar 17 01:52:00 crc kubenswrapper[4755]: I0317 01:52:00.155484 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3786b0-ad93-4f62-b6a2-4b0c2baf039e" containerName="registry-server" Mar 17 01:52:00 crc kubenswrapper[4755]: I0317 01:52:00.155747 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="704de771-367b-49f1-aa0a-8ecd53f733ea" containerName="oc" Mar 17 01:52:00 crc kubenswrapper[4755]: I0317 01:52:00.155764 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3786b0-ad93-4f62-b6a2-4b0c2baf039e" containerName="registry-server" Mar 17 01:52:00 crc kubenswrapper[4755]: I0317 01:52:00.156678 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561872-k85xm" Mar 17 01:52:00 crc kubenswrapper[4755]: I0317 01:52:00.165978 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561872-k85xm"] Mar 17 01:52:00 crc kubenswrapper[4755]: I0317 01:52:00.187091 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkkdr\" (UniqueName: \"kubernetes.io/projected/c3a09072-1522-4650-ada2-0b45e9f44984-kube-api-access-xkkdr\") pod \"auto-csr-approver-29561872-k85xm\" (UID: \"c3a09072-1522-4650-ada2-0b45e9f44984\") " pod="openshift-infra/auto-csr-approver-29561872-k85xm" Mar 17 01:52:00 crc kubenswrapper[4755]: I0317 01:52:00.190044 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:52:00 crc kubenswrapper[4755]: I0317 01:52:00.190194 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:52:00 crc kubenswrapper[4755]: I0317 01:52:00.190254 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:52:00 crc kubenswrapper[4755]: I0317 01:52:00.288734 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkkdr\" (UniqueName: \"kubernetes.io/projected/c3a09072-1522-4650-ada2-0b45e9f44984-kube-api-access-xkkdr\") pod \"auto-csr-approver-29561872-k85xm\" (UID: \"c3a09072-1522-4650-ada2-0b45e9f44984\") " pod="openshift-infra/auto-csr-approver-29561872-k85xm" Mar 17 01:52:00 crc kubenswrapper[4755]: I0317 01:52:00.305318 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkkdr\" (UniqueName: \"kubernetes.io/projected/c3a09072-1522-4650-ada2-0b45e9f44984-kube-api-access-xkkdr\") pod \"auto-csr-approver-29561872-k85xm\" (UID: \"c3a09072-1522-4650-ada2-0b45e9f44984\") " 
pod="openshift-infra/auto-csr-approver-29561872-k85xm" Mar 17 01:52:00 crc kubenswrapper[4755]: I0317 01:52:00.500983 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561872-k85xm" Mar 17 01:52:01 crc kubenswrapper[4755]: I0317 01:52:01.011610 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561872-k85xm"] Mar 17 01:52:01 crc kubenswrapper[4755]: I0317 01:52:01.700331 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561872-k85xm" event={"ID":"c3a09072-1522-4650-ada2-0b45e9f44984","Type":"ContainerStarted","Data":"702a67418012962012b5771afce10080c0f1663dde8618bec200b6eb5ec2384d"} Mar 17 01:52:02 crc kubenswrapper[4755]: I0317 01:52:02.721550 4755 generic.go:334] "Generic (PLEG): container finished" podID="c3a09072-1522-4650-ada2-0b45e9f44984" containerID="7e5c18dc59e4a612bf95b599ab70e29e103feeb1015883235ed5b6f44109125d" exitCode=0 Mar 17 01:52:02 crc kubenswrapper[4755]: I0317 01:52:02.721679 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561872-k85xm" event={"ID":"c3a09072-1522-4650-ada2-0b45e9f44984","Type":"ContainerDied","Data":"7e5c18dc59e4a612bf95b599ab70e29e103feeb1015883235ed5b6f44109125d"} Mar 17 01:52:04 crc kubenswrapper[4755]: I0317 01:52:04.172359 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561872-k85xm" Mar 17 01:52:04 crc kubenswrapper[4755]: I0317 01:52:04.249239 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:52:04 crc kubenswrapper[4755]: E0317 01:52:04.249506 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:52:04 crc kubenswrapper[4755]: I0317 01:52:04.287595 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkkdr\" (UniqueName: \"kubernetes.io/projected/c3a09072-1522-4650-ada2-0b45e9f44984-kube-api-access-xkkdr\") pod \"c3a09072-1522-4650-ada2-0b45e9f44984\" (UID: \"c3a09072-1522-4650-ada2-0b45e9f44984\") " Mar 17 01:52:04 crc kubenswrapper[4755]: I0317 01:52:04.309335 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3a09072-1522-4650-ada2-0b45e9f44984-kube-api-access-xkkdr" (OuterVolumeSpecName: "kube-api-access-xkkdr") pod "c3a09072-1522-4650-ada2-0b45e9f44984" (UID: "c3a09072-1522-4650-ada2-0b45e9f44984"). InnerVolumeSpecName "kube-api-access-xkkdr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:52:04 crc kubenswrapper[4755]: I0317 01:52:04.390723 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkkdr\" (UniqueName: \"kubernetes.io/projected/c3a09072-1522-4650-ada2-0b45e9f44984-kube-api-access-xkkdr\") on node \"crc\" DevicePath \"\"" Mar 17 01:52:04 crc kubenswrapper[4755]: I0317 01:52:04.752665 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561872-k85xm" event={"ID":"c3a09072-1522-4650-ada2-0b45e9f44984","Type":"ContainerDied","Data":"702a67418012962012b5771afce10080c0f1663dde8618bec200b6eb5ec2384d"} Mar 17 01:52:04 crc kubenswrapper[4755]: I0317 01:52:04.752744 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="702a67418012962012b5771afce10080c0f1663dde8618bec200b6eb5ec2384d" Mar 17 01:52:04 crc kubenswrapper[4755]: I0317 01:52:04.752766 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561872-k85xm" Mar 17 01:52:05 crc kubenswrapper[4755]: I0317 01:52:05.276890 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561866-ldb7h"] Mar 17 01:52:05 crc kubenswrapper[4755]: I0317 01:52:05.294597 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561866-ldb7h"] Mar 17 01:52:06 crc kubenswrapper[4755]: I0317 01:52:06.274864 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62a3566f-07ce-4705-b6db-9ab9f2789ffb" path="/var/lib/kubelet/pods/62a3566f-07ce-4705-b6db-9ab9f2789ffb/volumes" Mar 17 01:52:16 crc kubenswrapper[4755]: I0317 01:52:16.248371 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:52:16 crc kubenswrapper[4755]: E0317 01:52:16.249420 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:52:30 crc kubenswrapper[4755]: I0317 01:52:30.249549 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:52:30 crc kubenswrapper[4755]: E0317 01:52:30.250366 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:52:43 crc kubenswrapper[4755]: I0317 01:52:43.248908 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:52:43 crc kubenswrapper[4755]: E0317 01:52:43.250215 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:52:54 crc kubenswrapper[4755]: I0317 01:52:54.249335 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:52:54 crc kubenswrapper[4755]: E0317 01:52:54.250672 4755 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:53:06 crc kubenswrapper[4755]: I0317 01:53:06.224272 4755 scope.go:117] "RemoveContainer" containerID="e1a5a3bc2fcc01369c0deebeb6144e3051676157175db9241ba4b5b6155194db" Mar 17 01:53:08 crc kubenswrapper[4755]: I0317 01:53:08.248805 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:53:08 crc kubenswrapper[4755]: E0317 01:53:08.249634 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:53:19 crc kubenswrapper[4755]: I0317 01:53:19.248290 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:53:19 crc kubenswrapper[4755]: E0317 01:53:19.249349 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:53:33 crc kubenswrapper[4755]: I0317 01:53:33.248078 4755 scope.go:117] 
"RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:53:33 crc kubenswrapper[4755]: E0317 01:53:33.249111 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:53:47 crc kubenswrapper[4755]: I0317 01:53:47.248481 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:53:47 crc kubenswrapper[4755]: E0317 01:53:47.249311 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:53:56 crc kubenswrapper[4755]: I0317 01:53:56.084541 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xzbhz"] Mar 17 01:53:56 crc kubenswrapper[4755]: E0317 01:53:56.087962 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3a09072-1522-4650-ada2-0b45e9f44984" containerName="oc" Mar 17 01:53:56 crc kubenswrapper[4755]: I0317 01:53:56.087996 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3a09072-1522-4650-ada2-0b45e9f44984" containerName="oc" Mar 17 01:53:56 crc kubenswrapper[4755]: I0317 01:53:56.088328 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3a09072-1522-4650-ada2-0b45e9f44984" containerName="oc" Mar 17 
01:53:56 crc kubenswrapper[4755]: I0317 01:53:56.092127 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzbhz" Mar 17 01:53:56 crc kubenswrapper[4755]: I0317 01:53:56.112069 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzbhz"] Mar 17 01:53:56 crc kubenswrapper[4755]: I0317 01:53:56.181743 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-utilities\") pod \"redhat-marketplace-xzbhz\" (UID: \"8545a96b-9a7a-47ab-aa74-cd60e3ee9381\") " pod="openshift-marketplace/redhat-marketplace-xzbhz" Mar 17 01:53:56 crc kubenswrapper[4755]: I0317 01:53:56.182321 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjpqp\" (UniqueName: \"kubernetes.io/projected/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-kube-api-access-tjpqp\") pod \"redhat-marketplace-xzbhz\" (UID: \"8545a96b-9a7a-47ab-aa74-cd60e3ee9381\") " pod="openshift-marketplace/redhat-marketplace-xzbhz" Mar 17 01:53:56 crc kubenswrapper[4755]: I0317 01:53:56.182372 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-catalog-content\") pod \"redhat-marketplace-xzbhz\" (UID: \"8545a96b-9a7a-47ab-aa74-cd60e3ee9381\") " pod="openshift-marketplace/redhat-marketplace-xzbhz" Mar 17 01:53:56 crc kubenswrapper[4755]: I0317 01:53:56.284485 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-utilities\") pod \"redhat-marketplace-xzbhz\" (UID: \"8545a96b-9a7a-47ab-aa74-cd60e3ee9381\") " pod="openshift-marketplace/redhat-marketplace-xzbhz" Mar 17 01:53:56 
crc kubenswrapper[4755]: I0317 01:53:56.284558 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjpqp\" (UniqueName: \"kubernetes.io/projected/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-kube-api-access-tjpqp\") pod \"redhat-marketplace-xzbhz\" (UID: \"8545a96b-9a7a-47ab-aa74-cd60e3ee9381\") " pod="openshift-marketplace/redhat-marketplace-xzbhz" Mar 17 01:53:56 crc kubenswrapper[4755]: I0317 01:53:56.284613 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-catalog-content\") pod \"redhat-marketplace-xzbhz\" (UID: \"8545a96b-9a7a-47ab-aa74-cd60e3ee9381\") " pod="openshift-marketplace/redhat-marketplace-xzbhz" Mar 17 01:53:56 crc kubenswrapper[4755]: I0317 01:53:56.285426 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-catalog-content\") pod \"redhat-marketplace-xzbhz\" (UID: \"8545a96b-9a7a-47ab-aa74-cd60e3ee9381\") " pod="openshift-marketplace/redhat-marketplace-xzbhz" Mar 17 01:53:56 crc kubenswrapper[4755]: I0317 01:53:56.285710 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-utilities\") pod \"redhat-marketplace-xzbhz\" (UID: \"8545a96b-9a7a-47ab-aa74-cd60e3ee9381\") " pod="openshift-marketplace/redhat-marketplace-xzbhz" Mar 17 01:53:56 crc kubenswrapper[4755]: I0317 01:53:56.303748 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjpqp\" (UniqueName: \"kubernetes.io/projected/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-kube-api-access-tjpqp\") pod \"redhat-marketplace-xzbhz\" (UID: \"8545a96b-9a7a-47ab-aa74-cd60e3ee9381\") " pod="openshift-marketplace/redhat-marketplace-xzbhz" Mar 17 01:53:56 crc kubenswrapper[4755]: I0317 
01:53:56.430585 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzbhz" Mar 17 01:53:56 crc kubenswrapper[4755]: I0317 01:53:56.963994 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzbhz"] Mar 17 01:53:57 crc kubenswrapper[4755]: I0317 01:53:57.257909 4755 generic.go:334] "Generic (PLEG): container finished" podID="8545a96b-9a7a-47ab-aa74-cd60e3ee9381" containerID="b74d0266dc88a71f05a5b9208d6e64e456384e513cc7633ecd9ec8044f818758" exitCode=0 Mar 17 01:53:57 crc kubenswrapper[4755]: I0317 01:53:57.257953 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzbhz" event={"ID":"8545a96b-9a7a-47ab-aa74-cd60e3ee9381","Type":"ContainerDied","Data":"b74d0266dc88a71f05a5b9208d6e64e456384e513cc7633ecd9ec8044f818758"} Mar 17 01:53:57 crc kubenswrapper[4755]: I0317 01:53:57.258165 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzbhz" event={"ID":"8545a96b-9a7a-47ab-aa74-cd60e3ee9381","Type":"ContainerStarted","Data":"c190b15cdeda07ad5087545972170afaf1ae50d88bdbf1fb7e867f461e76fca8"} Mar 17 01:53:57 crc kubenswrapper[4755]: I0317 01:53:57.272331 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:53:59 crc kubenswrapper[4755]: I0317 01:53:59.282039 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzbhz" event={"ID":"8545a96b-9a7a-47ab-aa74-cd60e3ee9381","Type":"ContainerStarted","Data":"9dc4a5cea429d1481cb389058427de9011873168f15ec1d9f8b9129826ec0ef3"} Mar 17 01:54:00 crc kubenswrapper[4755]: I0317 01:54:00.155666 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561874-p8vfx"] Mar 17 01:54:00 crc kubenswrapper[4755]: I0317 01:54:00.157997 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561874-p8vfx" Mar 17 01:54:00 crc kubenswrapper[4755]: I0317 01:54:00.170679 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561874-p8vfx"] Mar 17 01:54:00 crc kubenswrapper[4755]: I0317 01:54:00.195265 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:54:00 crc kubenswrapper[4755]: I0317 01:54:00.195350 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:54:00 crc kubenswrapper[4755]: I0317 01:54:00.195263 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:54:00 crc kubenswrapper[4755]: I0317 01:54:00.296533 4755 generic.go:334] "Generic (PLEG): container finished" podID="8545a96b-9a7a-47ab-aa74-cd60e3ee9381" containerID="9dc4a5cea429d1481cb389058427de9011873168f15ec1d9f8b9129826ec0ef3" exitCode=0 Mar 17 01:54:00 crc kubenswrapper[4755]: I0317 01:54:00.297316 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzbhz" event={"ID":"8545a96b-9a7a-47ab-aa74-cd60e3ee9381","Type":"ContainerDied","Data":"9dc4a5cea429d1481cb389058427de9011873168f15ec1d9f8b9129826ec0ef3"} Mar 17 01:54:00 crc kubenswrapper[4755]: I0317 01:54:00.298960 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlm46\" (UniqueName: \"kubernetes.io/projected/68d25cca-1c6f-4014-8e93-67fecd44920f-kube-api-access-xlm46\") pod \"auto-csr-approver-29561874-p8vfx\" (UID: \"68d25cca-1c6f-4014-8e93-67fecd44920f\") " pod="openshift-infra/auto-csr-approver-29561874-p8vfx" Mar 17 01:54:00 crc kubenswrapper[4755]: I0317 01:54:00.402227 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlm46\" (UniqueName: 
\"kubernetes.io/projected/68d25cca-1c6f-4014-8e93-67fecd44920f-kube-api-access-xlm46\") pod \"auto-csr-approver-29561874-p8vfx\" (UID: \"68d25cca-1c6f-4014-8e93-67fecd44920f\") " pod="openshift-infra/auto-csr-approver-29561874-p8vfx" Mar 17 01:54:00 crc kubenswrapper[4755]: I0317 01:54:00.857301 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlm46\" (UniqueName: \"kubernetes.io/projected/68d25cca-1c6f-4014-8e93-67fecd44920f-kube-api-access-xlm46\") pod \"auto-csr-approver-29561874-p8vfx\" (UID: \"68d25cca-1c6f-4014-8e93-67fecd44920f\") " pod="openshift-infra/auto-csr-approver-29561874-p8vfx" Mar 17 01:54:01 crc kubenswrapper[4755]: I0317 01:54:01.117081 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561874-p8vfx" Mar 17 01:54:01 crc kubenswrapper[4755]: I0317 01:54:01.319976 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzbhz" event={"ID":"8545a96b-9a7a-47ab-aa74-cd60e3ee9381","Type":"ContainerStarted","Data":"2d056e1cb71898571a9554adea35390a4e128666d21ea1effd682e5683ceaa6a"} Mar 17 01:54:01 crc kubenswrapper[4755]: I0317 01:54:01.375903 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xzbhz" podStartSLOduration=1.8847006039999998 podStartE2EDuration="5.375874383s" podCreationTimestamp="2026-03-17 01:53:56 +0000 UTC" firstStartedPulling="2026-03-17 01:53:57.272075328 +0000 UTC m=+5512.031527621" lastFinishedPulling="2026-03-17 01:54:00.763249107 +0000 UTC m=+5515.522701400" observedRunningTime="2026-03-17 01:54:01.3422842 +0000 UTC m=+5516.101736493" watchObservedRunningTime="2026-03-17 01:54:01.375874383 +0000 UTC m=+5516.135326676" Mar 17 01:54:01 crc kubenswrapper[4755]: W0317 01:54:01.660725 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68d25cca_1c6f_4014_8e93_67fecd44920f.slice/crio-f442b31bedbfbce110fee30c74d122076816f713c16b56735e62c930fd207bb6 WatchSource:0}: Error finding container f442b31bedbfbce110fee30c74d122076816f713c16b56735e62c930fd207bb6: Status 404 returned error can't find the container with id f442b31bedbfbce110fee30c74d122076816f713c16b56735e62c930fd207bb6 Mar 17 01:54:01 crc kubenswrapper[4755]: I0317 01:54:01.670064 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561874-p8vfx"] Mar 17 01:54:02 crc kubenswrapper[4755]: I0317 01:54:02.248513 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:54:02 crc kubenswrapper[4755]: E0317 01:54:02.248938 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:54:02 crc kubenswrapper[4755]: I0317 01:54:02.336392 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561874-p8vfx" event={"ID":"68d25cca-1c6f-4014-8e93-67fecd44920f","Type":"ContainerStarted","Data":"f442b31bedbfbce110fee30c74d122076816f713c16b56735e62c930fd207bb6"} Mar 17 01:54:03 crc kubenswrapper[4755]: I0317 01:54:03.347662 4755 generic.go:334] "Generic (PLEG): container finished" podID="68d25cca-1c6f-4014-8e93-67fecd44920f" containerID="ea4fc3441a28807fe50c272d91fe5f43e33cbdbb7710c8a53085f0979ea95670" exitCode=0 Mar 17 01:54:03 crc kubenswrapper[4755]: I0317 01:54:03.347840 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29561874-p8vfx" event={"ID":"68d25cca-1c6f-4014-8e93-67fecd44920f","Type":"ContainerDied","Data":"ea4fc3441a28807fe50c272d91fe5f43e33cbdbb7710c8a53085f0979ea95670"} Mar 17 01:54:04 crc kubenswrapper[4755]: I0317 01:54:04.866632 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561874-p8vfx" Mar 17 01:54:04 crc kubenswrapper[4755]: I0317 01:54:04.905949 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlm46\" (UniqueName: \"kubernetes.io/projected/68d25cca-1c6f-4014-8e93-67fecd44920f-kube-api-access-xlm46\") pod \"68d25cca-1c6f-4014-8e93-67fecd44920f\" (UID: \"68d25cca-1c6f-4014-8e93-67fecd44920f\") " Mar 17 01:54:04 crc kubenswrapper[4755]: I0317 01:54:04.912828 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d25cca-1c6f-4014-8e93-67fecd44920f-kube-api-access-xlm46" (OuterVolumeSpecName: "kube-api-access-xlm46") pod "68d25cca-1c6f-4014-8e93-67fecd44920f" (UID: "68d25cca-1c6f-4014-8e93-67fecd44920f"). InnerVolumeSpecName "kube-api-access-xlm46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:54:05 crc kubenswrapper[4755]: I0317 01:54:05.009035 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlm46\" (UniqueName: \"kubernetes.io/projected/68d25cca-1c6f-4014-8e93-67fecd44920f-kube-api-access-xlm46\") on node \"crc\" DevicePath \"\"" Mar 17 01:54:05 crc kubenswrapper[4755]: I0317 01:54:05.377504 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561874-p8vfx" event={"ID":"68d25cca-1c6f-4014-8e93-67fecd44920f","Type":"ContainerDied","Data":"f442b31bedbfbce110fee30c74d122076816f713c16b56735e62c930fd207bb6"} Mar 17 01:54:05 crc kubenswrapper[4755]: I0317 01:54:05.377576 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f442b31bedbfbce110fee30c74d122076816f713c16b56735e62c930fd207bb6" Mar 17 01:54:05 crc kubenswrapper[4755]: I0317 01:54:05.377639 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561874-p8vfx" Mar 17 01:54:05 crc kubenswrapper[4755]: I0317 01:54:05.971006 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561868-mprkf"] Mar 17 01:54:05 crc kubenswrapper[4755]: I0317 01:54:05.987751 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561868-mprkf"] Mar 17 01:54:06 crc kubenswrapper[4755]: I0317 01:54:06.265915 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="247d94b9-dd90-4df1-8e89-3f211bdf29d1" path="/var/lib/kubelet/pods/247d94b9-dd90-4df1-8e89-3f211bdf29d1/volumes" Mar 17 01:54:06 crc kubenswrapper[4755]: I0317 01:54:06.347765 4755 scope.go:117] "RemoveContainer" containerID="f2483a137e8106127227f3f4fb7631b22e0329ae555df4d2689bb6e6c48c0539" Mar 17 01:54:06 crc kubenswrapper[4755]: I0317 01:54:06.431400 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-xzbhz" Mar 17 01:54:06 crc kubenswrapper[4755]: I0317 01:54:06.431864 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xzbhz" Mar 17 01:54:06 crc kubenswrapper[4755]: I0317 01:54:06.488538 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xzbhz" Mar 17 01:54:07 crc kubenswrapper[4755]: I0317 01:54:07.476004 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xzbhz" Mar 17 01:54:07 crc kubenswrapper[4755]: I0317 01:54:07.547575 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzbhz"] Mar 17 01:54:09 crc kubenswrapper[4755]: I0317 01:54:09.419400 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xzbhz" podUID="8545a96b-9a7a-47ab-aa74-cd60e3ee9381" containerName="registry-server" containerID="cri-o://2d056e1cb71898571a9554adea35390a4e128666d21ea1effd682e5683ceaa6a" gracePeriod=2 Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.076618 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzbhz" Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.137616 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjpqp\" (UniqueName: \"kubernetes.io/projected/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-kube-api-access-tjpqp\") pod \"8545a96b-9a7a-47ab-aa74-cd60e3ee9381\" (UID: \"8545a96b-9a7a-47ab-aa74-cd60e3ee9381\") " Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.137787 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-utilities\") pod \"8545a96b-9a7a-47ab-aa74-cd60e3ee9381\" (UID: \"8545a96b-9a7a-47ab-aa74-cd60e3ee9381\") " Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.137839 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-catalog-content\") pod \"8545a96b-9a7a-47ab-aa74-cd60e3ee9381\" (UID: \"8545a96b-9a7a-47ab-aa74-cd60e3ee9381\") " Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.138577 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-utilities" (OuterVolumeSpecName: "utilities") pod "8545a96b-9a7a-47ab-aa74-cd60e3ee9381" (UID: "8545a96b-9a7a-47ab-aa74-cd60e3ee9381"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.146618 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-kube-api-access-tjpqp" (OuterVolumeSpecName: "kube-api-access-tjpqp") pod "8545a96b-9a7a-47ab-aa74-cd60e3ee9381" (UID: "8545a96b-9a7a-47ab-aa74-cd60e3ee9381"). InnerVolumeSpecName "kube-api-access-tjpqp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.162213 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8545a96b-9a7a-47ab-aa74-cd60e3ee9381" (UID: "8545a96b-9a7a-47ab-aa74-cd60e3ee9381"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.240828 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjpqp\" (UniqueName: \"kubernetes.io/projected/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-kube-api-access-tjpqp\") on node \"crc\" DevicePath \"\"" Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.240887 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.240899 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8545a96b-9a7a-47ab-aa74-cd60e3ee9381-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.439366 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzbhz" Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.439371 4755 generic.go:334] "Generic (PLEG): container finished" podID="8545a96b-9a7a-47ab-aa74-cd60e3ee9381" containerID="2d056e1cb71898571a9554adea35390a4e128666d21ea1effd682e5683ceaa6a" exitCode=0 Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.439424 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzbhz" event={"ID":"8545a96b-9a7a-47ab-aa74-cd60e3ee9381","Type":"ContainerDied","Data":"2d056e1cb71898571a9554adea35390a4e128666d21ea1effd682e5683ceaa6a"} Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.440062 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzbhz" event={"ID":"8545a96b-9a7a-47ab-aa74-cd60e3ee9381","Type":"ContainerDied","Data":"c190b15cdeda07ad5087545972170afaf1ae50d88bdbf1fb7e867f461e76fca8"} Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.440081 4755 scope.go:117] "RemoveContainer" containerID="2d056e1cb71898571a9554adea35390a4e128666d21ea1effd682e5683ceaa6a" Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.472797 4755 scope.go:117] "RemoveContainer" containerID="9dc4a5cea429d1481cb389058427de9011873168f15ec1d9f8b9129826ec0ef3" Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.473702 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzbhz"] Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.493520 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzbhz"] Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.502595 4755 scope.go:117] "RemoveContainer" containerID="b74d0266dc88a71f05a5b9208d6e64e456384e513cc7633ecd9ec8044f818758" Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.546924 4755 scope.go:117] "RemoveContainer" 
containerID="2d056e1cb71898571a9554adea35390a4e128666d21ea1effd682e5683ceaa6a" Mar 17 01:54:10 crc kubenswrapper[4755]: E0317 01:54:10.547422 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d056e1cb71898571a9554adea35390a4e128666d21ea1effd682e5683ceaa6a\": container with ID starting with 2d056e1cb71898571a9554adea35390a4e128666d21ea1effd682e5683ceaa6a not found: ID does not exist" containerID="2d056e1cb71898571a9554adea35390a4e128666d21ea1effd682e5683ceaa6a" Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.547468 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d056e1cb71898571a9554adea35390a4e128666d21ea1effd682e5683ceaa6a"} err="failed to get container status \"2d056e1cb71898571a9554adea35390a4e128666d21ea1effd682e5683ceaa6a\": rpc error: code = NotFound desc = could not find container \"2d056e1cb71898571a9554adea35390a4e128666d21ea1effd682e5683ceaa6a\": container with ID starting with 2d056e1cb71898571a9554adea35390a4e128666d21ea1effd682e5683ceaa6a not found: ID does not exist" Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.547488 4755 scope.go:117] "RemoveContainer" containerID="9dc4a5cea429d1481cb389058427de9011873168f15ec1d9f8b9129826ec0ef3" Mar 17 01:54:10 crc kubenswrapper[4755]: E0317 01:54:10.547758 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dc4a5cea429d1481cb389058427de9011873168f15ec1d9f8b9129826ec0ef3\": container with ID starting with 9dc4a5cea429d1481cb389058427de9011873168f15ec1d9f8b9129826ec0ef3 not found: ID does not exist" containerID="9dc4a5cea429d1481cb389058427de9011873168f15ec1d9f8b9129826ec0ef3" Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.547782 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9dc4a5cea429d1481cb389058427de9011873168f15ec1d9f8b9129826ec0ef3"} err="failed to get container status \"9dc4a5cea429d1481cb389058427de9011873168f15ec1d9f8b9129826ec0ef3\": rpc error: code = NotFound desc = could not find container \"9dc4a5cea429d1481cb389058427de9011873168f15ec1d9f8b9129826ec0ef3\": container with ID starting with 9dc4a5cea429d1481cb389058427de9011873168f15ec1d9f8b9129826ec0ef3 not found: ID does not exist" Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.547794 4755 scope.go:117] "RemoveContainer" containerID="b74d0266dc88a71f05a5b9208d6e64e456384e513cc7633ecd9ec8044f818758" Mar 17 01:54:10 crc kubenswrapper[4755]: E0317 01:54:10.548112 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b74d0266dc88a71f05a5b9208d6e64e456384e513cc7633ecd9ec8044f818758\": container with ID starting with b74d0266dc88a71f05a5b9208d6e64e456384e513cc7633ecd9ec8044f818758 not found: ID does not exist" containerID="b74d0266dc88a71f05a5b9208d6e64e456384e513cc7633ecd9ec8044f818758" Mar 17 01:54:10 crc kubenswrapper[4755]: I0317 01:54:10.548135 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b74d0266dc88a71f05a5b9208d6e64e456384e513cc7633ecd9ec8044f818758"} err="failed to get container status \"b74d0266dc88a71f05a5b9208d6e64e456384e513cc7633ecd9ec8044f818758\": rpc error: code = NotFound desc = could not find container \"b74d0266dc88a71f05a5b9208d6e64e456384e513cc7633ecd9ec8044f818758\": container with ID starting with b74d0266dc88a71f05a5b9208d6e64e456384e513cc7633ecd9ec8044f818758 not found: ID does not exist" Mar 17 01:54:12 crc kubenswrapper[4755]: I0317 01:54:12.259559 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8545a96b-9a7a-47ab-aa74-cd60e3ee9381" path="/var/lib/kubelet/pods/8545a96b-9a7a-47ab-aa74-cd60e3ee9381/volumes" Mar 17 01:54:17 crc kubenswrapper[4755]: I0317 
01:54:17.248724 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:54:17 crc kubenswrapper[4755]: E0317 01:54:17.250890 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 01:54:19 crc kubenswrapper[4755]: I0317 01:54:19.645943 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-54fgj"] Mar 17 01:54:19 crc kubenswrapper[4755]: E0317 01:54:19.646644 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8545a96b-9a7a-47ab-aa74-cd60e3ee9381" containerName="registry-server" Mar 17 01:54:19 crc kubenswrapper[4755]: I0317 01:54:19.646656 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8545a96b-9a7a-47ab-aa74-cd60e3ee9381" containerName="registry-server" Mar 17 01:54:19 crc kubenswrapper[4755]: E0317 01:54:19.646671 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8545a96b-9a7a-47ab-aa74-cd60e3ee9381" containerName="extract-content" Mar 17 01:54:19 crc kubenswrapper[4755]: I0317 01:54:19.646679 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8545a96b-9a7a-47ab-aa74-cd60e3ee9381" containerName="extract-content" Mar 17 01:54:19 crc kubenswrapper[4755]: E0317 01:54:19.646689 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d25cca-1c6f-4014-8e93-67fecd44920f" containerName="oc" Mar 17 01:54:19 crc kubenswrapper[4755]: I0317 01:54:19.646694 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d25cca-1c6f-4014-8e93-67fecd44920f" containerName="oc" Mar 17 01:54:19 crc 
kubenswrapper[4755]: E0317 01:54:19.646722 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8545a96b-9a7a-47ab-aa74-cd60e3ee9381" containerName="extract-utilities" Mar 17 01:54:19 crc kubenswrapper[4755]: I0317 01:54:19.646728 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8545a96b-9a7a-47ab-aa74-cd60e3ee9381" containerName="extract-utilities" Mar 17 01:54:19 crc kubenswrapper[4755]: I0317 01:54:19.646928 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8545a96b-9a7a-47ab-aa74-cd60e3ee9381" containerName="registry-server" Mar 17 01:54:19 crc kubenswrapper[4755]: I0317 01:54:19.646952 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d25cca-1c6f-4014-8e93-67fecd44920f" containerName="oc" Mar 17 01:54:19 crc kubenswrapper[4755]: I0317 01:54:19.648898 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54fgj" Mar 17 01:54:19 crc kubenswrapper[4755]: I0317 01:54:19.668635 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-54fgj"] Mar 17 01:54:19 crc kubenswrapper[4755]: I0317 01:54:19.717837 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77550708-f4d6-4bd2-901a-bad2b1813e2b-utilities\") pod \"certified-operators-54fgj\" (UID: \"77550708-f4d6-4bd2-901a-bad2b1813e2b\") " pod="openshift-marketplace/certified-operators-54fgj" Mar 17 01:54:19 crc kubenswrapper[4755]: I0317 01:54:19.718088 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77550708-f4d6-4bd2-901a-bad2b1813e2b-catalog-content\") pod \"certified-operators-54fgj\" (UID: \"77550708-f4d6-4bd2-901a-bad2b1813e2b\") " pod="openshift-marketplace/certified-operators-54fgj" Mar 17 01:54:19 crc 
kubenswrapper[4755]: I0317 01:54:19.718132 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs9dh\" (UniqueName: \"kubernetes.io/projected/77550708-f4d6-4bd2-901a-bad2b1813e2b-kube-api-access-xs9dh\") pod \"certified-operators-54fgj\" (UID: \"77550708-f4d6-4bd2-901a-bad2b1813e2b\") " pod="openshift-marketplace/certified-operators-54fgj" Mar 17 01:54:19 crc kubenswrapper[4755]: I0317 01:54:19.824584 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77550708-f4d6-4bd2-901a-bad2b1813e2b-catalog-content\") pod \"certified-operators-54fgj\" (UID: \"77550708-f4d6-4bd2-901a-bad2b1813e2b\") " pod="openshift-marketplace/certified-operators-54fgj" Mar 17 01:54:19 crc kubenswrapper[4755]: I0317 01:54:19.824703 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs9dh\" (UniqueName: \"kubernetes.io/projected/77550708-f4d6-4bd2-901a-bad2b1813e2b-kube-api-access-xs9dh\") pod \"certified-operators-54fgj\" (UID: \"77550708-f4d6-4bd2-901a-bad2b1813e2b\") " pod="openshift-marketplace/certified-operators-54fgj" Mar 17 01:54:19 crc kubenswrapper[4755]: I0317 01:54:19.824951 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77550708-f4d6-4bd2-901a-bad2b1813e2b-utilities\") pod \"certified-operators-54fgj\" (UID: \"77550708-f4d6-4bd2-901a-bad2b1813e2b\") " pod="openshift-marketplace/certified-operators-54fgj" Mar 17 01:54:19 crc kubenswrapper[4755]: I0317 01:54:19.826691 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77550708-f4d6-4bd2-901a-bad2b1813e2b-utilities\") pod \"certified-operators-54fgj\" (UID: \"77550708-f4d6-4bd2-901a-bad2b1813e2b\") " pod="openshift-marketplace/certified-operators-54fgj" Mar 17 01:54:19 crc 
kubenswrapper[4755]: I0317 01:54:19.826830 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77550708-f4d6-4bd2-901a-bad2b1813e2b-catalog-content\") pod \"certified-operators-54fgj\" (UID: \"77550708-f4d6-4bd2-901a-bad2b1813e2b\") " pod="openshift-marketplace/certified-operators-54fgj" Mar 17 01:54:19 crc kubenswrapper[4755]: I0317 01:54:19.858519 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs9dh\" (UniqueName: \"kubernetes.io/projected/77550708-f4d6-4bd2-901a-bad2b1813e2b-kube-api-access-xs9dh\") pod \"certified-operators-54fgj\" (UID: \"77550708-f4d6-4bd2-901a-bad2b1813e2b\") " pod="openshift-marketplace/certified-operators-54fgj" Mar 17 01:54:19 crc kubenswrapper[4755]: I0317 01:54:19.977458 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54fgj" Mar 17 01:54:20 crc kubenswrapper[4755]: I0317 01:54:20.521388 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-54fgj"] Mar 17 01:54:20 crc kubenswrapper[4755]: W0317 01:54:20.526796 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77550708_f4d6_4bd2_901a_bad2b1813e2b.slice/crio-24527f4357f49801bf178ef5af11797e9c851bc5ae54bbf49ed91b35e8e59052 WatchSource:0}: Error finding container 24527f4357f49801bf178ef5af11797e9c851bc5ae54bbf49ed91b35e8e59052: Status 404 returned error can't find the container with id 24527f4357f49801bf178ef5af11797e9c851bc5ae54bbf49ed91b35e8e59052 Mar 17 01:54:20 crc kubenswrapper[4755]: I0317 01:54:20.572352 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54fgj" event={"ID":"77550708-f4d6-4bd2-901a-bad2b1813e2b","Type":"ContainerStarted","Data":"24527f4357f49801bf178ef5af11797e9c851bc5ae54bbf49ed91b35e8e59052"} 
Mar 17 01:54:21 crc kubenswrapper[4755]: I0317 01:54:21.582719 4755 generic.go:334] "Generic (PLEG): container finished" podID="77550708-f4d6-4bd2-901a-bad2b1813e2b" containerID="8d5ed3f3f66cdabcfecf32851fb41f5ee641350e7933626f94d2cdcc47d82989" exitCode=0 Mar 17 01:54:21 crc kubenswrapper[4755]: I0317 01:54:21.582955 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54fgj" event={"ID":"77550708-f4d6-4bd2-901a-bad2b1813e2b","Type":"ContainerDied","Data":"8d5ed3f3f66cdabcfecf32851fb41f5ee641350e7933626f94d2cdcc47d82989"} Mar 17 01:54:28 crc kubenswrapper[4755]: I0317 01:54:28.668252 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54fgj" event={"ID":"77550708-f4d6-4bd2-901a-bad2b1813e2b","Type":"ContainerStarted","Data":"0f216dc1f7bfce7675d361d550b16c7214b19ac2c913549fe115f8cc1f7fdf1e"} Mar 17 01:54:29 crc kubenswrapper[4755]: I0317 01:54:29.248532 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:54:29 crc kubenswrapper[4755]: I0317 01:54:29.683847 4755 generic.go:334] "Generic (PLEG): container finished" podID="77550708-f4d6-4bd2-901a-bad2b1813e2b" containerID="0f216dc1f7bfce7675d361d550b16c7214b19ac2c913549fe115f8cc1f7fdf1e" exitCode=0 Mar 17 01:54:29 crc kubenswrapper[4755]: I0317 01:54:29.683989 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54fgj" event={"ID":"77550708-f4d6-4bd2-901a-bad2b1813e2b","Type":"ContainerDied","Data":"0f216dc1f7bfce7675d361d550b16c7214b19ac2c913549fe115f8cc1f7fdf1e"} Mar 17 01:54:29 crc kubenswrapper[4755]: I0317 01:54:29.688503 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"3741dc6ea40e99d3d3389ff9c8094526a2e12c8a81893f627dd1c394696b2190"} Mar 
17 01:54:30 crc kubenswrapper[4755]: I0317 01:54:30.711196 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54fgj" event={"ID":"77550708-f4d6-4bd2-901a-bad2b1813e2b","Type":"ContainerStarted","Data":"500516ed7f939bbb3c3328cdd9e51a2c4c90917527b2f3a5c3ee7100a86adacf"} Mar 17 01:54:30 crc kubenswrapper[4755]: I0317 01:54:30.752314 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-54fgj" podStartSLOduration=3.08843202 podStartE2EDuration="11.752294395s" podCreationTimestamp="2026-03-17 01:54:19 +0000 UTC" firstStartedPulling="2026-03-17 01:54:21.585375423 +0000 UTC m=+5536.344827706" lastFinishedPulling="2026-03-17 01:54:30.249237788 +0000 UTC m=+5545.008690081" observedRunningTime="2026-03-17 01:54:30.742435237 +0000 UTC m=+5545.501887530" watchObservedRunningTime="2026-03-17 01:54:30.752294395 +0000 UTC m=+5545.511746688" Mar 17 01:54:39 crc kubenswrapper[4755]: I0317 01:54:39.977810 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-54fgj" Mar 17 01:54:39 crc kubenswrapper[4755]: I0317 01:54:39.978295 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-54fgj" Mar 17 01:54:41 crc kubenswrapper[4755]: I0317 01:54:41.041208 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-54fgj" podUID="77550708-f4d6-4bd2-901a-bad2b1813e2b" containerName="registry-server" probeResult="failure" output=< Mar 17 01:54:41 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 01:54:41 crc kubenswrapper[4755]: > Mar 17 01:54:50 crc kubenswrapper[4755]: I0317 01:54:50.048374 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-54fgj" Mar 17 01:54:50 crc kubenswrapper[4755]: I0317 
01:54:50.123412 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-54fgj" Mar 17 01:54:50 crc kubenswrapper[4755]: I0317 01:54:50.713558 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-54fgj"] Mar 17 01:54:50 crc kubenswrapper[4755]: I0317 01:54:50.862505 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r2vmw"] Mar 17 01:54:50 crc kubenswrapper[4755]: I0317 01:54:50.862985 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r2vmw" podUID="ee6f1763-281b-4a4d-a0ae-2665122ed9b3" containerName="registry-server" containerID="cri-o://4d419de0b5df6a659848a4ba3aa4a807a053436151903e112f691922c6dcbfe4" gracePeriod=2 Mar 17 01:54:51 crc kubenswrapper[4755]: I0317 01:54:51.446910 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r2vmw" Mar 17 01:54:51 crc kubenswrapper[4755]: I0317 01:54:51.499202 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-catalog-content\") pod \"ee6f1763-281b-4a4d-a0ae-2665122ed9b3\" (UID: \"ee6f1763-281b-4a4d-a0ae-2665122ed9b3\") " Mar 17 01:54:51 crc kubenswrapper[4755]: I0317 01:54:51.499257 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thcpq\" (UniqueName: \"kubernetes.io/projected/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-kube-api-access-thcpq\") pod \"ee6f1763-281b-4a4d-a0ae-2665122ed9b3\" (UID: \"ee6f1763-281b-4a4d-a0ae-2665122ed9b3\") " Mar 17 01:54:51 crc kubenswrapper[4755]: I0317 01:54:51.499451 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-utilities\") pod \"ee6f1763-281b-4a4d-a0ae-2665122ed9b3\" (UID: \"ee6f1763-281b-4a4d-a0ae-2665122ed9b3\") " Mar 17 01:54:51 crc kubenswrapper[4755]: I0317 01:54:51.500880 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-utilities" (OuterVolumeSpecName: "utilities") pod "ee6f1763-281b-4a4d-a0ae-2665122ed9b3" (UID: "ee6f1763-281b-4a4d-a0ae-2665122ed9b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:54:51 crc kubenswrapper[4755]: I0317 01:54:51.506935 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-kube-api-access-thcpq" (OuterVolumeSpecName: "kube-api-access-thcpq") pod "ee6f1763-281b-4a4d-a0ae-2665122ed9b3" (UID: "ee6f1763-281b-4a4d-a0ae-2665122ed9b3"). InnerVolumeSpecName "kube-api-access-thcpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:54:51 crc kubenswrapper[4755]: I0317 01:54:51.583263 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee6f1763-281b-4a4d-a0ae-2665122ed9b3" (UID: "ee6f1763-281b-4a4d-a0ae-2665122ed9b3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:54:51 crc kubenswrapper[4755]: I0317 01:54:51.602733 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:54:51 crc kubenswrapper[4755]: I0317 01:54:51.602777 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:54:51 crc kubenswrapper[4755]: I0317 01:54:51.602791 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thcpq\" (UniqueName: \"kubernetes.io/projected/ee6f1763-281b-4a4d-a0ae-2665122ed9b3-kube-api-access-thcpq\") on node \"crc\" DevicePath \"\"" Mar 17 01:54:51 crc kubenswrapper[4755]: I0317 01:54:51.969629 4755 generic.go:334] "Generic (PLEG): container finished" podID="ee6f1763-281b-4a4d-a0ae-2665122ed9b3" containerID="4d419de0b5df6a659848a4ba3aa4a807a053436151903e112f691922c6dcbfe4" exitCode=0 Mar 17 01:54:51 crc kubenswrapper[4755]: I0317 01:54:51.969738 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r2vmw" Mar 17 01:54:51 crc kubenswrapper[4755]: I0317 01:54:51.969761 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2vmw" event={"ID":"ee6f1763-281b-4a4d-a0ae-2665122ed9b3","Type":"ContainerDied","Data":"4d419de0b5df6a659848a4ba3aa4a807a053436151903e112f691922c6dcbfe4"} Mar 17 01:54:51 crc kubenswrapper[4755]: I0317 01:54:51.971313 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2vmw" event={"ID":"ee6f1763-281b-4a4d-a0ae-2665122ed9b3","Type":"ContainerDied","Data":"af8267a85848d58fc5dd1005b6bdc3c3db1a8ec032e653b5cc7ce46ac5ec56be"} Mar 17 01:54:51 crc kubenswrapper[4755]: I0317 01:54:51.972285 4755 scope.go:117] "RemoveContainer" containerID="4d419de0b5df6a659848a4ba3aa4a807a053436151903e112f691922c6dcbfe4" Mar 17 01:54:52 crc kubenswrapper[4755]: I0317 01:54:52.026755 4755 scope.go:117] "RemoveContainer" containerID="4f45208792d07662e8839a5189e3bfa7b60ca70d5c7525c4fd4e08295d0255e7" Mar 17 01:54:52 crc kubenswrapper[4755]: I0317 01:54:52.031665 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r2vmw"] Mar 17 01:54:52 crc kubenswrapper[4755]: I0317 01:54:52.041973 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r2vmw"] Mar 17 01:54:52 crc kubenswrapper[4755]: I0317 01:54:52.058763 4755 scope.go:117] "RemoveContainer" containerID="6b0f09cb88475a18ee0261a29af2cd8d93fdecd19a7bb7e1407a552e03d677c5" Mar 17 01:54:52 crc kubenswrapper[4755]: I0317 01:54:52.117891 4755 scope.go:117] "RemoveContainer" containerID="4d419de0b5df6a659848a4ba3aa4a807a053436151903e112f691922c6dcbfe4" Mar 17 01:54:52 crc kubenswrapper[4755]: E0317 01:54:52.118417 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4d419de0b5df6a659848a4ba3aa4a807a053436151903e112f691922c6dcbfe4\": container with ID starting with 4d419de0b5df6a659848a4ba3aa4a807a053436151903e112f691922c6dcbfe4 not found: ID does not exist" containerID="4d419de0b5df6a659848a4ba3aa4a807a053436151903e112f691922c6dcbfe4" Mar 17 01:54:52 crc kubenswrapper[4755]: I0317 01:54:52.118460 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d419de0b5df6a659848a4ba3aa4a807a053436151903e112f691922c6dcbfe4"} err="failed to get container status \"4d419de0b5df6a659848a4ba3aa4a807a053436151903e112f691922c6dcbfe4\": rpc error: code = NotFound desc = could not find container \"4d419de0b5df6a659848a4ba3aa4a807a053436151903e112f691922c6dcbfe4\": container with ID starting with 4d419de0b5df6a659848a4ba3aa4a807a053436151903e112f691922c6dcbfe4 not found: ID does not exist" Mar 17 01:54:52 crc kubenswrapper[4755]: I0317 01:54:52.118484 4755 scope.go:117] "RemoveContainer" containerID="4f45208792d07662e8839a5189e3bfa7b60ca70d5c7525c4fd4e08295d0255e7" Mar 17 01:54:52 crc kubenswrapper[4755]: E0317 01:54:52.118857 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f45208792d07662e8839a5189e3bfa7b60ca70d5c7525c4fd4e08295d0255e7\": container with ID starting with 4f45208792d07662e8839a5189e3bfa7b60ca70d5c7525c4fd4e08295d0255e7 not found: ID does not exist" containerID="4f45208792d07662e8839a5189e3bfa7b60ca70d5c7525c4fd4e08295d0255e7" Mar 17 01:54:52 crc kubenswrapper[4755]: I0317 01:54:52.118889 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f45208792d07662e8839a5189e3bfa7b60ca70d5c7525c4fd4e08295d0255e7"} err="failed to get container status \"4f45208792d07662e8839a5189e3bfa7b60ca70d5c7525c4fd4e08295d0255e7\": rpc error: code = NotFound desc = could not find container \"4f45208792d07662e8839a5189e3bfa7b60ca70d5c7525c4fd4e08295d0255e7\": container with ID 
starting with 4f45208792d07662e8839a5189e3bfa7b60ca70d5c7525c4fd4e08295d0255e7 not found: ID does not exist" Mar 17 01:54:52 crc kubenswrapper[4755]: I0317 01:54:52.118902 4755 scope.go:117] "RemoveContainer" containerID="6b0f09cb88475a18ee0261a29af2cd8d93fdecd19a7bb7e1407a552e03d677c5" Mar 17 01:54:52 crc kubenswrapper[4755]: E0317 01:54:52.120072 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b0f09cb88475a18ee0261a29af2cd8d93fdecd19a7bb7e1407a552e03d677c5\": container with ID starting with 6b0f09cb88475a18ee0261a29af2cd8d93fdecd19a7bb7e1407a552e03d677c5 not found: ID does not exist" containerID="6b0f09cb88475a18ee0261a29af2cd8d93fdecd19a7bb7e1407a552e03d677c5" Mar 17 01:54:52 crc kubenswrapper[4755]: I0317 01:54:52.120094 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b0f09cb88475a18ee0261a29af2cd8d93fdecd19a7bb7e1407a552e03d677c5"} err="failed to get container status \"6b0f09cb88475a18ee0261a29af2cd8d93fdecd19a7bb7e1407a552e03d677c5\": rpc error: code = NotFound desc = could not find container \"6b0f09cb88475a18ee0261a29af2cd8d93fdecd19a7bb7e1407a552e03d677c5\": container with ID starting with 6b0f09cb88475a18ee0261a29af2cd8d93fdecd19a7bb7e1407a552e03d677c5 not found: ID does not exist" Mar 17 01:54:52 crc kubenswrapper[4755]: I0317 01:54:52.268534 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee6f1763-281b-4a4d-a0ae-2665122ed9b3" path="/var/lib/kubelet/pods/ee6f1763-281b-4a4d-a0ae-2665122ed9b3/volumes" Mar 17 01:56:00 crc kubenswrapper[4755]: I0317 01:56:00.151183 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561876-sh2s9"] Mar 17 01:56:00 crc kubenswrapper[4755]: E0317 01:56:00.152468 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6f1763-281b-4a4d-a0ae-2665122ed9b3" containerName="extract-utilities" Mar 17 01:56:00 crc 
kubenswrapper[4755]: I0317 01:56:00.152487 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6f1763-281b-4a4d-a0ae-2665122ed9b3" containerName="extract-utilities" Mar 17 01:56:00 crc kubenswrapper[4755]: E0317 01:56:00.152519 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6f1763-281b-4a4d-a0ae-2665122ed9b3" containerName="extract-content" Mar 17 01:56:00 crc kubenswrapper[4755]: I0317 01:56:00.152529 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6f1763-281b-4a4d-a0ae-2665122ed9b3" containerName="extract-content" Mar 17 01:56:00 crc kubenswrapper[4755]: E0317 01:56:00.152554 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6f1763-281b-4a4d-a0ae-2665122ed9b3" containerName="registry-server" Mar 17 01:56:00 crc kubenswrapper[4755]: I0317 01:56:00.152563 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6f1763-281b-4a4d-a0ae-2665122ed9b3" containerName="registry-server" Mar 17 01:56:00 crc kubenswrapper[4755]: I0317 01:56:00.152846 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6f1763-281b-4a4d-a0ae-2665122ed9b3" containerName="registry-server" Mar 17 01:56:00 crc kubenswrapper[4755]: I0317 01:56:00.153826 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561876-sh2s9" Mar 17 01:56:00 crc kubenswrapper[4755]: I0317 01:56:00.156457 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:56:00 crc kubenswrapper[4755]: I0317 01:56:00.156634 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:56:00 crc kubenswrapper[4755]: I0317 01:56:00.156793 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:56:00 crc kubenswrapper[4755]: I0317 01:56:00.167848 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561876-sh2s9"] Mar 17 01:56:00 crc kubenswrapper[4755]: I0317 01:56:00.272858 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5jtr\" (UniqueName: \"kubernetes.io/projected/54f9a408-67e3-4bc3-9638-46fe78f7757f-kube-api-access-x5jtr\") pod \"auto-csr-approver-29561876-sh2s9\" (UID: \"54f9a408-67e3-4bc3-9638-46fe78f7757f\") " pod="openshift-infra/auto-csr-approver-29561876-sh2s9" Mar 17 01:56:00 crc kubenswrapper[4755]: I0317 01:56:00.376866 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5jtr\" (UniqueName: \"kubernetes.io/projected/54f9a408-67e3-4bc3-9638-46fe78f7757f-kube-api-access-x5jtr\") pod \"auto-csr-approver-29561876-sh2s9\" (UID: \"54f9a408-67e3-4bc3-9638-46fe78f7757f\") " pod="openshift-infra/auto-csr-approver-29561876-sh2s9" Mar 17 01:56:00 crc kubenswrapper[4755]: I0317 01:56:00.401263 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5jtr\" (UniqueName: \"kubernetes.io/projected/54f9a408-67e3-4bc3-9638-46fe78f7757f-kube-api-access-x5jtr\") pod \"auto-csr-approver-29561876-sh2s9\" (UID: \"54f9a408-67e3-4bc3-9638-46fe78f7757f\") " 
pod="openshift-infra/auto-csr-approver-29561876-sh2s9" Mar 17 01:56:00 crc kubenswrapper[4755]: I0317 01:56:00.480031 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561876-sh2s9" Mar 17 01:56:01 crc kubenswrapper[4755]: W0317 01:56:01.011333 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54f9a408_67e3_4bc3_9638_46fe78f7757f.slice/crio-69c0270a587a7c395cb0556fc678172832bdd82aee94f9609d29c6a78dfa811d WatchSource:0}: Error finding container 69c0270a587a7c395cb0556fc678172832bdd82aee94f9609d29c6a78dfa811d: Status 404 returned error can't find the container with id 69c0270a587a7c395cb0556fc678172832bdd82aee94f9609d29c6a78dfa811d Mar 17 01:56:01 crc kubenswrapper[4755]: I0317 01:56:01.013481 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561876-sh2s9"] Mar 17 01:56:01 crc kubenswrapper[4755]: I0317 01:56:01.935227 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561876-sh2s9" event={"ID":"54f9a408-67e3-4bc3-9638-46fe78f7757f","Type":"ContainerStarted","Data":"69c0270a587a7c395cb0556fc678172832bdd82aee94f9609d29c6a78dfa811d"} Mar 17 01:56:02 crc kubenswrapper[4755]: I0317 01:56:02.953032 4755 generic.go:334] "Generic (PLEG): container finished" podID="54f9a408-67e3-4bc3-9638-46fe78f7757f" containerID="5a29debe2a5d275ca8e1d91f7aadb8593d31a174918d0bb8d40ecc7fee109552" exitCode=0 Mar 17 01:56:02 crc kubenswrapper[4755]: I0317 01:56:02.953137 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561876-sh2s9" event={"ID":"54f9a408-67e3-4bc3-9638-46fe78f7757f","Type":"ContainerDied","Data":"5a29debe2a5d275ca8e1d91f7aadb8593d31a174918d0bb8d40ecc7fee109552"} Mar 17 01:56:04 crc kubenswrapper[4755]: I0317 01:56:04.443512 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561876-sh2s9" Mar 17 01:56:04 crc kubenswrapper[4755]: I0317 01:56:04.481619 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5jtr\" (UniqueName: \"kubernetes.io/projected/54f9a408-67e3-4bc3-9638-46fe78f7757f-kube-api-access-x5jtr\") pod \"54f9a408-67e3-4bc3-9638-46fe78f7757f\" (UID: \"54f9a408-67e3-4bc3-9638-46fe78f7757f\") " Mar 17 01:56:04 crc kubenswrapper[4755]: I0317 01:56:04.492817 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54f9a408-67e3-4bc3-9638-46fe78f7757f-kube-api-access-x5jtr" (OuterVolumeSpecName: "kube-api-access-x5jtr") pod "54f9a408-67e3-4bc3-9638-46fe78f7757f" (UID: "54f9a408-67e3-4bc3-9638-46fe78f7757f"). InnerVolumeSpecName "kube-api-access-x5jtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:56:04 crc kubenswrapper[4755]: I0317 01:56:04.585322 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5jtr\" (UniqueName: \"kubernetes.io/projected/54f9a408-67e3-4bc3-9638-46fe78f7757f-kube-api-access-x5jtr\") on node \"crc\" DevicePath \"\"" Mar 17 01:56:04 crc kubenswrapper[4755]: I0317 01:56:04.979057 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561876-sh2s9" event={"ID":"54f9a408-67e3-4bc3-9638-46fe78f7757f","Type":"ContainerDied","Data":"69c0270a587a7c395cb0556fc678172832bdd82aee94f9609d29c6a78dfa811d"} Mar 17 01:56:04 crc kubenswrapper[4755]: I0317 01:56:04.979112 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69c0270a587a7c395cb0556fc678172832bdd82aee94f9609d29c6a78dfa811d" Mar 17 01:56:04 crc kubenswrapper[4755]: I0317 01:56:04.979116 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561876-sh2s9" Mar 17 01:56:05 crc kubenswrapper[4755]: I0317 01:56:05.533416 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561870-hkl6h"] Mar 17 01:56:05 crc kubenswrapper[4755]: I0317 01:56:05.575078 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561870-hkl6h"] Mar 17 01:56:06 crc kubenswrapper[4755]: I0317 01:56:06.269232 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="704de771-367b-49f1-aa0a-8ecd53f733ea" path="/var/lib/kubelet/pods/704de771-367b-49f1-aa0a-8ecd53f733ea/volumes" Mar 17 01:56:06 crc kubenswrapper[4755]: I0317 01:56:06.539347 4755 scope.go:117] "RemoveContainer" containerID="f1c0db0a41a037daf6dd508dc9b495f4cc1a753008d0eda39c431f3308455f39" Mar 17 01:56:58 crc kubenswrapper[4755]: I0317 01:56:58.665213 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:56:58 crc kubenswrapper[4755]: I0317 01:56:58.666259 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:57:28 crc kubenswrapper[4755]: I0317 01:57:28.665468 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:57:28 crc kubenswrapper[4755]: 
I0317 01:57:28.666071 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:57:43 crc kubenswrapper[4755]: E0317 01:57:43.928772 4755 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.32:35886->38.102.83.32:36119: read tcp 38.102.83.32:35886->38.102.83.32:36119: read: connection reset by peer Mar 17 01:57:58 crc kubenswrapper[4755]: I0317 01:57:58.666062 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 01:57:58 crc kubenswrapper[4755]: I0317 01:57:58.666959 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 01:57:58 crc kubenswrapper[4755]: I0317 01:57:58.667040 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 01:57:58 crc kubenswrapper[4755]: I0317 01:57:58.668476 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3741dc6ea40e99d3d3389ff9c8094526a2e12c8a81893f627dd1c394696b2190"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Mar 17 01:57:58 crc kubenswrapper[4755]: I0317 01:57:58.668581 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://3741dc6ea40e99d3d3389ff9c8094526a2e12c8a81893f627dd1c394696b2190" gracePeriod=600 Mar 17 01:57:59 crc kubenswrapper[4755]: I0317 01:57:59.528363 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="3741dc6ea40e99d3d3389ff9c8094526a2e12c8a81893f627dd1c394696b2190" exitCode=0 Mar 17 01:57:59 crc kubenswrapper[4755]: I0317 01:57:59.528480 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"3741dc6ea40e99d3d3389ff9c8094526a2e12c8a81893f627dd1c394696b2190"} Mar 17 01:57:59 crc kubenswrapper[4755]: I0317 01:57:59.529566 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8"} Mar 17 01:57:59 crc kubenswrapper[4755]: I0317 01:57:59.529645 4755 scope.go:117] "RemoveContainer" containerID="c2280e6a6a3c11519092571874c237a72d1ba53a6f115e379a8a574b379f597e" Mar 17 01:58:00 crc kubenswrapper[4755]: I0317 01:58:00.144980 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561878-8dp55"] Mar 17 01:58:00 crc kubenswrapper[4755]: E0317 01:58:00.145866 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54f9a408-67e3-4bc3-9638-46fe78f7757f" containerName="oc" Mar 17 01:58:00 crc kubenswrapper[4755]: I0317 01:58:00.145882 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="54f9a408-67e3-4bc3-9638-46fe78f7757f" containerName="oc" Mar 17 01:58:00 crc kubenswrapper[4755]: I0317 01:58:00.146157 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="54f9a408-67e3-4bc3-9638-46fe78f7757f" containerName="oc" Mar 17 01:58:00 crc kubenswrapper[4755]: I0317 01:58:00.147188 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561878-8dp55" Mar 17 01:58:00 crc kubenswrapper[4755]: I0317 01:58:00.149568 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 01:58:00 crc kubenswrapper[4755]: I0317 01:58:00.149692 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 01:58:00 crc kubenswrapper[4755]: I0317 01:58:00.152338 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 01:58:00 crc kubenswrapper[4755]: I0317 01:58:00.156632 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561878-8dp55"] Mar 17 01:58:00 crc kubenswrapper[4755]: I0317 01:58:00.267233 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbszg\" (UniqueName: \"kubernetes.io/projected/9c07e080-a2ca-4991-b83a-3c7f40325574-kube-api-access-nbszg\") pod \"auto-csr-approver-29561878-8dp55\" (UID: \"9c07e080-a2ca-4991-b83a-3c7f40325574\") " pod="openshift-infra/auto-csr-approver-29561878-8dp55" Mar 17 01:58:00 crc kubenswrapper[4755]: I0317 01:58:00.369906 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbszg\" (UniqueName: \"kubernetes.io/projected/9c07e080-a2ca-4991-b83a-3c7f40325574-kube-api-access-nbszg\") pod \"auto-csr-approver-29561878-8dp55\" (UID: \"9c07e080-a2ca-4991-b83a-3c7f40325574\") " 
pod="openshift-infra/auto-csr-approver-29561878-8dp55" Mar 17 01:58:00 crc kubenswrapper[4755]: I0317 01:58:00.392819 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbszg\" (UniqueName: \"kubernetes.io/projected/9c07e080-a2ca-4991-b83a-3c7f40325574-kube-api-access-nbszg\") pod \"auto-csr-approver-29561878-8dp55\" (UID: \"9c07e080-a2ca-4991-b83a-3c7f40325574\") " pod="openshift-infra/auto-csr-approver-29561878-8dp55" Mar 17 01:58:00 crc kubenswrapper[4755]: I0317 01:58:00.477033 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561878-8dp55" Mar 17 01:58:01 crc kubenswrapper[4755]: I0317 01:58:01.009712 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561878-8dp55"] Mar 17 01:58:01 crc kubenswrapper[4755]: I0317 01:58:01.593424 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561878-8dp55" event={"ID":"9c07e080-a2ca-4991-b83a-3c7f40325574","Type":"ContainerStarted","Data":"cbd2fed90a4f0af51dda6609ab524c9f4ec76fe6d89e5ea724b62c5be328b455"} Mar 17 01:58:03 crc kubenswrapper[4755]: I0317 01:58:03.623758 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561878-8dp55" event={"ID":"9c07e080-a2ca-4991-b83a-3c7f40325574","Type":"ContainerStarted","Data":"8619e1e5097e43b0e53fd0378cdcd7c4e1a661bb365e1b56723ccb4fefcc046a"} Mar 17 01:58:03 crc kubenswrapper[4755]: I0317 01:58:03.649121 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561878-8dp55" podStartSLOduration=2.542941136 podStartE2EDuration="3.649093414s" podCreationTimestamp="2026-03-17 01:58:00 +0000 UTC" firstStartedPulling="2026-03-17 01:58:01.582753091 +0000 UTC m=+5756.342205374" lastFinishedPulling="2026-03-17 01:58:02.688905329 +0000 UTC m=+5757.448357652" observedRunningTime="2026-03-17 
01:58:03.643706771 +0000 UTC m=+5758.403159094" watchObservedRunningTime="2026-03-17 01:58:03.649093414 +0000 UTC m=+5758.408545767" Mar 17 01:58:04 crc kubenswrapper[4755]: I0317 01:58:04.637738 4755 generic.go:334] "Generic (PLEG): container finished" podID="9c07e080-a2ca-4991-b83a-3c7f40325574" containerID="8619e1e5097e43b0e53fd0378cdcd7c4e1a661bb365e1b56723ccb4fefcc046a" exitCode=0 Mar 17 01:58:04 crc kubenswrapper[4755]: I0317 01:58:04.637846 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561878-8dp55" event={"ID":"9c07e080-a2ca-4991-b83a-3c7f40325574","Type":"ContainerDied","Data":"8619e1e5097e43b0e53fd0378cdcd7c4e1a661bb365e1b56723ccb4fefcc046a"} Mar 17 01:58:06 crc kubenswrapper[4755]: I0317 01:58:06.105848 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561878-8dp55" Mar 17 01:58:06 crc kubenswrapper[4755]: I0317 01:58:06.242105 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbszg\" (UniqueName: \"kubernetes.io/projected/9c07e080-a2ca-4991-b83a-3c7f40325574-kube-api-access-nbszg\") pod \"9c07e080-a2ca-4991-b83a-3c7f40325574\" (UID: \"9c07e080-a2ca-4991-b83a-3c7f40325574\") " Mar 17 01:58:06 crc kubenswrapper[4755]: I0317 01:58:06.247629 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c07e080-a2ca-4991-b83a-3c7f40325574-kube-api-access-nbszg" (OuterVolumeSpecName: "kube-api-access-nbszg") pod "9c07e080-a2ca-4991-b83a-3c7f40325574" (UID: "9c07e080-a2ca-4991-b83a-3c7f40325574"). InnerVolumeSpecName "kube-api-access-nbszg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:58:06 crc kubenswrapper[4755]: I0317 01:58:06.345273 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbszg\" (UniqueName: \"kubernetes.io/projected/9c07e080-a2ca-4991-b83a-3c7f40325574-kube-api-access-nbszg\") on node \"crc\" DevicePath \"\"" Mar 17 01:58:06 crc kubenswrapper[4755]: I0317 01:58:06.677571 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561878-8dp55" Mar 17 01:58:06 crc kubenswrapper[4755]: I0317 01:58:06.679775 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561878-8dp55" event={"ID":"9c07e080-a2ca-4991-b83a-3c7f40325574","Type":"ContainerDied","Data":"cbd2fed90a4f0af51dda6609ab524c9f4ec76fe6d89e5ea724b62c5be328b455"} Mar 17 01:58:06 crc kubenswrapper[4755]: I0317 01:58:06.680040 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbd2fed90a4f0af51dda6609ab524c9f4ec76fe6d89e5ea724b62c5be328b455" Mar 17 01:58:06 crc kubenswrapper[4755]: I0317 01:58:06.736236 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561872-k85xm"] Mar 17 01:58:06 crc kubenswrapper[4755]: I0317 01:58:06.749398 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561872-k85xm"] Mar 17 01:58:08 crc kubenswrapper[4755]: I0317 01:58:08.265815 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3a09072-1522-4650-ada2-0b45e9f44984" path="/var/lib/kubelet/pods/c3a09072-1522-4650-ada2-0b45e9f44984/volumes" Mar 17 01:58:09 crc kubenswrapper[4755]: I0317 01:58:09.340723 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w8j9m"] Mar 17 01:58:09 crc kubenswrapper[4755]: E0317 01:58:09.341780 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9c07e080-a2ca-4991-b83a-3c7f40325574" containerName="oc" Mar 17 01:58:09 crc kubenswrapper[4755]: I0317 01:58:09.341802 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c07e080-a2ca-4991-b83a-3c7f40325574" containerName="oc" Mar 17 01:58:09 crc kubenswrapper[4755]: I0317 01:58:09.342259 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c07e080-a2ca-4991-b83a-3c7f40325574" containerName="oc" Mar 17 01:58:09 crc kubenswrapper[4755]: I0317 01:58:09.344471 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8j9m" Mar 17 01:58:09 crc kubenswrapper[4755]: I0317 01:58:09.357608 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w8j9m"] Mar 17 01:58:09 crc kubenswrapper[4755]: I0317 01:58:09.524885 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-catalog-content\") pod \"community-operators-w8j9m\" (UID: \"b6600f3f-d5c4-476a-b791-74ce8fed9ce8\") " pod="openshift-marketplace/community-operators-w8j9m" Mar 17 01:58:09 crc kubenswrapper[4755]: I0317 01:58:09.525234 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnfxb\" (UniqueName: \"kubernetes.io/projected/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-kube-api-access-qnfxb\") pod \"community-operators-w8j9m\" (UID: \"b6600f3f-d5c4-476a-b791-74ce8fed9ce8\") " pod="openshift-marketplace/community-operators-w8j9m" Mar 17 01:58:09 crc kubenswrapper[4755]: I0317 01:58:09.525287 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-utilities\") pod \"community-operators-w8j9m\" (UID: \"b6600f3f-d5c4-476a-b791-74ce8fed9ce8\") " 
pod="openshift-marketplace/community-operators-w8j9m" Mar 17 01:58:09 crc kubenswrapper[4755]: I0317 01:58:09.627099 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-utilities\") pod \"community-operators-w8j9m\" (UID: \"b6600f3f-d5c4-476a-b791-74ce8fed9ce8\") " pod="openshift-marketplace/community-operators-w8j9m" Mar 17 01:58:09 crc kubenswrapper[4755]: I0317 01:58:09.627187 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-catalog-content\") pod \"community-operators-w8j9m\" (UID: \"b6600f3f-d5c4-476a-b791-74ce8fed9ce8\") " pod="openshift-marketplace/community-operators-w8j9m" Mar 17 01:58:09 crc kubenswrapper[4755]: I0317 01:58:09.627322 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnfxb\" (UniqueName: \"kubernetes.io/projected/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-kube-api-access-qnfxb\") pod \"community-operators-w8j9m\" (UID: \"b6600f3f-d5c4-476a-b791-74ce8fed9ce8\") " pod="openshift-marketplace/community-operators-w8j9m" Mar 17 01:58:09 crc kubenswrapper[4755]: I0317 01:58:09.627661 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-utilities\") pod \"community-operators-w8j9m\" (UID: \"b6600f3f-d5c4-476a-b791-74ce8fed9ce8\") " pod="openshift-marketplace/community-operators-w8j9m" Mar 17 01:58:09 crc kubenswrapper[4755]: I0317 01:58:09.627858 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-catalog-content\") pod \"community-operators-w8j9m\" (UID: \"b6600f3f-d5c4-476a-b791-74ce8fed9ce8\") " 
pod="openshift-marketplace/community-operators-w8j9m" Mar 17 01:58:09 crc kubenswrapper[4755]: I0317 01:58:09.652243 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnfxb\" (UniqueName: \"kubernetes.io/projected/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-kube-api-access-qnfxb\") pod \"community-operators-w8j9m\" (UID: \"b6600f3f-d5c4-476a-b791-74ce8fed9ce8\") " pod="openshift-marketplace/community-operators-w8j9m" Mar 17 01:58:09 crc kubenswrapper[4755]: I0317 01:58:09.678892 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8j9m" Mar 17 01:58:10 crc kubenswrapper[4755]: I0317 01:58:10.182800 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w8j9m"] Mar 17 01:58:10 crc kubenswrapper[4755]: I0317 01:58:10.721520 4755 generic.go:334] "Generic (PLEG): container finished" podID="b6600f3f-d5c4-476a-b791-74ce8fed9ce8" containerID="46ac24522e688e7922cae6f14e8efa85d40e27e7ee2a17ce3abb29eb672209a3" exitCode=0 Mar 17 01:58:10 crc kubenswrapper[4755]: I0317 01:58:10.721583 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8j9m" event={"ID":"b6600f3f-d5c4-476a-b791-74ce8fed9ce8","Type":"ContainerDied","Data":"46ac24522e688e7922cae6f14e8efa85d40e27e7ee2a17ce3abb29eb672209a3"} Mar 17 01:58:10 crc kubenswrapper[4755]: I0317 01:58:10.721832 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8j9m" event={"ID":"b6600f3f-d5c4-476a-b791-74ce8fed9ce8","Type":"ContainerStarted","Data":"89e135204f1a8ddfe54b877b8ab43fdef526416981145c24b690de14ebfb6940"} Mar 17 01:58:11 crc kubenswrapper[4755]: I0317 01:58:11.735462 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8j9m" 
event={"ID":"b6600f3f-d5c4-476a-b791-74ce8fed9ce8","Type":"ContainerStarted","Data":"df57da927aa768a4bb6e450b2e5b950a4baeb912e74b6468ca17170609b88a74"} Mar 17 01:58:13 crc kubenswrapper[4755]: I0317 01:58:13.754038 4755 generic.go:334] "Generic (PLEG): container finished" podID="b6600f3f-d5c4-476a-b791-74ce8fed9ce8" containerID="df57da927aa768a4bb6e450b2e5b950a4baeb912e74b6468ca17170609b88a74" exitCode=0 Mar 17 01:58:13 crc kubenswrapper[4755]: I0317 01:58:13.754266 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8j9m" event={"ID":"b6600f3f-d5c4-476a-b791-74ce8fed9ce8","Type":"ContainerDied","Data":"df57da927aa768a4bb6e450b2e5b950a4baeb912e74b6468ca17170609b88a74"} Mar 17 01:58:14 crc kubenswrapper[4755]: I0317 01:58:14.765530 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8j9m" event={"ID":"b6600f3f-d5c4-476a-b791-74ce8fed9ce8","Type":"ContainerStarted","Data":"3ae1e3e447ed9439ce7e8763caa4d9a311534661baa9664f5369f6990abe32ef"} Mar 17 01:58:14 crc kubenswrapper[4755]: I0317 01:58:14.797910 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w8j9m" podStartSLOduration=2.300991118 podStartE2EDuration="5.797893666s" podCreationTimestamp="2026-03-17 01:58:09 +0000 UTC" firstStartedPulling="2026-03-17 01:58:10.724583072 +0000 UTC m=+5765.484035395" lastFinishedPulling="2026-03-17 01:58:14.22148562 +0000 UTC m=+5768.980937943" observedRunningTime="2026-03-17 01:58:14.789935886 +0000 UTC m=+5769.549388209" watchObservedRunningTime="2026-03-17 01:58:14.797893666 +0000 UTC m=+5769.557345949" Mar 17 01:58:19 crc kubenswrapper[4755]: I0317 01:58:19.679480 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w8j9m" Mar 17 01:58:19 crc kubenswrapper[4755]: I0317 01:58:19.681462 4755 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-w8j9m" Mar 17 01:58:19 crc kubenswrapper[4755]: I0317 01:58:19.780030 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w8j9m" Mar 17 01:58:19 crc kubenswrapper[4755]: I0317 01:58:19.926661 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w8j9m" Mar 17 01:58:20 crc kubenswrapper[4755]: I0317 01:58:20.057643 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w8j9m"] Mar 17 01:58:21 crc kubenswrapper[4755]: I0317 01:58:21.879978 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w8j9m" podUID="b6600f3f-d5c4-476a-b791-74ce8fed9ce8" containerName="registry-server" containerID="cri-o://3ae1e3e447ed9439ce7e8763caa4d9a311534661baa9664f5369f6990abe32ef" gracePeriod=2 Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.459411 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w8j9m" Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.596745 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-utilities\") pod \"b6600f3f-d5c4-476a-b791-74ce8fed9ce8\" (UID: \"b6600f3f-d5c4-476a-b791-74ce8fed9ce8\") " Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.596812 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnfxb\" (UniqueName: \"kubernetes.io/projected/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-kube-api-access-qnfxb\") pod \"b6600f3f-d5c4-476a-b791-74ce8fed9ce8\" (UID: \"b6600f3f-d5c4-476a-b791-74ce8fed9ce8\") " Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.596889 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-catalog-content\") pod \"b6600f3f-d5c4-476a-b791-74ce8fed9ce8\" (UID: \"b6600f3f-d5c4-476a-b791-74ce8fed9ce8\") " Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.597601 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-utilities" (OuterVolumeSpecName: "utilities") pod "b6600f3f-d5c4-476a-b791-74ce8fed9ce8" (UID: "b6600f3f-d5c4-476a-b791-74ce8fed9ce8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.610327 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-kube-api-access-qnfxb" (OuterVolumeSpecName: "kube-api-access-qnfxb") pod "b6600f3f-d5c4-476a-b791-74ce8fed9ce8" (UID: "b6600f3f-d5c4-476a-b791-74ce8fed9ce8"). InnerVolumeSpecName "kube-api-access-qnfxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.674721 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6600f3f-d5c4-476a-b791-74ce8fed9ce8" (UID: "b6600f3f-d5c4-476a-b791-74ce8fed9ce8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.699695 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.699727 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.699737 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnfxb\" (UniqueName: \"kubernetes.io/projected/b6600f3f-d5c4-476a-b791-74ce8fed9ce8-kube-api-access-qnfxb\") on node \"crc\" DevicePath \"\"" Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.926569 4755 generic.go:334] "Generic (PLEG): container finished" podID="b6600f3f-d5c4-476a-b791-74ce8fed9ce8" containerID="3ae1e3e447ed9439ce7e8763caa4d9a311534661baa9664f5369f6990abe32ef" exitCode=0 Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.926660 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8j9m" event={"ID":"b6600f3f-d5c4-476a-b791-74ce8fed9ce8","Type":"ContainerDied","Data":"3ae1e3e447ed9439ce7e8763caa4d9a311534661baa9664f5369f6990abe32ef"} Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.927032 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-w8j9m" event={"ID":"b6600f3f-d5c4-476a-b791-74ce8fed9ce8","Type":"ContainerDied","Data":"89e135204f1a8ddfe54b877b8ab43fdef526416981145c24b690de14ebfb6940"} Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.927069 4755 scope.go:117] "RemoveContainer" containerID="3ae1e3e447ed9439ce7e8763caa4d9a311534661baa9664f5369f6990abe32ef" Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.926705 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8j9m" Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.975829 4755 scope.go:117] "RemoveContainer" containerID="df57da927aa768a4bb6e450b2e5b950a4baeb912e74b6468ca17170609b88a74" Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.986264 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w8j9m"] Mar 17 01:58:22 crc kubenswrapper[4755]: I0317 01:58:22.996860 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w8j9m"] Mar 17 01:58:23 crc kubenswrapper[4755]: I0317 01:58:23.023290 4755 scope.go:117] "RemoveContainer" containerID="46ac24522e688e7922cae6f14e8efa85d40e27e7ee2a17ce3abb29eb672209a3" Mar 17 01:58:23 crc kubenswrapper[4755]: I0317 01:58:23.079165 4755 scope.go:117] "RemoveContainer" containerID="3ae1e3e447ed9439ce7e8763caa4d9a311534661baa9664f5369f6990abe32ef" Mar 17 01:58:23 crc kubenswrapper[4755]: E0317 01:58:23.079579 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ae1e3e447ed9439ce7e8763caa4d9a311534661baa9664f5369f6990abe32ef\": container with ID starting with 3ae1e3e447ed9439ce7e8763caa4d9a311534661baa9664f5369f6990abe32ef not found: ID does not exist" containerID="3ae1e3e447ed9439ce7e8763caa4d9a311534661baa9664f5369f6990abe32ef" Mar 17 01:58:23 crc kubenswrapper[4755]: I0317 
01:58:23.079615 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae1e3e447ed9439ce7e8763caa4d9a311534661baa9664f5369f6990abe32ef"} err="failed to get container status \"3ae1e3e447ed9439ce7e8763caa4d9a311534661baa9664f5369f6990abe32ef\": rpc error: code = NotFound desc = could not find container \"3ae1e3e447ed9439ce7e8763caa4d9a311534661baa9664f5369f6990abe32ef\": container with ID starting with 3ae1e3e447ed9439ce7e8763caa4d9a311534661baa9664f5369f6990abe32ef not found: ID does not exist" Mar 17 01:58:23 crc kubenswrapper[4755]: I0317 01:58:23.079639 4755 scope.go:117] "RemoveContainer" containerID="df57da927aa768a4bb6e450b2e5b950a4baeb912e74b6468ca17170609b88a74" Mar 17 01:58:23 crc kubenswrapper[4755]: E0317 01:58:23.079879 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df57da927aa768a4bb6e450b2e5b950a4baeb912e74b6468ca17170609b88a74\": container with ID starting with df57da927aa768a4bb6e450b2e5b950a4baeb912e74b6468ca17170609b88a74 not found: ID does not exist" containerID="df57da927aa768a4bb6e450b2e5b950a4baeb912e74b6468ca17170609b88a74" Mar 17 01:58:23 crc kubenswrapper[4755]: I0317 01:58:23.079927 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df57da927aa768a4bb6e450b2e5b950a4baeb912e74b6468ca17170609b88a74"} err="failed to get container status \"df57da927aa768a4bb6e450b2e5b950a4baeb912e74b6468ca17170609b88a74\": rpc error: code = NotFound desc = could not find container \"df57da927aa768a4bb6e450b2e5b950a4baeb912e74b6468ca17170609b88a74\": container with ID starting with df57da927aa768a4bb6e450b2e5b950a4baeb912e74b6468ca17170609b88a74 not found: ID does not exist" Mar 17 01:58:23 crc kubenswrapper[4755]: I0317 01:58:23.079959 4755 scope.go:117] "RemoveContainer" containerID="46ac24522e688e7922cae6f14e8efa85d40e27e7ee2a17ce3abb29eb672209a3" Mar 17 01:58:23 crc 
kubenswrapper[4755]: E0317 01:58:23.080270 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46ac24522e688e7922cae6f14e8efa85d40e27e7ee2a17ce3abb29eb672209a3\": container with ID starting with 46ac24522e688e7922cae6f14e8efa85d40e27e7ee2a17ce3abb29eb672209a3 not found: ID does not exist" containerID="46ac24522e688e7922cae6f14e8efa85d40e27e7ee2a17ce3abb29eb672209a3" Mar 17 01:58:23 crc kubenswrapper[4755]: I0317 01:58:23.080300 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46ac24522e688e7922cae6f14e8efa85d40e27e7ee2a17ce3abb29eb672209a3"} err="failed to get container status \"46ac24522e688e7922cae6f14e8efa85d40e27e7ee2a17ce3abb29eb672209a3\": rpc error: code = NotFound desc = could not find container \"46ac24522e688e7922cae6f14e8efa85d40e27e7ee2a17ce3abb29eb672209a3\": container with ID starting with 46ac24522e688e7922cae6f14e8efa85d40e27e7ee2a17ce3abb29eb672209a3 not found: ID does not exist" Mar 17 01:58:24 crc kubenswrapper[4755]: I0317 01:58:24.267944 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6600f3f-d5c4-476a-b791-74ce8fed9ce8" path="/var/lib/kubelet/pods/b6600f3f-d5c4-476a-b791-74ce8fed9ce8/volumes" Mar 17 01:58:40 crc kubenswrapper[4755]: E0317 01:58:40.086340 4755 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.32:53192->38.102.83.32:36119: write tcp 38.102.83.32:53192->38.102.83.32:36119: write: broken pipe Mar 17 01:59:06 crc kubenswrapper[4755]: I0317 01:59:06.698991 4755 scope.go:117] "RemoveContainer" containerID="7e5c18dc59e4a612bf95b599ab70e29e103feeb1015883235ed5b6f44109125d" Mar 17 01:59:45 crc kubenswrapper[4755]: I0317 01:59:45.531729 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mqksh"] Mar 17 01:59:45 crc kubenswrapper[4755]: E0317 01:59:45.532946 4755 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b6600f3f-d5c4-476a-b791-74ce8fed9ce8" containerName="extract-content" Mar 17 01:59:45 crc kubenswrapper[4755]: I0317 01:59:45.532963 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6600f3f-d5c4-476a-b791-74ce8fed9ce8" containerName="extract-content" Mar 17 01:59:45 crc kubenswrapper[4755]: E0317 01:59:45.532989 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6600f3f-d5c4-476a-b791-74ce8fed9ce8" containerName="extract-utilities" Mar 17 01:59:45 crc kubenswrapper[4755]: I0317 01:59:45.532998 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6600f3f-d5c4-476a-b791-74ce8fed9ce8" containerName="extract-utilities" Mar 17 01:59:45 crc kubenswrapper[4755]: E0317 01:59:45.533045 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6600f3f-d5c4-476a-b791-74ce8fed9ce8" containerName="registry-server" Mar 17 01:59:45 crc kubenswrapper[4755]: I0317 01:59:45.533053 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6600f3f-d5c4-476a-b791-74ce8fed9ce8" containerName="registry-server" Mar 17 01:59:45 crc kubenswrapper[4755]: I0317 01:59:45.533306 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6600f3f-d5c4-476a-b791-74ce8fed9ce8" containerName="registry-server" Mar 17 01:59:45 crc kubenswrapper[4755]: I0317 01:59:45.535015 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mqksh" Mar 17 01:59:45 crc kubenswrapper[4755]: I0317 01:59:45.550488 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mqksh"] Mar 17 01:59:45 crc kubenswrapper[4755]: I0317 01:59:45.653599 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-utilities\") pod \"redhat-operators-mqksh\" (UID: \"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e\") " pod="openshift-marketplace/redhat-operators-mqksh" Mar 17 01:59:45 crc kubenswrapper[4755]: I0317 01:59:45.653681 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-catalog-content\") pod \"redhat-operators-mqksh\" (UID: \"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e\") " pod="openshift-marketplace/redhat-operators-mqksh" Mar 17 01:59:45 crc kubenswrapper[4755]: I0317 01:59:45.653777 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pwpp\" (UniqueName: \"kubernetes.io/projected/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-kube-api-access-2pwpp\") pod \"redhat-operators-mqksh\" (UID: \"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e\") " pod="openshift-marketplace/redhat-operators-mqksh" Mar 17 01:59:45 crc kubenswrapper[4755]: I0317 01:59:45.755826 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-utilities\") pod \"redhat-operators-mqksh\" (UID: \"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e\") " pod="openshift-marketplace/redhat-operators-mqksh" Mar 17 01:59:45 crc kubenswrapper[4755]: I0317 01:59:45.755876 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-catalog-content\") pod \"redhat-operators-mqksh\" (UID: \"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e\") " pod="openshift-marketplace/redhat-operators-mqksh" Mar 17 01:59:45 crc kubenswrapper[4755]: I0317 01:59:45.755931 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pwpp\" (UniqueName: \"kubernetes.io/projected/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-kube-api-access-2pwpp\") pod \"redhat-operators-mqksh\" (UID: \"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e\") " pod="openshift-marketplace/redhat-operators-mqksh" Mar 17 01:59:45 crc kubenswrapper[4755]: I0317 01:59:45.756474 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-utilities\") pod \"redhat-operators-mqksh\" (UID: \"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e\") " pod="openshift-marketplace/redhat-operators-mqksh" Mar 17 01:59:45 crc kubenswrapper[4755]: I0317 01:59:45.756636 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-catalog-content\") pod \"redhat-operators-mqksh\" (UID: \"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e\") " pod="openshift-marketplace/redhat-operators-mqksh" Mar 17 01:59:45 crc kubenswrapper[4755]: I0317 01:59:45.779523 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pwpp\" (UniqueName: \"kubernetes.io/projected/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-kube-api-access-2pwpp\") pod \"redhat-operators-mqksh\" (UID: \"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e\") " pod="openshift-marketplace/redhat-operators-mqksh" Mar 17 01:59:45 crc kubenswrapper[4755]: I0317 01:59:45.865682 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mqksh" Mar 17 01:59:46 crc kubenswrapper[4755]: I0317 01:59:46.422259 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mqksh"] Mar 17 01:59:47 crc kubenswrapper[4755]: I0317 01:59:47.076684 4755 generic.go:334] "Generic (PLEG): container finished" podID="afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" containerID="2fb3a6544ec2b8fa108d234917e0e8d042a7014d03e0afaf1186e7f388f4cae7" exitCode=0 Mar 17 01:59:47 crc kubenswrapper[4755]: I0317 01:59:47.077016 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqksh" event={"ID":"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e","Type":"ContainerDied","Data":"2fb3a6544ec2b8fa108d234917e0e8d042a7014d03e0afaf1186e7f388f4cae7"} Mar 17 01:59:47 crc kubenswrapper[4755]: I0317 01:59:47.077044 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqksh" event={"ID":"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e","Type":"ContainerStarted","Data":"f2a79364f48d5b2bd7560744c8b18b777682de06a01b133b8f1d9516a5d9b5d0"} Mar 17 01:59:47 crc kubenswrapper[4755]: I0317 01:59:47.082903 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 01:59:48 crc kubenswrapper[4755]: I0317 01:59:48.087837 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqksh" event={"ID":"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e","Type":"ContainerStarted","Data":"37e3cb548b6b98773cb5a3ec114a7e4e05575b25a8ddcc9e22ba702bc02eeb2c"} Mar 17 01:59:54 crc kubenswrapper[4755]: I0317 01:59:54.150461 4755 generic.go:334] "Generic (PLEG): container finished" podID="afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" containerID="37e3cb548b6b98773cb5a3ec114a7e4e05575b25a8ddcc9e22ba702bc02eeb2c" exitCode=0 Mar 17 01:59:54 crc kubenswrapper[4755]: I0317 01:59:54.150699 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-mqksh" event={"ID":"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e","Type":"ContainerDied","Data":"37e3cb548b6b98773cb5a3ec114a7e4e05575b25a8ddcc9e22ba702bc02eeb2c"} Mar 17 01:59:55 crc kubenswrapper[4755]: I0317 01:59:55.164295 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqksh" event={"ID":"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e","Type":"ContainerStarted","Data":"4c1b960a91c40e446d75742ac3fb1021463c7f55d8f6f342bfc6ec31a062cedb"} Mar 17 01:59:55 crc kubenswrapper[4755]: I0317 01:59:55.208258 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mqksh" podStartSLOduration=2.753227781 podStartE2EDuration="10.208236184s" podCreationTimestamp="2026-03-17 01:59:45 +0000 UTC" firstStartedPulling="2026-03-17 01:59:47.082672362 +0000 UTC m=+5861.842124645" lastFinishedPulling="2026-03-17 01:59:54.537680755 +0000 UTC m=+5869.297133048" observedRunningTime="2026-03-17 01:59:55.191167942 +0000 UTC m=+5869.950620295" watchObservedRunningTime="2026-03-17 01:59:55.208236184 +0000 UTC m=+5869.967688477" Mar 17 01:59:55 crc kubenswrapper[4755]: I0317 01:59:55.866198 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mqksh" Mar 17 01:59:55 crc kubenswrapper[4755]: I0317 01:59:55.866643 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mqksh" Mar 17 01:59:56 crc kubenswrapper[4755]: I0317 01:59:56.934968 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mqksh" podUID="afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" containerName="registry-server" probeResult="failure" output=< Mar 17 01:59:56 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 01:59:56 crc kubenswrapper[4755]: > Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 
02:00:00.156850 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l"] Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.158662 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.161645 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.163238 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.169319 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561880-qc4ld"] Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.171259 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561880-qc4ld" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.175620 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.181252 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.181312 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.185171 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l"] Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.211852 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561880-qc4ld"] Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.333174 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c179a95-a9d8-47f1-8710-e596da586065-secret-volume\") pod \"collect-profiles-29561880-v4w6l\" (UID: \"4c179a95-a9d8-47f1-8710-e596da586065\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.333275 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c179a95-a9d8-47f1-8710-e596da586065-config-volume\") pod \"collect-profiles-29561880-v4w6l\" (UID: \"4c179a95-a9d8-47f1-8710-e596da586065\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.333301 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-xm54s\" (UniqueName: \"kubernetes.io/projected/4c179a95-a9d8-47f1-8710-e596da586065-kube-api-access-xm54s\") pod \"collect-profiles-29561880-v4w6l\" (UID: \"4c179a95-a9d8-47f1-8710-e596da586065\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.333527 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxkbl\" (UniqueName: \"kubernetes.io/projected/c2d16a08-f9f1-4643-ae69-4f51a6ddc316-kube-api-access-vxkbl\") pod \"auto-csr-approver-29561880-qc4ld\" (UID: \"c2d16a08-f9f1-4643-ae69-4f51a6ddc316\") " pod="openshift-infra/auto-csr-approver-29561880-qc4ld" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.436234 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c179a95-a9d8-47f1-8710-e596da586065-secret-volume\") pod \"collect-profiles-29561880-v4w6l\" (UID: \"4c179a95-a9d8-47f1-8710-e596da586065\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.436347 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c179a95-a9d8-47f1-8710-e596da586065-config-volume\") pod \"collect-profiles-29561880-v4w6l\" (UID: \"4c179a95-a9d8-47f1-8710-e596da586065\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.436374 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm54s\" (UniqueName: \"kubernetes.io/projected/4c179a95-a9d8-47f1-8710-e596da586065-kube-api-access-xm54s\") pod \"collect-profiles-29561880-v4w6l\" (UID: \"4c179a95-a9d8-47f1-8710-e596da586065\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.436457 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxkbl\" (UniqueName: \"kubernetes.io/projected/c2d16a08-f9f1-4643-ae69-4f51a6ddc316-kube-api-access-vxkbl\") pod \"auto-csr-approver-29561880-qc4ld\" (UID: \"c2d16a08-f9f1-4643-ae69-4f51a6ddc316\") " pod="openshift-infra/auto-csr-approver-29561880-qc4ld" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.438117 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c179a95-a9d8-47f1-8710-e596da586065-config-volume\") pod \"collect-profiles-29561880-v4w6l\" (UID: \"4c179a95-a9d8-47f1-8710-e596da586065\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.454177 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c179a95-a9d8-47f1-8710-e596da586065-secret-volume\") pod \"collect-profiles-29561880-v4w6l\" (UID: \"4c179a95-a9d8-47f1-8710-e596da586065\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.456259 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxkbl\" (UniqueName: \"kubernetes.io/projected/c2d16a08-f9f1-4643-ae69-4f51a6ddc316-kube-api-access-vxkbl\") pod \"auto-csr-approver-29561880-qc4ld\" (UID: \"c2d16a08-f9f1-4643-ae69-4f51a6ddc316\") " pod="openshift-infra/auto-csr-approver-29561880-qc4ld" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.475131 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm54s\" (UniqueName: \"kubernetes.io/projected/4c179a95-a9d8-47f1-8710-e596da586065-kube-api-access-xm54s\") pod 
\"collect-profiles-29561880-v4w6l\" (UID: \"4c179a95-a9d8-47f1-8710-e596da586065\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.489384 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" Mar 17 02:00:00 crc kubenswrapper[4755]: I0317 02:00:00.503340 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561880-qc4ld" Mar 17 02:00:01 crc kubenswrapper[4755]: I0317 02:00:01.022262 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l"] Mar 17 02:00:01 crc kubenswrapper[4755]: W0317 02:00:01.137999 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2d16a08_f9f1_4643_ae69_4f51a6ddc316.slice/crio-bd84ccb4ae66ac1a3590bf45cfb02f49602d1902335497857b0129320878c0a7 WatchSource:0}: Error finding container bd84ccb4ae66ac1a3590bf45cfb02f49602d1902335497857b0129320878c0a7: Status 404 returned error can't find the container with id bd84ccb4ae66ac1a3590bf45cfb02f49602d1902335497857b0129320878c0a7 Mar 17 02:00:01 crc kubenswrapper[4755]: I0317 02:00:01.141021 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561880-qc4ld"] Mar 17 02:00:01 crc kubenswrapper[4755]: I0317 02:00:01.250534 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561880-qc4ld" event={"ID":"c2d16a08-f9f1-4643-ae69-4f51a6ddc316","Type":"ContainerStarted","Data":"bd84ccb4ae66ac1a3590bf45cfb02f49602d1902335497857b0129320878c0a7"} Mar 17 02:00:01 crc kubenswrapper[4755]: I0317 02:00:01.252539 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" 
event={"ID":"4c179a95-a9d8-47f1-8710-e596da586065","Type":"ContainerStarted","Data":"ab8e598e56d7edd0baaae2178a1f7f8b16a927c22091f29c34bd9cb2778af375"} Mar 17 02:00:02 crc kubenswrapper[4755]: I0317 02:00:02.274079 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" event={"ID":"4c179a95-a9d8-47f1-8710-e596da586065","Type":"ContainerStarted","Data":"bc8482eb39a4fc35d1e51ee84f6325e003c59e92a0a12c3b1b98eef959b90bf2"} Mar 17 02:00:02 crc kubenswrapper[4755]: I0317 02:00:02.289979 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" podStartSLOduration=2.289959697 podStartE2EDuration="2.289959697s" podCreationTimestamp="2026-03-17 02:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 02:00:02.285615721 +0000 UTC m=+5877.045068024" watchObservedRunningTime="2026-03-17 02:00:02.289959697 +0000 UTC m=+5877.049411990" Mar 17 02:00:03 crc kubenswrapper[4755]: I0317 02:00:03.278082 4755 generic.go:334] "Generic (PLEG): container finished" podID="4c179a95-a9d8-47f1-8710-e596da586065" containerID="bc8482eb39a4fc35d1e51ee84f6325e003c59e92a0a12c3b1b98eef959b90bf2" exitCode=0 Mar 17 02:00:03 crc kubenswrapper[4755]: I0317 02:00:03.278158 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" event={"ID":"4c179a95-a9d8-47f1-8710-e596da586065","Type":"ContainerDied","Data":"bc8482eb39a4fc35d1e51ee84f6325e003c59e92a0a12c3b1b98eef959b90bf2"} Mar 17 02:00:05 crc kubenswrapper[4755]: I0317 02:00:05.274744 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" Mar 17 02:00:05 crc kubenswrapper[4755]: I0317 02:00:05.312101 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" event={"ID":"4c179a95-a9d8-47f1-8710-e596da586065","Type":"ContainerDied","Data":"ab8e598e56d7edd0baaae2178a1f7f8b16a927c22091f29c34bd9cb2778af375"} Mar 17 02:00:05 crc kubenswrapper[4755]: I0317 02:00:05.312143 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab8e598e56d7edd0baaae2178a1f7f8b16a927c22091f29c34bd9cb2778af375" Mar 17 02:00:05 crc kubenswrapper[4755]: I0317 02:00:05.312227 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l" Mar 17 02:00:05 crc kubenswrapper[4755]: I0317 02:00:05.382973 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c179a95-a9d8-47f1-8710-e596da586065-secret-volume\") pod \"4c179a95-a9d8-47f1-8710-e596da586065\" (UID: \"4c179a95-a9d8-47f1-8710-e596da586065\") " Mar 17 02:00:05 crc kubenswrapper[4755]: I0317 02:00:05.383181 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c179a95-a9d8-47f1-8710-e596da586065-config-volume\") pod \"4c179a95-a9d8-47f1-8710-e596da586065\" (UID: \"4c179a95-a9d8-47f1-8710-e596da586065\") " Mar 17 02:00:05 crc kubenswrapper[4755]: I0317 02:00:05.383514 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm54s\" (UniqueName: \"kubernetes.io/projected/4c179a95-a9d8-47f1-8710-e596da586065-kube-api-access-xm54s\") pod \"4c179a95-a9d8-47f1-8710-e596da586065\" (UID: \"4c179a95-a9d8-47f1-8710-e596da586065\") " Mar 17 02:00:05 crc kubenswrapper[4755]: I0317 
02:00:05.386045 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c179a95-a9d8-47f1-8710-e596da586065-config-volume" (OuterVolumeSpecName: "config-volume") pod "4c179a95-a9d8-47f1-8710-e596da586065" (UID: "4c179a95-a9d8-47f1-8710-e596da586065"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 02:00:05 crc kubenswrapper[4755]: I0317 02:00:05.388478 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c179a95-a9d8-47f1-8710-e596da586065-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4c179a95-a9d8-47f1-8710-e596da586065" (UID: "4c179a95-a9d8-47f1-8710-e596da586065"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:00:05 crc kubenswrapper[4755]: I0317 02:00:05.395502 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c179a95-a9d8-47f1-8710-e596da586065-kube-api-access-xm54s" (OuterVolumeSpecName: "kube-api-access-xm54s") pod "4c179a95-a9d8-47f1-8710-e596da586065" (UID: "4c179a95-a9d8-47f1-8710-e596da586065"). InnerVolumeSpecName "kube-api-access-xm54s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:00:05 crc kubenswrapper[4755]: I0317 02:00:05.487554 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c179a95-a9d8-47f1-8710-e596da586065-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 02:00:05 crc kubenswrapper[4755]: I0317 02:00:05.487752 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c179a95-a9d8-47f1-8710-e596da586065-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 02:00:05 crc kubenswrapper[4755]: I0317 02:00:05.487843 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm54s\" (UniqueName: \"kubernetes.io/projected/4c179a95-a9d8-47f1-8710-e596da586065-kube-api-access-xm54s\") on node \"crc\" DevicePath \"\"" Mar 17 02:00:06 crc kubenswrapper[4755]: I0317 02:00:06.380186 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z"] Mar 17 02:00:06 crc kubenswrapper[4755]: I0317 02:00:06.390562 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561835-ck95z"] Mar 17 02:00:06 crc kubenswrapper[4755]: I0317 02:00:06.930628 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mqksh" podUID="afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" containerName="registry-server" probeResult="failure" output=< Mar 17 02:00:06 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:00:06 crc kubenswrapper[4755]: > Mar 17 02:00:08 crc kubenswrapper[4755]: I0317 02:00:08.260811 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93a6317f-9359-4782-ac20-e4315f37a32e" path="/var/lib/kubelet/pods/93a6317f-9359-4782-ac20-e4315f37a32e/volumes" Mar 17 02:00:14 crc kubenswrapper[4755]: I0317 02:00:14.409544 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561880-qc4ld" event={"ID":"c2d16a08-f9f1-4643-ae69-4f51a6ddc316","Type":"ContainerStarted","Data":"c28f5cbaf29665dbe4b1d572db9c796cee7d87059badd7207cc57a4503723093"} Mar 17 02:00:14 crc kubenswrapper[4755]: I0317 02:00:14.431983 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561880-qc4ld" podStartSLOduration=2.210083082 podStartE2EDuration="14.431964688s" podCreationTimestamp="2026-03-17 02:00:00 +0000 UTC" firstStartedPulling="2026-03-17 02:00:01.153836805 +0000 UTC m=+5875.913289088" lastFinishedPulling="2026-03-17 02:00:13.375718421 +0000 UTC m=+5888.135170694" observedRunningTime="2026-03-17 02:00:14.422047875 +0000 UTC m=+5889.181500158" watchObservedRunningTime="2026-03-17 02:00:14.431964688 +0000 UTC m=+5889.191416971" Mar 17 02:00:15 crc kubenswrapper[4755]: I0317 02:00:15.427326 4755 generic.go:334] "Generic (PLEG): container finished" podID="c2d16a08-f9f1-4643-ae69-4f51a6ddc316" containerID="c28f5cbaf29665dbe4b1d572db9c796cee7d87059badd7207cc57a4503723093" exitCode=0 Mar 17 02:00:15 crc kubenswrapper[4755]: I0317 02:00:15.427423 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561880-qc4ld" event={"ID":"c2d16a08-f9f1-4643-ae69-4f51a6ddc316","Type":"ContainerDied","Data":"c28f5cbaf29665dbe4b1d572db9c796cee7d87059badd7207cc57a4503723093"} Mar 17 02:00:16 crc kubenswrapper[4755]: I0317 02:00:16.886329 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561880-qc4ld" Mar 17 02:00:17 crc kubenswrapper[4755]: I0317 02:00:17.078993 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxkbl\" (UniqueName: \"kubernetes.io/projected/c2d16a08-f9f1-4643-ae69-4f51a6ddc316-kube-api-access-vxkbl\") pod \"c2d16a08-f9f1-4643-ae69-4f51a6ddc316\" (UID: \"c2d16a08-f9f1-4643-ae69-4f51a6ddc316\") " Mar 17 02:00:17 crc kubenswrapper[4755]: I0317 02:00:17.085489 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d16a08-f9f1-4643-ae69-4f51a6ddc316-kube-api-access-vxkbl" (OuterVolumeSpecName: "kube-api-access-vxkbl") pod "c2d16a08-f9f1-4643-ae69-4f51a6ddc316" (UID: "c2d16a08-f9f1-4643-ae69-4f51a6ddc316"). InnerVolumeSpecName "kube-api-access-vxkbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:00:17 crc kubenswrapper[4755]: I0317 02:00:17.183112 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxkbl\" (UniqueName: \"kubernetes.io/projected/c2d16a08-f9f1-4643-ae69-4f51a6ddc316-kube-api-access-vxkbl\") on node \"crc\" DevicePath \"\"" Mar 17 02:00:17 crc kubenswrapper[4755]: I0317 02:00:17.400918 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mqksh" podUID="afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" containerName="registry-server" probeResult="failure" output=< Mar 17 02:00:17 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:00:17 crc kubenswrapper[4755]: > Mar 17 02:00:17 crc kubenswrapper[4755]: I0317 02:00:17.464490 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561880-qc4ld" event={"ID":"c2d16a08-f9f1-4643-ae69-4f51a6ddc316","Type":"ContainerDied","Data":"bd84ccb4ae66ac1a3590bf45cfb02f49602d1902335497857b0129320878c0a7"} Mar 17 02:00:17 crc kubenswrapper[4755]: I0317 02:00:17.464541 4755 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd84ccb4ae66ac1a3590bf45cfb02f49602d1902335497857b0129320878c0a7" Mar 17 02:00:17 crc kubenswrapper[4755]: I0317 02:00:17.464642 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561880-qc4ld" Mar 17 02:00:17 crc kubenswrapper[4755]: I0317 02:00:17.520962 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561874-p8vfx"] Mar 17 02:00:17 crc kubenswrapper[4755]: I0317 02:00:17.532504 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561874-p8vfx"] Mar 17 02:00:18 crc kubenswrapper[4755]: I0317 02:00:18.259113 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d25cca-1c6f-4014-8e93-67fecd44920f" path="/var/lib/kubelet/pods/68d25cca-1c6f-4014-8e93-67fecd44920f/volumes" Mar 17 02:00:25 crc kubenswrapper[4755]: I0317 02:00:25.937768 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mqksh" Mar 17 02:00:26 crc kubenswrapper[4755]: I0317 02:00:26.035949 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mqksh" Mar 17 02:00:26 crc kubenswrapper[4755]: I0317 02:00:26.201209 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mqksh"] Mar 17 02:00:27 crc kubenswrapper[4755]: I0317 02:00:27.706050 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mqksh" podUID="afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" containerName="registry-server" containerID="cri-o://4c1b960a91c40e446d75742ac3fb1021463c7f55d8f6f342bfc6ec31a062cedb" gracePeriod=2 Mar 17 02:00:28 crc kubenswrapper[4755]: I0317 02:00:28.665697 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:00:28 crc kubenswrapper[4755]: I0317 02:00:28.665959 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:00:28 crc kubenswrapper[4755]: I0317 02:00:28.718205 4755 generic.go:334] "Generic (PLEG): container finished" podID="afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" containerID="4c1b960a91c40e446d75742ac3fb1021463c7f55d8f6f342bfc6ec31a062cedb" exitCode=0 Mar 17 02:00:28 crc kubenswrapper[4755]: I0317 02:00:28.718253 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqksh" event={"ID":"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e","Type":"ContainerDied","Data":"4c1b960a91c40e446d75742ac3fb1021463c7f55d8f6f342bfc6ec31a062cedb"} Mar 17 02:00:29 crc kubenswrapper[4755]: I0317 02:00:29.585992 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mqksh" Mar 17 02:00:29 crc kubenswrapper[4755]: I0317 02:00:29.697405 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pwpp\" (UniqueName: \"kubernetes.io/projected/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-kube-api-access-2pwpp\") pod \"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e\" (UID: \"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e\") " Mar 17 02:00:29 crc kubenswrapper[4755]: I0317 02:00:29.697614 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-utilities\") pod \"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e\" (UID: \"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e\") " Mar 17 02:00:29 crc kubenswrapper[4755]: I0317 02:00:29.697848 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-catalog-content\") pod \"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e\" (UID: \"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e\") " Mar 17 02:00:29 crc kubenswrapper[4755]: I0317 02:00:29.700379 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-utilities" (OuterVolumeSpecName: "utilities") pod "afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" (UID: "afa97fd1-50cf-4079-9fcf-6c98fa2bca9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:00:29 crc kubenswrapper[4755]: I0317 02:00:29.705861 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-kube-api-access-2pwpp" (OuterVolumeSpecName: "kube-api-access-2pwpp") pod "afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" (UID: "afa97fd1-50cf-4079-9fcf-6c98fa2bca9e"). InnerVolumeSpecName "kube-api-access-2pwpp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:00:29 crc kubenswrapper[4755]: I0317 02:00:29.754570 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqksh" event={"ID":"afa97fd1-50cf-4079-9fcf-6c98fa2bca9e","Type":"ContainerDied","Data":"f2a79364f48d5b2bd7560744c8b18b777682de06a01b133b8f1d9516a5d9b5d0"} Mar 17 02:00:29 crc kubenswrapper[4755]: I0317 02:00:29.754622 4755 scope.go:117] "RemoveContainer" containerID="4c1b960a91c40e446d75742ac3fb1021463c7f55d8f6f342bfc6ec31a062cedb" Mar 17 02:00:29 crc kubenswrapper[4755]: I0317 02:00:29.754819 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mqksh" Mar 17 02:00:29 crc kubenswrapper[4755]: I0317 02:00:29.779084 4755 scope.go:117] "RemoveContainer" containerID="37e3cb548b6b98773cb5a3ec114a7e4e05575b25a8ddcc9e22ba702bc02eeb2c" Mar 17 02:00:29 crc kubenswrapper[4755]: I0317 02:00:29.799785 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:00:29 crc kubenswrapper[4755]: I0317 02:00:29.799817 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pwpp\" (UniqueName: \"kubernetes.io/projected/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-kube-api-access-2pwpp\") on node \"crc\" DevicePath \"\"" Mar 17 02:00:29 crc kubenswrapper[4755]: I0317 02:00:29.816352 4755 scope.go:117] "RemoveContainer" containerID="2fb3a6544ec2b8fa108d234917e0e8d042a7014d03e0afaf1186e7f388f4cae7" Mar 17 02:00:29 crc kubenswrapper[4755]: I0317 02:00:29.838206 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" (UID: "afa97fd1-50cf-4079-9fcf-6c98fa2bca9e"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:00:29 crc kubenswrapper[4755]: I0317 02:00:29.902373 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:00:30 crc kubenswrapper[4755]: I0317 02:00:30.092860 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mqksh"] Mar 17 02:00:30 crc kubenswrapper[4755]: I0317 02:00:30.105522 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mqksh"] Mar 17 02:00:30 crc kubenswrapper[4755]: I0317 02:00:30.278368 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" path="/var/lib/kubelet/pods/afa97fd1-50cf-4079-9fcf-6c98fa2bca9e/volumes" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.405515 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 17 02:00:47 crc kubenswrapper[4755]: E0317 02:00:47.406721 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" containerName="extract-content" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.406742 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" containerName="extract-content" Mar 17 02:00:47 crc kubenswrapper[4755]: E0317 02:00:47.406795 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" containerName="extract-utilities" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.406809 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" containerName="extract-utilities" Mar 17 02:00:47 crc kubenswrapper[4755]: E0317 02:00:47.406844 4755 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4c179a95-a9d8-47f1-8710-e596da586065" containerName="collect-profiles" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.406857 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c179a95-a9d8-47f1-8710-e596da586065" containerName="collect-profiles" Mar 17 02:00:47 crc kubenswrapper[4755]: E0317 02:00:47.406917 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" containerName="registry-server" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.406930 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" containerName="registry-server" Mar 17 02:00:47 crc kubenswrapper[4755]: E0317 02:00:47.406944 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d16a08-f9f1-4643-ae69-4f51a6ddc316" containerName="oc" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.406956 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d16a08-f9f1-4643-ae69-4f51a6ddc316" containerName="oc" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.407333 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="afa97fd1-50cf-4079-9fcf-6c98fa2bca9e" containerName="registry-server" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.407356 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d16a08-f9f1-4643-ae69-4f51a6ddc316" containerName="oc" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.407382 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c179a95-a9d8-47f1-8710-e596da586065" containerName="collect-profiles" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.408800 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.411155 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.411859 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.412357 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-lcx6m" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.413746 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.425050 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.425266 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-config-data\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.425366 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc 
kubenswrapper[4755]: I0317 02:00:47.425779 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.527546 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.527636 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.527703 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.527740 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.527803 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-config-data\") pod \"tempest-tests-tempest\" (UID: 
\"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.527825 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.527855 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.527943 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwxzg\" (UniqueName: \"kubernetes.io/projected/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-kube-api-access-zwxzg\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.528019 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.529315 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-openstack-config\") pod \"tempest-tests-tempest\" (UID: 
\"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.530716 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-config-data\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.630100 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.630266 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.630327 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.630472 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.630601 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwxzg\" (UniqueName: \"kubernetes.io/projected/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-kube-api-access-zwxzg\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.630719 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.631193 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.631516 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:47 crc kubenswrapper[4755]: I0317 02:00:47.634838 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/tempest-tests-tempest" Mar 17 02:00:48 crc kubenswrapper[4755]: I0317 02:00:48.057744 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:48 crc kubenswrapper[4755]: I0317 02:00:48.059899 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:48 crc kubenswrapper[4755]: I0317 02:00:48.060310 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:48 crc kubenswrapper[4755]: I0317 02:00:48.067585 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwxzg\" (UniqueName: \"kubernetes.io/projected/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-kube-api-access-zwxzg\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:48 crc kubenswrapper[4755]: I0317 02:00:48.145034 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " pod="openstack/tempest-tests-tempest" Mar 17 02:00:48 crc kubenswrapper[4755]: I0317 02:00:48.331614 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 17 02:00:48 crc kubenswrapper[4755]: I0317 02:00:48.896503 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 17 02:00:49 crc kubenswrapper[4755]: I0317 02:00:49.008470 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d","Type":"ContainerStarted","Data":"215a855e3220f481f185e5b5417ab2f0954c5853d9fe6ef7b7fc477c8b3acc07"} Mar 17 02:00:58 crc kubenswrapper[4755]: I0317 02:00:58.664941 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:00:58 crc kubenswrapper[4755]: I0317 02:00:58.665609 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:01:00 crc kubenswrapper[4755]: I0317 02:01:00.162937 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29561881-fpzd5"] Mar 17 02:01:00 crc kubenswrapper[4755]: I0317 02:01:00.168533 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29561881-fpzd5" Mar 17 02:01:00 crc kubenswrapper[4755]: I0317 02:01:00.179059 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29561881-fpzd5"] Mar 17 02:01:00 crc kubenswrapper[4755]: I0317 02:01:00.364924 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-config-data\") pod \"keystone-cron-29561881-fpzd5\" (UID: \"0e541f43-cda6-4951-a0cd-f77cb49018fd\") " pod="openstack/keystone-cron-29561881-fpzd5" Mar 17 02:01:00 crc kubenswrapper[4755]: I0317 02:01:00.364987 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-combined-ca-bundle\") pod \"keystone-cron-29561881-fpzd5\" (UID: \"0e541f43-cda6-4951-a0cd-f77cb49018fd\") " pod="openstack/keystone-cron-29561881-fpzd5" Mar 17 02:01:00 crc kubenswrapper[4755]: I0317 02:01:00.365411 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wpt8\" (UniqueName: \"kubernetes.io/projected/0e541f43-cda6-4951-a0cd-f77cb49018fd-kube-api-access-7wpt8\") pod \"keystone-cron-29561881-fpzd5\" (UID: \"0e541f43-cda6-4951-a0cd-f77cb49018fd\") " pod="openstack/keystone-cron-29561881-fpzd5" Mar 17 02:01:00 crc kubenswrapper[4755]: I0317 02:01:00.365502 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-fernet-keys\") pod \"keystone-cron-29561881-fpzd5\" (UID: \"0e541f43-cda6-4951-a0cd-f77cb49018fd\") " pod="openstack/keystone-cron-29561881-fpzd5" Mar 17 02:01:00 crc kubenswrapper[4755]: I0317 02:01:00.467738 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-7wpt8\" (UniqueName: \"kubernetes.io/projected/0e541f43-cda6-4951-a0cd-f77cb49018fd-kube-api-access-7wpt8\") pod \"keystone-cron-29561881-fpzd5\" (UID: \"0e541f43-cda6-4951-a0cd-f77cb49018fd\") " pod="openstack/keystone-cron-29561881-fpzd5" Mar 17 02:01:00 crc kubenswrapper[4755]: I0317 02:01:00.467784 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-fernet-keys\") pod \"keystone-cron-29561881-fpzd5\" (UID: \"0e541f43-cda6-4951-a0cd-f77cb49018fd\") " pod="openstack/keystone-cron-29561881-fpzd5" Mar 17 02:01:00 crc kubenswrapper[4755]: I0317 02:01:00.467878 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-config-data\") pod \"keystone-cron-29561881-fpzd5\" (UID: \"0e541f43-cda6-4951-a0cd-f77cb49018fd\") " pod="openstack/keystone-cron-29561881-fpzd5" Mar 17 02:01:00 crc kubenswrapper[4755]: I0317 02:01:00.467897 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-combined-ca-bundle\") pod \"keystone-cron-29561881-fpzd5\" (UID: \"0e541f43-cda6-4951-a0cd-f77cb49018fd\") " pod="openstack/keystone-cron-29561881-fpzd5" Mar 17 02:01:00 crc kubenswrapper[4755]: I0317 02:01:00.559799 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-fernet-keys\") pod \"keystone-cron-29561881-fpzd5\" (UID: \"0e541f43-cda6-4951-a0cd-f77cb49018fd\") " pod="openstack/keystone-cron-29561881-fpzd5" Mar 17 02:01:00 crc kubenswrapper[4755]: I0317 02:01:00.563186 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wpt8\" (UniqueName: 
\"kubernetes.io/projected/0e541f43-cda6-4951-a0cd-f77cb49018fd-kube-api-access-7wpt8\") pod \"keystone-cron-29561881-fpzd5\" (UID: \"0e541f43-cda6-4951-a0cd-f77cb49018fd\") " pod="openstack/keystone-cron-29561881-fpzd5" Mar 17 02:01:00 crc kubenswrapper[4755]: I0317 02:01:00.571089 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-config-data\") pod \"keystone-cron-29561881-fpzd5\" (UID: \"0e541f43-cda6-4951-a0cd-f77cb49018fd\") " pod="openstack/keystone-cron-29561881-fpzd5" Mar 17 02:01:00 crc kubenswrapper[4755]: I0317 02:01:00.576578 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-combined-ca-bundle\") pod \"keystone-cron-29561881-fpzd5\" (UID: \"0e541f43-cda6-4951-a0cd-f77cb49018fd\") " pod="openstack/keystone-cron-29561881-fpzd5" Mar 17 02:01:00 crc kubenswrapper[4755]: I0317 02:01:00.809710 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29561881-fpzd5" Mar 17 02:01:06 crc kubenswrapper[4755]: I0317 02:01:06.827105 4755 scope.go:117] "RemoveContainer" containerID="debc59f39976a4d9a1da346ac8b2ce9129d3489e99b37af27d558842e1ab5f33" Mar 17 02:01:18 crc kubenswrapper[4755]: I0317 02:01:18.178596 4755 scope.go:117] "RemoveContainer" containerID="ea4fc3441a28807fe50c272d91fe5f43e33cbdbb7710c8a53085f0979ea95670" Mar 17 02:01:18 crc kubenswrapper[4755]: E0317 02:01:18.267604 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 17 02:01:18 crc kubenswrapper[4755]: E0317 02:01:18.272252 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/temp
est/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zwxzg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Mar 17 02:01:18 crc kubenswrapper[4755]: E0317 02:01:18.274153 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d" Mar 17 02:01:18 crc kubenswrapper[4755]: E0317 02:01:18.386925 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d" Mar 17 02:01:18 crc kubenswrapper[4755]: I0317 02:01:18.731133 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29561881-fpzd5"] Mar 17 02:01:18 crc kubenswrapper[4755]: W0317 02:01:18.741327 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e541f43_cda6_4951_a0cd_f77cb49018fd.slice/crio-9780280ea77f11cc36b8af2d8e41efa290f4573350bdee43904347aa9ed375d0 WatchSource:0}: Error finding container 9780280ea77f11cc36b8af2d8e41efa290f4573350bdee43904347aa9ed375d0: Status 404 returned error can't find the container with id 9780280ea77f11cc36b8af2d8e41efa290f4573350bdee43904347aa9ed375d0 Mar 17 02:01:19 crc kubenswrapper[4755]: I0317 02:01:19.400087 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29561881-fpzd5" event={"ID":"0e541f43-cda6-4951-a0cd-f77cb49018fd","Type":"ContainerStarted","Data":"bc7ffa377952236e84eb613ae5accb4e0e3e4b2c62ba6035e39554bf21b80407"} Mar 17 02:01:19 crc kubenswrapper[4755]: I0317 02:01:19.400146 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29561881-fpzd5" 
event={"ID":"0e541f43-cda6-4951-a0cd-f77cb49018fd","Type":"ContainerStarted","Data":"9780280ea77f11cc36b8af2d8e41efa290f4573350bdee43904347aa9ed375d0"} Mar 17 02:01:19 crc kubenswrapper[4755]: I0317 02:01:19.431112 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29561881-fpzd5" podStartSLOduration=19.431089902 podStartE2EDuration="19.431089902s" podCreationTimestamp="2026-03-17 02:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 02:01:19.417154303 +0000 UTC m=+5954.176606596" watchObservedRunningTime="2026-03-17 02:01:19.431089902 +0000 UTC m=+5954.190542205" Mar 17 02:01:23 crc kubenswrapper[4755]: I0317 02:01:23.449311 4755 generic.go:334] "Generic (PLEG): container finished" podID="0e541f43-cda6-4951-a0cd-f77cb49018fd" containerID="bc7ffa377952236e84eb613ae5accb4e0e3e4b2c62ba6035e39554bf21b80407" exitCode=0 Mar 17 02:01:23 crc kubenswrapper[4755]: I0317 02:01:23.450286 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29561881-fpzd5" event={"ID":"0e541f43-cda6-4951-a0cd-f77cb49018fd","Type":"ContainerDied","Data":"bc7ffa377952236e84eb613ae5accb4e0e3e4b2c62ba6035e39554bf21b80407"} Mar 17 02:01:24 crc kubenswrapper[4755]: I0317 02:01:24.888323 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29561881-fpzd5" Mar 17 02:01:24 crc kubenswrapper[4755]: I0317 02:01:24.990167 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-fernet-keys\") pod \"0e541f43-cda6-4951-a0cd-f77cb49018fd\" (UID: \"0e541f43-cda6-4951-a0cd-f77cb49018fd\") " Mar 17 02:01:24 crc kubenswrapper[4755]: I0317 02:01:24.990395 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wpt8\" (UniqueName: \"kubernetes.io/projected/0e541f43-cda6-4951-a0cd-f77cb49018fd-kube-api-access-7wpt8\") pod \"0e541f43-cda6-4951-a0cd-f77cb49018fd\" (UID: \"0e541f43-cda6-4951-a0cd-f77cb49018fd\") " Mar 17 02:01:24 crc kubenswrapper[4755]: I0317 02:01:24.990586 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-combined-ca-bundle\") pod \"0e541f43-cda6-4951-a0cd-f77cb49018fd\" (UID: \"0e541f43-cda6-4951-a0cd-f77cb49018fd\") " Mar 17 02:01:24 crc kubenswrapper[4755]: I0317 02:01:24.990644 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-config-data\") pod \"0e541f43-cda6-4951-a0cd-f77cb49018fd\" (UID: \"0e541f43-cda6-4951-a0cd-f77cb49018fd\") " Mar 17 02:01:25 crc kubenswrapper[4755]: I0317 02:01:25.000612 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0e541f43-cda6-4951-a0cd-f77cb49018fd" (UID: "0e541f43-cda6-4951-a0cd-f77cb49018fd"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:01:25 crc kubenswrapper[4755]: I0317 02:01:25.001418 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e541f43-cda6-4951-a0cd-f77cb49018fd-kube-api-access-7wpt8" (OuterVolumeSpecName: "kube-api-access-7wpt8") pod "0e541f43-cda6-4951-a0cd-f77cb49018fd" (UID: "0e541f43-cda6-4951-a0cd-f77cb49018fd"). InnerVolumeSpecName "kube-api-access-7wpt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:01:25 crc kubenswrapper[4755]: I0317 02:01:25.038742 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e541f43-cda6-4951-a0cd-f77cb49018fd" (UID: "0e541f43-cda6-4951-a0cd-f77cb49018fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:01:25 crc kubenswrapper[4755]: I0317 02:01:25.094485 4755 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 17 02:01:25 crc kubenswrapper[4755]: I0317 02:01:25.094545 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wpt8\" (UniqueName: \"kubernetes.io/projected/0e541f43-cda6-4951-a0cd-f77cb49018fd-kube-api-access-7wpt8\") on node \"crc\" DevicePath \"\"" Mar 17 02:01:25 crc kubenswrapper[4755]: I0317 02:01:25.094567 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 02:01:25 crc kubenswrapper[4755]: I0317 02:01:25.105465 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-config-data" 
(OuterVolumeSpecName: "config-data") pod "0e541f43-cda6-4951-a0cd-f77cb49018fd" (UID: "0e541f43-cda6-4951-a0cd-f77cb49018fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:01:25 crc kubenswrapper[4755]: I0317 02:01:25.196736 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e541f43-cda6-4951-a0cd-f77cb49018fd-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 02:01:25 crc kubenswrapper[4755]: I0317 02:01:25.513337 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29561881-fpzd5" event={"ID":"0e541f43-cda6-4951-a0cd-f77cb49018fd","Type":"ContainerDied","Data":"9780280ea77f11cc36b8af2d8e41efa290f4573350bdee43904347aa9ed375d0"} Mar 17 02:01:25 crc kubenswrapper[4755]: I0317 02:01:25.513632 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9780280ea77f11cc36b8af2d8e41efa290f4573350bdee43904347aa9ed375d0" Mar 17 02:01:25 crc kubenswrapper[4755]: I0317 02:01:25.513393 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29561881-fpzd5" Mar 17 02:01:28 crc kubenswrapper[4755]: I0317 02:01:28.665911 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:01:28 crc kubenswrapper[4755]: I0317 02:01:28.666671 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:01:28 crc kubenswrapper[4755]: I0317 02:01:28.666735 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 02:01:28 crc kubenswrapper[4755]: I0317 02:01:28.667817 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:01:28 crc kubenswrapper[4755]: I0317 02:01:28.667888 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" gracePeriod=600 Mar 17 02:01:28 crc kubenswrapper[4755]: E0317 02:01:28.795892 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:01:29 crc kubenswrapper[4755]: I0317 02:01:29.577074 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" exitCode=0 Mar 17 02:01:29 crc kubenswrapper[4755]: I0317 02:01:29.577175 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8"} Mar 17 02:01:29 crc kubenswrapper[4755]: I0317 02:01:29.577420 4755 scope.go:117] "RemoveContainer" containerID="3741dc6ea40e99d3d3389ff9c8094526a2e12c8a81893f627dd1c394696b2190" Mar 17 02:01:29 crc kubenswrapper[4755]: I0317 02:01:29.578423 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:01:29 crc kubenswrapper[4755]: E0317 02:01:29.578798 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:01:30 crc kubenswrapper[4755]: I0317 02:01:30.825535 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 17 02:01:33 
crc kubenswrapper[4755]: I0317 02:01:33.637760 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d","Type":"ContainerStarted","Data":"db20154150d812ff9e979bf4b4105a2a23f652443d68ed255fa3cfc56e3220a9"} Mar 17 02:01:33 crc kubenswrapper[4755]: I0317 02:01:33.666013 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=5.744056746 podStartE2EDuration="47.665987759s" podCreationTimestamp="2026-03-17 02:00:46 +0000 UTC" firstStartedPulling="2026-03-17 02:00:48.899399232 +0000 UTC m=+5923.658851515" lastFinishedPulling="2026-03-17 02:01:30.821330215 +0000 UTC m=+5965.580782528" observedRunningTime="2026-03-17 02:01:33.660900605 +0000 UTC m=+5968.420352928" watchObservedRunningTime="2026-03-17 02:01:33.665987759 +0000 UTC m=+5968.425440082" Mar 17 02:01:43 crc kubenswrapper[4755]: I0317 02:01:43.248246 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:01:43 crc kubenswrapper[4755]: E0317 02:01:43.249143 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:01:55 crc kubenswrapper[4755]: I0317 02:01:55.249651 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:01:55 crc kubenswrapper[4755]: E0317 02:01:55.251246 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:02:00 crc kubenswrapper[4755]: I0317 02:02:00.176413 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561882-c7lxm"] Mar 17 02:02:00 crc kubenswrapper[4755]: E0317 02:02:00.177366 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e541f43-cda6-4951-a0cd-f77cb49018fd" containerName="keystone-cron" Mar 17 02:02:00 crc kubenswrapper[4755]: I0317 02:02:00.177378 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e541f43-cda6-4951-a0cd-f77cb49018fd" containerName="keystone-cron" Mar 17 02:02:00 crc kubenswrapper[4755]: I0317 02:02:00.177640 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e541f43-cda6-4951-a0cd-f77cb49018fd" containerName="keystone-cron" Mar 17 02:02:00 crc kubenswrapper[4755]: I0317 02:02:00.178379 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561882-c7lxm" Mar 17 02:02:00 crc kubenswrapper[4755]: I0317 02:02:00.181328 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:02:00 crc kubenswrapper[4755]: I0317 02:02:00.181709 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:02:00 crc kubenswrapper[4755]: I0317 02:02:00.181751 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:02:00 crc kubenswrapper[4755]: I0317 02:02:00.199884 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561882-c7lxm"] Mar 17 02:02:00 crc kubenswrapper[4755]: I0317 02:02:00.339629 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmzlv\" (UniqueName: \"kubernetes.io/projected/f6e5a5f5-5a38-4a3a-b882-a13045c27110-kube-api-access-mmzlv\") pod \"auto-csr-approver-29561882-c7lxm\" (UID: \"f6e5a5f5-5a38-4a3a-b882-a13045c27110\") " pod="openshift-infra/auto-csr-approver-29561882-c7lxm" Mar 17 02:02:00 crc kubenswrapper[4755]: I0317 02:02:00.441962 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmzlv\" (UniqueName: \"kubernetes.io/projected/f6e5a5f5-5a38-4a3a-b882-a13045c27110-kube-api-access-mmzlv\") pod \"auto-csr-approver-29561882-c7lxm\" (UID: \"f6e5a5f5-5a38-4a3a-b882-a13045c27110\") " pod="openshift-infra/auto-csr-approver-29561882-c7lxm" Mar 17 02:02:00 crc kubenswrapper[4755]: I0317 02:02:00.467228 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmzlv\" (UniqueName: \"kubernetes.io/projected/f6e5a5f5-5a38-4a3a-b882-a13045c27110-kube-api-access-mmzlv\") pod \"auto-csr-approver-29561882-c7lxm\" (UID: \"f6e5a5f5-5a38-4a3a-b882-a13045c27110\") " 
pod="openshift-infra/auto-csr-approver-29561882-c7lxm" Mar 17 02:02:00 crc kubenswrapper[4755]: I0317 02:02:00.550606 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561882-c7lxm" Mar 17 02:02:01 crc kubenswrapper[4755]: I0317 02:02:01.039779 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561882-c7lxm"] Mar 17 02:02:02 crc kubenswrapper[4755]: I0317 02:02:02.051657 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561882-c7lxm" event={"ID":"f6e5a5f5-5a38-4a3a-b882-a13045c27110","Type":"ContainerStarted","Data":"307e7ef8e549b2557c412577457b6576d3e860ea77d1fa2b47bd3334eaff8c51"} Mar 17 02:02:03 crc kubenswrapper[4755]: I0317 02:02:03.061089 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561882-c7lxm" event={"ID":"f6e5a5f5-5a38-4a3a-b882-a13045c27110","Type":"ContainerStarted","Data":"48e8343704e8f9decd51d4fd1dcc371abcf1629189c6e7e8d04f6097c47f939f"} Mar 17 02:02:03 crc kubenswrapper[4755]: I0317 02:02:03.077413 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561882-c7lxm" podStartSLOduration=1.827106144 podStartE2EDuration="3.077396058s" podCreationTimestamp="2026-03-17 02:02:00 +0000 UTC" firstStartedPulling="2026-03-17 02:02:01.039292091 +0000 UTC m=+5995.798744394" lastFinishedPulling="2026-03-17 02:02:02.289581985 +0000 UTC m=+5997.049034308" observedRunningTime="2026-03-17 02:02:03.075837776 +0000 UTC m=+5997.835290069" watchObservedRunningTime="2026-03-17 02:02:03.077396058 +0000 UTC m=+5997.836848341" Mar 17 02:02:05 crc kubenswrapper[4755]: I0317 02:02:05.104862 4755 generic.go:334] "Generic (PLEG): container finished" podID="f6e5a5f5-5a38-4a3a-b882-a13045c27110" containerID="48e8343704e8f9decd51d4fd1dcc371abcf1629189c6e7e8d04f6097c47f939f" exitCode=0 Mar 17 02:02:05 crc 
kubenswrapper[4755]: I0317 02:02:05.104922 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561882-c7lxm" event={"ID":"f6e5a5f5-5a38-4a3a-b882-a13045c27110","Type":"ContainerDied","Data":"48e8343704e8f9decd51d4fd1dcc371abcf1629189c6e7e8d04f6097c47f939f"} Mar 17 02:02:06 crc kubenswrapper[4755]: I0317 02:02:06.560679 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561882-c7lxm" Mar 17 02:02:06 crc kubenswrapper[4755]: I0317 02:02:06.715024 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmzlv\" (UniqueName: \"kubernetes.io/projected/f6e5a5f5-5a38-4a3a-b882-a13045c27110-kube-api-access-mmzlv\") pod \"f6e5a5f5-5a38-4a3a-b882-a13045c27110\" (UID: \"f6e5a5f5-5a38-4a3a-b882-a13045c27110\") " Mar 17 02:02:06 crc kubenswrapper[4755]: I0317 02:02:06.721724 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6e5a5f5-5a38-4a3a-b882-a13045c27110-kube-api-access-mmzlv" (OuterVolumeSpecName: "kube-api-access-mmzlv") pod "f6e5a5f5-5a38-4a3a-b882-a13045c27110" (UID: "f6e5a5f5-5a38-4a3a-b882-a13045c27110"). InnerVolumeSpecName "kube-api-access-mmzlv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:02:06 crc kubenswrapper[4755]: I0317 02:02:06.818770 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmzlv\" (UniqueName: \"kubernetes.io/projected/f6e5a5f5-5a38-4a3a-b882-a13045c27110-kube-api-access-mmzlv\") on node \"crc\" DevicePath \"\"" Mar 17 02:02:07 crc kubenswrapper[4755]: I0317 02:02:07.126729 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561882-c7lxm" event={"ID":"f6e5a5f5-5a38-4a3a-b882-a13045c27110","Type":"ContainerDied","Data":"307e7ef8e549b2557c412577457b6576d3e860ea77d1fa2b47bd3334eaff8c51"} Mar 17 02:02:07 crc kubenswrapper[4755]: I0317 02:02:07.126768 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="307e7ef8e549b2557c412577457b6576d3e860ea77d1fa2b47bd3334eaff8c51" Mar 17 02:02:07 crc kubenswrapper[4755]: I0317 02:02:07.126789 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561882-c7lxm" Mar 17 02:02:07 crc kubenswrapper[4755]: I0317 02:02:07.195908 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561876-sh2s9"] Mar 17 02:02:07 crc kubenswrapper[4755]: I0317 02:02:07.210251 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561876-sh2s9"] Mar 17 02:02:08 crc kubenswrapper[4755]: I0317 02:02:08.290151 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54f9a408-67e3-4bc3-9638-46fe78f7757f" path="/var/lib/kubelet/pods/54f9a408-67e3-4bc3-9638-46fe78f7757f/volumes" Mar 17 02:02:09 crc kubenswrapper[4755]: I0317 02:02:09.249955 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:02:09 crc kubenswrapper[4755]: E0317 02:02:09.250825 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:02:18 crc kubenswrapper[4755]: I0317 02:02:18.497297 4755 scope.go:117] "RemoveContainer" containerID="5a29debe2a5d275ca8e1d91f7aadb8593d31a174918d0bb8d40ecc7fee109552" Mar 17 02:02:23 crc kubenswrapper[4755]: I0317 02:02:23.248962 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:02:23 crc kubenswrapper[4755]: E0317 02:02:23.249957 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:02:35 crc kubenswrapper[4755]: I0317 02:02:35.248476 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:02:35 crc kubenswrapper[4755]: E0317 02:02:35.249347 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:02:46 crc kubenswrapper[4755]: I0317 02:02:46.262868 4755 scope.go:117] "RemoveContainer" 
containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:02:46 crc kubenswrapper[4755]: E0317 02:02:46.263693 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:03:00 crc kubenswrapper[4755]: I0317 02:03:00.248546 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:03:00 crc kubenswrapper[4755]: E0317 02:03:00.249272 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:03:14 crc kubenswrapper[4755]: I0317 02:03:14.249327 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:03:14 crc kubenswrapper[4755]: E0317 02:03:14.249944 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:03:28 crc kubenswrapper[4755]: I0317 02:03:28.248796 4755 scope.go:117] 
"RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:03:28 crc kubenswrapper[4755]: E0317 02:03:28.249590 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:03:43 crc kubenswrapper[4755]: I0317 02:03:43.247773 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:03:43 crc kubenswrapper[4755]: E0317 02:03:43.250311 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:03:56 crc kubenswrapper[4755]: I0317 02:03:56.257556 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:03:56 crc kubenswrapper[4755]: E0317 02:03:56.258480 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:04:00 crc kubenswrapper[4755]: I0317 02:04:00.233027 
4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561884-kqm5w"] Mar 17 02:04:00 crc kubenswrapper[4755]: E0317 02:04:00.234033 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6e5a5f5-5a38-4a3a-b882-a13045c27110" containerName="oc" Mar 17 02:04:00 crc kubenswrapper[4755]: I0317 02:04:00.234050 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6e5a5f5-5a38-4a3a-b882-a13045c27110" containerName="oc" Mar 17 02:04:00 crc kubenswrapper[4755]: I0317 02:04:00.234345 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6e5a5f5-5a38-4a3a-b882-a13045c27110" containerName="oc" Mar 17 02:04:00 crc kubenswrapper[4755]: I0317 02:04:00.235260 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561884-kqm5w" Mar 17 02:04:00 crc kubenswrapper[4755]: I0317 02:04:00.238183 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:04:00 crc kubenswrapper[4755]: I0317 02:04:00.238593 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:04:00 crc kubenswrapper[4755]: I0317 02:04:00.238777 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:04:00 crc kubenswrapper[4755]: I0317 02:04:00.246753 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561884-kqm5w"] Mar 17 02:04:00 crc kubenswrapper[4755]: I0317 02:04:00.355094 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxt7p\" (UniqueName: \"kubernetes.io/projected/db6aa5b5-b2a0-49b9-bc43-96971c49e39e-kube-api-access-qxt7p\") pod \"auto-csr-approver-29561884-kqm5w\" (UID: \"db6aa5b5-b2a0-49b9-bc43-96971c49e39e\") " 
pod="openshift-infra/auto-csr-approver-29561884-kqm5w" Mar 17 02:04:00 crc kubenswrapper[4755]: I0317 02:04:00.457401 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxt7p\" (UniqueName: \"kubernetes.io/projected/db6aa5b5-b2a0-49b9-bc43-96971c49e39e-kube-api-access-qxt7p\") pod \"auto-csr-approver-29561884-kqm5w\" (UID: \"db6aa5b5-b2a0-49b9-bc43-96971c49e39e\") " pod="openshift-infra/auto-csr-approver-29561884-kqm5w" Mar 17 02:04:00 crc kubenswrapper[4755]: I0317 02:04:00.509531 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxt7p\" (UniqueName: \"kubernetes.io/projected/db6aa5b5-b2a0-49b9-bc43-96971c49e39e-kube-api-access-qxt7p\") pod \"auto-csr-approver-29561884-kqm5w\" (UID: \"db6aa5b5-b2a0-49b9-bc43-96971c49e39e\") " pod="openshift-infra/auto-csr-approver-29561884-kqm5w" Mar 17 02:04:00 crc kubenswrapper[4755]: I0317 02:04:00.557641 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561884-kqm5w" Mar 17 02:04:01 crc kubenswrapper[4755]: I0317 02:04:01.726728 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561884-kqm5w"] Mar 17 02:04:02 crc kubenswrapper[4755]: I0317 02:04:02.355327 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561884-kqm5w" event={"ID":"db6aa5b5-b2a0-49b9-bc43-96971c49e39e","Type":"ContainerStarted","Data":"eeffb22073cdcde8d331fb5be3b491cc3aee8d7bd4aeafd80a6c5bbbfccef1a3"} Mar 17 02:04:04 crc kubenswrapper[4755]: I0317 02:04:04.377154 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561884-kqm5w" event={"ID":"db6aa5b5-b2a0-49b9-bc43-96971c49e39e","Type":"ContainerStarted","Data":"adf3874ac3881dc28eff51caf7e92876b6ca9d483ce37aef855615ef6a5b3ddf"} Mar 17 02:04:04 crc kubenswrapper[4755]: I0317 02:04:04.412116 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561884-kqm5w" podStartSLOduration=3.53139463 podStartE2EDuration="4.41209468s" podCreationTimestamp="2026-03-17 02:04:00 +0000 UTC" firstStartedPulling="2026-03-17 02:04:01.720124728 +0000 UTC m=+6116.479577011" lastFinishedPulling="2026-03-17 02:04:02.600824778 +0000 UTC m=+6117.360277061" observedRunningTime="2026-03-17 02:04:04.392165062 +0000 UTC m=+6119.151617345" watchObservedRunningTime="2026-03-17 02:04:04.41209468 +0000 UTC m=+6119.171546963" Mar 17 02:04:05 crc kubenswrapper[4755]: I0317 02:04:05.386049 4755 generic.go:334] "Generic (PLEG): container finished" podID="db6aa5b5-b2a0-49b9-bc43-96971c49e39e" containerID="adf3874ac3881dc28eff51caf7e92876b6ca9d483ce37aef855615ef6a5b3ddf" exitCode=0 Mar 17 02:04:05 crc kubenswrapper[4755]: I0317 02:04:05.386158 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561884-kqm5w" event={"ID":"db6aa5b5-b2a0-49b9-bc43-96971c49e39e","Type":"ContainerDied","Data":"adf3874ac3881dc28eff51caf7e92876b6ca9d483ce37aef855615ef6a5b3ddf"} Mar 17 02:04:07 crc kubenswrapper[4755]: I0317 02:04:07.053193 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561884-kqm5w" Mar 17 02:04:07 crc kubenswrapper[4755]: I0317 02:04:07.112694 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxt7p\" (UniqueName: \"kubernetes.io/projected/db6aa5b5-b2a0-49b9-bc43-96971c49e39e-kube-api-access-qxt7p\") pod \"db6aa5b5-b2a0-49b9-bc43-96971c49e39e\" (UID: \"db6aa5b5-b2a0-49b9-bc43-96971c49e39e\") " Mar 17 02:04:07 crc kubenswrapper[4755]: I0317 02:04:07.152698 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db6aa5b5-b2a0-49b9-bc43-96971c49e39e-kube-api-access-qxt7p" (OuterVolumeSpecName: "kube-api-access-qxt7p") pod "db6aa5b5-b2a0-49b9-bc43-96971c49e39e" (UID: "db6aa5b5-b2a0-49b9-bc43-96971c49e39e"). InnerVolumeSpecName "kube-api-access-qxt7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:04:07 crc kubenswrapper[4755]: I0317 02:04:07.216675 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxt7p\" (UniqueName: \"kubernetes.io/projected/db6aa5b5-b2a0-49b9-bc43-96971c49e39e-kube-api-access-qxt7p\") on node \"crc\" DevicePath \"\"" Mar 17 02:04:07 crc kubenswrapper[4755]: I0317 02:04:07.406177 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561884-kqm5w" event={"ID":"db6aa5b5-b2a0-49b9-bc43-96971c49e39e","Type":"ContainerDied","Data":"eeffb22073cdcde8d331fb5be3b491cc3aee8d7bd4aeafd80a6c5bbbfccef1a3"} Mar 17 02:04:07 crc kubenswrapper[4755]: I0317 02:04:07.406216 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eeffb22073cdcde8d331fb5be3b491cc3aee8d7bd4aeafd80a6c5bbbfccef1a3" Mar 17 02:04:07 crc kubenswrapper[4755]: I0317 02:04:07.406232 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561884-kqm5w" Mar 17 02:04:08 crc kubenswrapper[4755]: I0317 02:04:08.149196 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561878-8dp55"] Mar 17 02:04:08 crc kubenswrapper[4755]: I0317 02:04:08.167564 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561878-8dp55"] Mar 17 02:04:08 crc kubenswrapper[4755]: I0317 02:04:08.261858 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c07e080-a2ca-4991-b83a-3c7f40325574" path="/var/lib/kubelet/pods/9c07e080-a2ca-4991-b83a-3c7f40325574/volumes" Mar 17 02:04:09 crc kubenswrapper[4755]: I0317 02:04:09.248753 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:04:09 crc kubenswrapper[4755]: E0317 02:04:09.249372 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:04:18 crc kubenswrapper[4755]: I0317 02:04:18.707858 4755 scope.go:117] "RemoveContainer" containerID="8619e1e5097e43b0e53fd0378cdcd7c4e1a661bb365e1b56723ccb4fefcc046a" Mar 17 02:04:23 crc kubenswrapper[4755]: I0317 02:04:23.248373 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:04:23 crc kubenswrapper[4755]: E0317 02:04:23.249203 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:04:34 crc kubenswrapper[4755]: I0317 02:04:34.248701 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:04:34 crc kubenswrapper[4755]: E0317 02:04:34.249470 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:04:45 crc kubenswrapper[4755]: I0317 02:04:45.460385 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6jr92"] Mar 17 02:04:45 crc kubenswrapper[4755]: E0317 02:04:45.461400 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db6aa5b5-b2a0-49b9-bc43-96971c49e39e" containerName="oc" Mar 17 02:04:45 crc kubenswrapper[4755]: I0317 02:04:45.461414 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="db6aa5b5-b2a0-49b9-bc43-96971c49e39e" containerName="oc" Mar 17 02:04:45 crc kubenswrapper[4755]: I0317 02:04:45.461637 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="db6aa5b5-b2a0-49b9-bc43-96971c49e39e" containerName="oc" Mar 17 02:04:45 crc kubenswrapper[4755]: I0317 02:04:45.463355 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6jr92" Mar 17 02:04:45 crc kubenswrapper[4755]: I0317 02:04:45.485738 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6jr92"] Mar 17 02:04:45 crc kubenswrapper[4755]: I0317 02:04:45.577825 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40279090-267f-429e-b7d8-6b96b785302e-utilities\") pod \"certified-operators-6jr92\" (UID: \"40279090-267f-429e-b7d8-6b96b785302e\") " pod="openshift-marketplace/certified-operators-6jr92" Mar 17 02:04:45 crc kubenswrapper[4755]: I0317 02:04:45.577901 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40279090-267f-429e-b7d8-6b96b785302e-catalog-content\") pod \"certified-operators-6jr92\" (UID: \"40279090-267f-429e-b7d8-6b96b785302e\") " pod="openshift-marketplace/certified-operators-6jr92" Mar 17 02:04:45 crc kubenswrapper[4755]: I0317 02:04:45.578160 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4jf4\" (UniqueName: \"kubernetes.io/projected/40279090-267f-429e-b7d8-6b96b785302e-kube-api-access-m4jf4\") pod \"certified-operators-6jr92\" (UID: \"40279090-267f-429e-b7d8-6b96b785302e\") " pod="openshift-marketplace/certified-operators-6jr92" Mar 17 02:04:45 crc kubenswrapper[4755]: I0317 02:04:45.680279 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40279090-267f-429e-b7d8-6b96b785302e-catalog-content\") pod \"certified-operators-6jr92\" (UID: \"40279090-267f-429e-b7d8-6b96b785302e\") " pod="openshift-marketplace/certified-operators-6jr92" Mar 17 02:04:45 crc kubenswrapper[4755]: I0317 02:04:45.680376 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m4jf4\" (UniqueName: \"kubernetes.io/projected/40279090-267f-429e-b7d8-6b96b785302e-kube-api-access-m4jf4\") pod \"certified-operators-6jr92\" (UID: \"40279090-267f-429e-b7d8-6b96b785302e\") " pod="openshift-marketplace/certified-operators-6jr92" Mar 17 02:04:45 crc kubenswrapper[4755]: I0317 02:04:45.680522 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40279090-267f-429e-b7d8-6b96b785302e-utilities\") pod \"certified-operators-6jr92\" (UID: \"40279090-267f-429e-b7d8-6b96b785302e\") " pod="openshift-marketplace/certified-operators-6jr92" Mar 17 02:04:45 crc kubenswrapper[4755]: I0317 02:04:45.680956 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40279090-267f-429e-b7d8-6b96b785302e-catalog-content\") pod \"certified-operators-6jr92\" (UID: \"40279090-267f-429e-b7d8-6b96b785302e\") " pod="openshift-marketplace/certified-operators-6jr92" Mar 17 02:04:45 crc kubenswrapper[4755]: I0317 02:04:45.680974 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40279090-267f-429e-b7d8-6b96b785302e-utilities\") pod \"certified-operators-6jr92\" (UID: \"40279090-267f-429e-b7d8-6b96b785302e\") " pod="openshift-marketplace/certified-operators-6jr92" Mar 17 02:04:45 crc kubenswrapper[4755]: I0317 02:04:45.707793 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4jf4\" (UniqueName: \"kubernetes.io/projected/40279090-267f-429e-b7d8-6b96b785302e-kube-api-access-m4jf4\") pod \"certified-operators-6jr92\" (UID: \"40279090-267f-429e-b7d8-6b96b785302e\") " pod="openshift-marketplace/certified-operators-6jr92" Mar 17 02:04:45 crc kubenswrapper[4755]: I0317 02:04:45.787073 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6jr92" Mar 17 02:04:46 crc kubenswrapper[4755]: I0317 02:04:46.320373 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6jr92"] Mar 17 02:04:46 crc kubenswrapper[4755]: I0317 02:04:46.868396 4755 generic.go:334] "Generic (PLEG): container finished" podID="40279090-267f-429e-b7d8-6b96b785302e" containerID="a17ab06bd167b3adede7a6a8c497ebcbccfd82201b5334bc107f5c2ba0f009b7" exitCode=0 Mar 17 02:04:46 crc kubenswrapper[4755]: I0317 02:04:46.868578 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jr92" event={"ID":"40279090-267f-429e-b7d8-6b96b785302e","Type":"ContainerDied","Data":"a17ab06bd167b3adede7a6a8c497ebcbccfd82201b5334bc107f5c2ba0f009b7"} Mar 17 02:04:46 crc kubenswrapper[4755]: I0317 02:04:46.870032 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jr92" event={"ID":"40279090-267f-429e-b7d8-6b96b785302e","Type":"ContainerStarted","Data":"bdde186f7edcfda26678579ace634c9142aae62c292de068954ece2e3d55f74f"} Mar 17 02:04:47 crc kubenswrapper[4755]: I0317 02:04:47.883062 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jr92" event={"ID":"40279090-267f-429e-b7d8-6b96b785302e","Type":"ContainerStarted","Data":"d0354b1dd59b421e363342008772d078cf69c57bdb8382a58a2e1f55fabd54a3"} Mar 17 02:04:48 crc kubenswrapper[4755]: I0317 02:04:48.231311 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ln495"] Mar 17 02:04:48 crc kubenswrapper[4755]: I0317 02:04:48.234610 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ln495" Mar 17 02:04:48 crc kubenswrapper[4755]: I0317 02:04:48.243193 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ln495"] Mar 17 02:04:48 crc kubenswrapper[4755]: I0317 02:04:48.342823 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-utilities\") pod \"redhat-marketplace-ln495\" (UID: \"711f51a1-99cb-4b6b-aa3e-28a0244bb70b\") " pod="openshift-marketplace/redhat-marketplace-ln495" Mar 17 02:04:48 crc kubenswrapper[4755]: I0317 02:04:48.343046 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkd2l\" (UniqueName: \"kubernetes.io/projected/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-kube-api-access-pkd2l\") pod \"redhat-marketplace-ln495\" (UID: \"711f51a1-99cb-4b6b-aa3e-28a0244bb70b\") " pod="openshift-marketplace/redhat-marketplace-ln495" Mar 17 02:04:48 crc kubenswrapper[4755]: I0317 02:04:48.343433 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-catalog-content\") pod \"redhat-marketplace-ln495\" (UID: \"711f51a1-99cb-4b6b-aa3e-28a0244bb70b\") " pod="openshift-marketplace/redhat-marketplace-ln495" Mar 17 02:04:48 crc kubenswrapper[4755]: I0317 02:04:48.445353 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkd2l\" (UniqueName: \"kubernetes.io/projected/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-kube-api-access-pkd2l\") pod \"redhat-marketplace-ln495\" (UID: \"711f51a1-99cb-4b6b-aa3e-28a0244bb70b\") " pod="openshift-marketplace/redhat-marketplace-ln495" Mar 17 02:04:48 crc kubenswrapper[4755]: I0317 02:04:48.445553 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-catalog-content\") pod \"redhat-marketplace-ln495\" (UID: \"711f51a1-99cb-4b6b-aa3e-28a0244bb70b\") " pod="openshift-marketplace/redhat-marketplace-ln495" Mar 17 02:04:48 crc kubenswrapper[4755]: I0317 02:04:48.445695 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-utilities\") pod \"redhat-marketplace-ln495\" (UID: \"711f51a1-99cb-4b6b-aa3e-28a0244bb70b\") " pod="openshift-marketplace/redhat-marketplace-ln495" Mar 17 02:04:48 crc kubenswrapper[4755]: I0317 02:04:48.446590 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-catalog-content\") pod \"redhat-marketplace-ln495\" (UID: \"711f51a1-99cb-4b6b-aa3e-28a0244bb70b\") " pod="openshift-marketplace/redhat-marketplace-ln495" Mar 17 02:04:48 crc kubenswrapper[4755]: I0317 02:04:48.446649 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-utilities\") pod \"redhat-marketplace-ln495\" (UID: \"711f51a1-99cb-4b6b-aa3e-28a0244bb70b\") " pod="openshift-marketplace/redhat-marketplace-ln495" Mar 17 02:04:48 crc kubenswrapper[4755]: I0317 02:04:48.465302 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkd2l\" (UniqueName: \"kubernetes.io/projected/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-kube-api-access-pkd2l\") pod \"redhat-marketplace-ln495\" (UID: \"711f51a1-99cb-4b6b-aa3e-28a0244bb70b\") " pod="openshift-marketplace/redhat-marketplace-ln495" Mar 17 02:04:48 crc kubenswrapper[4755]: I0317 02:04:48.580945 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ln495" Mar 17 02:04:49 crc kubenswrapper[4755]: I0317 02:04:49.026562 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ln495"] Mar 17 02:04:49 crc kubenswrapper[4755]: W0317 02:04:49.030776 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod711f51a1_99cb_4b6b_aa3e_28a0244bb70b.slice/crio-cf4d8e21e4ddc7b808fec15a3168325bb11ea1698d6724f4fc5309fe588f436b WatchSource:0}: Error finding container cf4d8e21e4ddc7b808fec15a3168325bb11ea1698d6724f4fc5309fe588f436b: Status 404 returned error can't find the container with id cf4d8e21e4ddc7b808fec15a3168325bb11ea1698d6724f4fc5309fe588f436b Mar 17 02:04:49 crc kubenswrapper[4755]: I0317 02:04:49.248391 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:04:49 crc kubenswrapper[4755]: E0317 02:04:49.248789 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:04:49 crc kubenswrapper[4755]: I0317 02:04:49.902354 4755 generic.go:334] "Generic (PLEG): container finished" podID="711f51a1-99cb-4b6b-aa3e-28a0244bb70b" containerID="b4500f380ec42788f4f8d5eb7eca915ced3489d339497df6bcbed13bac9699c4" exitCode=0 Mar 17 02:04:49 crc kubenswrapper[4755]: I0317 02:04:49.902642 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ln495" 
event={"ID":"711f51a1-99cb-4b6b-aa3e-28a0244bb70b","Type":"ContainerDied","Data":"b4500f380ec42788f4f8d5eb7eca915ced3489d339497df6bcbed13bac9699c4"} Mar 17 02:04:49 crc kubenswrapper[4755]: I0317 02:04:49.903157 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ln495" event={"ID":"711f51a1-99cb-4b6b-aa3e-28a0244bb70b","Type":"ContainerStarted","Data":"cf4d8e21e4ddc7b808fec15a3168325bb11ea1698d6724f4fc5309fe588f436b"} Mar 17 02:04:49 crc kubenswrapper[4755]: I0317 02:04:49.906362 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 02:04:49 crc kubenswrapper[4755]: I0317 02:04:49.911291 4755 generic.go:334] "Generic (PLEG): container finished" podID="40279090-267f-429e-b7d8-6b96b785302e" containerID="d0354b1dd59b421e363342008772d078cf69c57bdb8382a58a2e1f55fabd54a3" exitCode=0 Mar 17 02:04:49 crc kubenswrapper[4755]: I0317 02:04:49.911334 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jr92" event={"ID":"40279090-267f-429e-b7d8-6b96b785302e","Type":"ContainerDied","Data":"d0354b1dd59b421e363342008772d078cf69c57bdb8382a58a2e1f55fabd54a3"} Mar 17 02:04:50 crc kubenswrapper[4755]: I0317 02:04:50.926354 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jr92" event={"ID":"40279090-267f-429e-b7d8-6b96b785302e","Type":"ContainerStarted","Data":"d54086b1935b0a3fe1a8def757cfd3610870eb31fb79c76ddc7a6055afba8178"} Mar 17 02:04:50 crc kubenswrapper[4755]: I0317 02:04:50.928377 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ln495" event={"ID":"711f51a1-99cb-4b6b-aa3e-28a0244bb70b","Type":"ContainerStarted","Data":"17256325f9ab5a2e332d8255264a1c8e4c750c660554fe03d2a33293d961b120"} Mar 17 02:04:50 crc kubenswrapper[4755]: I0317 02:04:50.951004 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-6jr92" podStartSLOduration=2.483852687 podStartE2EDuration="5.950983217s" podCreationTimestamp="2026-03-17 02:04:45 +0000 UTC" firstStartedPulling="2026-03-17 02:04:46.872334291 +0000 UTC m=+6161.631786584" lastFinishedPulling="2026-03-17 02:04:50.339464831 +0000 UTC m=+6165.098917114" observedRunningTime="2026-03-17 02:04:50.948920552 +0000 UTC m=+6165.708372845" watchObservedRunningTime="2026-03-17 02:04:50.950983217 +0000 UTC m=+6165.710435500" Mar 17 02:04:52 crc kubenswrapper[4755]: I0317 02:04:52.955146 4755 generic.go:334] "Generic (PLEG): container finished" podID="711f51a1-99cb-4b6b-aa3e-28a0244bb70b" containerID="17256325f9ab5a2e332d8255264a1c8e4c750c660554fe03d2a33293d961b120" exitCode=0 Mar 17 02:04:52 crc kubenswrapper[4755]: I0317 02:04:52.955210 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ln495" event={"ID":"711f51a1-99cb-4b6b-aa3e-28a0244bb70b","Type":"ContainerDied","Data":"17256325f9ab5a2e332d8255264a1c8e4c750c660554fe03d2a33293d961b120"} Mar 17 02:04:53 crc kubenswrapper[4755]: I0317 02:04:53.967554 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ln495" event={"ID":"711f51a1-99cb-4b6b-aa3e-28a0244bb70b","Type":"ContainerStarted","Data":"85a49550c7ea4c94ed04be7405b5e59a7c28857d189c90085e8d1f2b8d5c538a"} Mar 17 02:04:53 crc kubenswrapper[4755]: I0317 02:04:53.989248 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ln495" podStartSLOduration=2.526793019 podStartE2EDuration="5.989229755s" podCreationTimestamp="2026-03-17 02:04:48 +0000 UTC" firstStartedPulling="2026-03-17 02:04:49.905936386 +0000 UTC m=+6164.665388679" lastFinishedPulling="2026-03-17 02:04:53.368373122 +0000 UTC m=+6168.127825415" observedRunningTime="2026-03-17 02:04:53.986705468 +0000 UTC m=+6168.746157751" watchObservedRunningTime="2026-03-17 
02:04:53.989229755 +0000 UTC m=+6168.748682038" Mar 17 02:04:55 crc kubenswrapper[4755]: I0317 02:04:55.787252 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6jr92" Mar 17 02:04:55 crc kubenswrapper[4755]: I0317 02:04:55.787314 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6jr92" Mar 17 02:04:55 crc kubenswrapper[4755]: I0317 02:04:55.836223 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6jr92" Mar 17 02:04:56 crc kubenswrapper[4755]: I0317 02:04:56.070861 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6jr92" Mar 17 02:04:57 crc kubenswrapper[4755]: I0317 02:04:57.022771 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6jr92"] Mar 17 02:04:58 crc kubenswrapper[4755]: I0317 02:04:58.014923 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6jr92" podUID="40279090-267f-429e-b7d8-6b96b785302e" containerName="registry-server" containerID="cri-o://d54086b1935b0a3fe1a8def757cfd3610870eb31fb79c76ddc7a6055afba8178" gracePeriod=2 Mar 17 02:04:58 crc kubenswrapper[4755]: I0317 02:04:58.581788 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ln495" Mar 17 02:04:58 crc kubenswrapper[4755]: I0317 02:04:58.582048 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ln495" Mar 17 02:04:58 crc kubenswrapper[4755]: I0317 02:04:58.659016 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6jr92" Mar 17 02:04:58 crc kubenswrapper[4755]: I0317 02:04:58.792770 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40279090-267f-429e-b7d8-6b96b785302e-utilities\") pod \"40279090-267f-429e-b7d8-6b96b785302e\" (UID: \"40279090-267f-429e-b7d8-6b96b785302e\") " Mar 17 02:04:58 crc kubenswrapper[4755]: I0317 02:04:58.792837 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4jf4\" (UniqueName: \"kubernetes.io/projected/40279090-267f-429e-b7d8-6b96b785302e-kube-api-access-m4jf4\") pod \"40279090-267f-429e-b7d8-6b96b785302e\" (UID: \"40279090-267f-429e-b7d8-6b96b785302e\") " Mar 17 02:04:58 crc kubenswrapper[4755]: I0317 02:04:58.792966 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40279090-267f-429e-b7d8-6b96b785302e-catalog-content\") pod \"40279090-267f-429e-b7d8-6b96b785302e\" (UID: \"40279090-267f-429e-b7d8-6b96b785302e\") " Mar 17 02:04:58 crc kubenswrapper[4755]: I0317 02:04:58.794132 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40279090-267f-429e-b7d8-6b96b785302e-utilities" (OuterVolumeSpecName: "utilities") pod "40279090-267f-429e-b7d8-6b96b785302e" (UID: "40279090-267f-429e-b7d8-6b96b785302e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:04:58 crc kubenswrapper[4755]: I0317 02:04:58.803783 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40279090-267f-429e-b7d8-6b96b785302e-kube-api-access-m4jf4" (OuterVolumeSpecName: "kube-api-access-m4jf4") pod "40279090-267f-429e-b7d8-6b96b785302e" (UID: "40279090-267f-429e-b7d8-6b96b785302e"). InnerVolumeSpecName "kube-api-access-m4jf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:04:58 crc kubenswrapper[4755]: I0317 02:04:58.848554 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40279090-267f-429e-b7d8-6b96b785302e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40279090-267f-429e-b7d8-6b96b785302e" (UID: "40279090-267f-429e-b7d8-6b96b785302e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:04:58 crc kubenswrapper[4755]: I0317 02:04:58.895145 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40279090-267f-429e-b7d8-6b96b785302e-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:04:58 crc kubenswrapper[4755]: I0317 02:04:58.895183 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4jf4\" (UniqueName: \"kubernetes.io/projected/40279090-267f-429e-b7d8-6b96b785302e-kube-api-access-m4jf4\") on node \"crc\" DevicePath \"\"" Mar 17 02:04:58 crc kubenswrapper[4755]: I0317 02:04:58.895196 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40279090-267f-429e-b7d8-6b96b785302e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:04:59 crc kubenswrapper[4755]: I0317 02:04:59.026749 4755 generic.go:334] "Generic (PLEG): container finished" podID="40279090-267f-429e-b7d8-6b96b785302e" containerID="d54086b1935b0a3fe1a8def757cfd3610870eb31fb79c76ddc7a6055afba8178" exitCode=0 Mar 17 02:04:59 crc kubenswrapper[4755]: I0317 02:04:59.026800 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jr92" event={"ID":"40279090-267f-429e-b7d8-6b96b785302e","Type":"ContainerDied","Data":"d54086b1935b0a3fe1a8def757cfd3610870eb31fb79c76ddc7a6055afba8178"} Mar 17 02:04:59 crc kubenswrapper[4755]: I0317 02:04:59.026834 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-6jr92" event={"ID":"40279090-267f-429e-b7d8-6b96b785302e","Type":"ContainerDied","Data":"bdde186f7edcfda26678579ace634c9142aae62c292de068954ece2e3d55f74f"} Mar 17 02:04:59 crc kubenswrapper[4755]: I0317 02:04:59.026855 4755 scope.go:117] "RemoveContainer" containerID="d54086b1935b0a3fe1a8def757cfd3610870eb31fb79c76ddc7a6055afba8178" Mar 17 02:04:59 crc kubenswrapper[4755]: I0317 02:04:59.027093 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6jr92" Mar 17 02:04:59 crc kubenswrapper[4755]: I0317 02:04:59.049933 4755 scope.go:117] "RemoveContainer" containerID="d0354b1dd59b421e363342008772d078cf69c57bdb8382a58a2e1f55fabd54a3" Mar 17 02:04:59 crc kubenswrapper[4755]: I0317 02:04:59.065799 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6jr92"] Mar 17 02:04:59 crc kubenswrapper[4755]: I0317 02:04:59.077014 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6jr92"] Mar 17 02:04:59 crc kubenswrapper[4755]: I0317 02:04:59.085879 4755 scope.go:117] "RemoveContainer" containerID="a17ab06bd167b3adede7a6a8c497ebcbccfd82201b5334bc107f5c2ba0f009b7" Mar 17 02:04:59 crc kubenswrapper[4755]: I0317 02:04:59.139590 4755 scope.go:117] "RemoveContainer" containerID="d54086b1935b0a3fe1a8def757cfd3610870eb31fb79c76ddc7a6055afba8178" Mar 17 02:04:59 crc kubenswrapper[4755]: E0317 02:04:59.140513 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d54086b1935b0a3fe1a8def757cfd3610870eb31fb79c76ddc7a6055afba8178\": container with ID starting with d54086b1935b0a3fe1a8def757cfd3610870eb31fb79c76ddc7a6055afba8178 not found: ID does not exist" containerID="d54086b1935b0a3fe1a8def757cfd3610870eb31fb79c76ddc7a6055afba8178" Mar 17 02:04:59 crc kubenswrapper[4755]: I0317 
02:04:59.140544 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54086b1935b0a3fe1a8def757cfd3610870eb31fb79c76ddc7a6055afba8178"} err="failed to get container status \"d54086b1935b0a3fe1a8def757cfd3610870eb31fb79c76ddc7a6055afba8178\": rpc error: code = NotFound desc = could not find container \"d54086b1935b0a3fe1a8def757cfd3610870eb31fb79c76ddc7a6055afba8178\": container with ID starting with d54086b1935b0a3fe1a8def757cfd3610870eb31fb79c76ddc7a6055afba8178 not found: ID does not exist" Mar 17 02:04:59 crc kubenswrapper[4755]: I0317 02:04:59.140565 4755 scope.go:117] "RemoveContainer" containerID="d0354b1dd59b421e363342008772d078cf69c57bdb8382a58a2e1f55fabd54a3" Mar 17 02:04:59 crc kubenswrapper[4755]: E0317 02:04:59.140818 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0354b1dd59b421e363342008772d078cf69c57bdb8382a58a2e1f55fabd54a3\": container with ID starting with d0354b1dd59b421e363342008772d078cf69c57bdb8382a58a2e1f55fabd54a3 not found: ID does not exist" containerID="d0354b1dd59b421e363342008772d078cf69c57bdb8382a58a2e1f55fabd54a3" Mar 17 02:04:59 crc kubenswrapper[4755]: I0317 02:04:59.140922 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0354b1dd59b421e363342008772d078cf69c57bdb8382a58a2e1f55fabd54a3"} err="failed to get container status \"d0354b1dd59b421e363342008772d078cf69c57bdb8382a58a2e1f55fabd54a3\": rpc error: code = NotFound desc = could not find container \"d0354b1dd59b421e363342008772d078cf69c57bdb8382a58a2e1f55fabd54a3\": container with ID starting with d0354b1dd59b421e363342008772d078cf69c57bdb8382a58a2e1f55fabd54a3 not found: ID does not exist" Mar 17 02:04:59 crc kubenswrapper[4755]: I0317 02:04:59.141017 4755 scope.go:117] "RemoveContainer" containerID="a17ab06bd167b3adede7a6a8c497ebcbccfd82201b5334bc107f5c2ba0f009b7" Mar 17 02:04:59 crc 
kubenswrapper[4755]: E0317 02:04:59.141388 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a17ab06bd167b3adede7a6a8c497ebcbccfd82201b5334bc107f5c2ba0f009b7\": container with ID starting with a17ab06bd167b3adede7a6a8c497ebcbccfd82201b5334bc107f5c2ba0f009b7 not found: ID does not exist" containerID="a17ab06bd167b3adede7a6a8c497ebcbccfd82201b5334bc107f5c2ba0f009b7" Mar 17 02:04:59 crc kubenswrapper[4755]: I0317 02:04:59.141536 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a17ab06bd167b3adede7a6a8c497ebcbccfd82201b5334bc107f5c2ba0f009b7"} err="failed to get container status \"a17ab06bd167b3adede7a6a8c497ebcbccfd82201b5334bc107f5c2ba0f009b7\": rpc error: code = NotFound desc = could not find container \"a17ab06bd167b3adede7a6a8c497ebcbccfd82201b5334bc107f5c2ba0f009b7\": container with ID starting with a17ab06bd167b3adede7a6a8c497ebcbccfd82201b5334bc107f5c2ba0f009b7 not found: ID does not exist" Mar 17 02:04:59 crc kubenswrapper[4755]: I0317 02:04:59.645291 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-ln495" podUID="711f51a1-99cb-4b6b-aa3e-28a0244bb70b" containerName="registry-server" probeResult="failure" output=< Mar 17 02:04:59 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:04:59 crc kubenswrapper[4755]: > Mar 17 02:05:00 crc kubenswrapper[4755]: I0317 02:05:00.263941 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40279090-267f-429e-b7d8-6b96b785302e" path="/var/lib/kubelet/pods/40279090-267f-429e-b7d8-6b96b785302e/volumes" Mar 17 02:05:02 crc kubenswrapper[4755]: I0317 02:05:02.248955 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:05:02 crc kubenswrapper[4755]: E0317 02:05:02.249865 4755 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:05:08 crc kubenswrapper[4755]: I0317 02:05:08.657849 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ln495" Mar 17 02:05:08 crc kubenswrapper[4755]: I0317 02:05:08.719791 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ln495" Mar 17 02:05:08 crc kubenswrapper[4755]: I0317 02:05:08.904501 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ln495"] Mar 17 02:05:10 crc kubenswrapper[4755]: I0317 02:05:10.177413 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ln495" podUID="711f51a1-99cb-4b6b-aa3e-28a0244bb70b" containerName="registry-server" containerID="cri-o://85a49550c7ea4c94ed04be7405b5e59a7c28857d189c90085e8d1f2b8d5c538a" gracePeriod=2 Mar 17 02:05:10 crc kubenswrapper[4755]: I0317 02:05:10.869321 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ln495" Mar 17 02:05:10 crc kubenswrapper[4755]: I0317 02:05:10.901669 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-utilities" (OuterVolumeSpecName: "utilities") pod "711f51a1-99cb-4b6b-aa3e-28a0244bb70b" (UID: "711f51a1-99cb-4b6b-aa3e-28a0244bb70b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:05:10 crc kubenswrapper[4755]: I0317 02:05:10.901916 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-utilities\") pod \"711f51a1-99cb-4b6b-aa3e-28a0244bb70b\" (UID: \"711f51a1-99cb-4b6b-aa3e-28a0244bb70b\") " Mar 17 02:05:10 crc kubenswrapper[4755]: I0317 02:05:10.902140 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkd2l\" (UniqueName: \"kubernetes.io/projected/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-kube-api-access-pkd2l\") pod \"711f51a1-99cb-4b6b-aa3e-28a0244bb70b\" (UID: \"711f51a1-99cb-4b6b-aa3e-28a0244bb70b\") " Mar 17 02:05:10 crc kubenswrapper[4755]: I0317 02:05:10.902388 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-catalog-content\") pod \"711f51a1-99cb-4b6b-aa3e-28a0244bb70b\" (UID: \"711f51a1-99cb-4b6b-aa3e-28a0244bb70b\") " Mar 17 02:05:10 crc kubenswrapper[4755]: I0317 02:05:10.903061 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:05:10 crc kubenswrapper[4755]: I0317 02:05:10.915025 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-kube-api-access-pkd2l" (OuterVolumeSpecName: "kube-api-access-pkd2l") pod "711f51a1-99cb-4b6b-aa3e-28a0244bb70b" (UID: "711f51a1-99cb-4b6b-aa3e-28a0244bb70b"). InnerVolumeSpecName "kube-api-access-pkd2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:05:10 crc kubenswrapper[4755]: I0317 02:05:10.941392 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "711f51a1-99cb-4b6b-aa3e-28a0244bb70b" (UID: "711f51a1-99cb-4b6b-aa3e-28a0244bb70b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:05:11 crc kubenswrapper[4755]: I0317 02:05:11.006469 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:05:11 crc kubenswrapper[4755]: I0317 02:05:11.006504 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkd2l\" (UniqueName: \"kubernetes.io/projected/711f51a1-99cb-4b6b-aa3e-28a0244bb70b-kube-api-access-pkd2l\") on node \"crc\" DevicePath \"\"" Mar 17 02:05:11 crc kubenswrapper[4755]: I0317 02:05:11.193219 4755 generic.go:334] "Generic (PLEG): container finished" podID="711f51a1-99cb-4b6b-aa3e-28a0244bb70b" containerID="85a49550c7ea4c94ed04be7405b5e59a7c28857d189c90085e8d1f2b8d5c538a" exitCode=0 Mar 17 02:05:11 crc kubenswrapper[4755]: I0317 02:05:11.193260 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ln495" event={"ID":"711f51a1-99cb-4b6b-aa3e-28a0244bb70b","Type":"ContainerDied","Data":"85a49550c7ea4c94ed04be7405b5e59a7c28857d189c90085e8d1f2b8d5c538a"} Mar 17 02:05:11 crc kubenswrapper[4755]: I0317 02:05:11.193286 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ln495" event={"ID":"711f51a1-99cb-4b6b-aa3e-28a0244bb70b","Type":"ContainerDied","Data":"cf4d8e21e4ddc7b808fec15a3168325bb11ea1698d6724f4fc5309fe588f436b"} Mar 17 02:05:11 crc kubenswrapper[4755]: I0317 
02:05:11.193302 4755 scope.go:117] "RemoveContainer" containerID="85a49550c7ea4c94ed04be7405b5e59a7c28857d189c90085e8d1f2b8d5c538a" Mar 17 02:05:11 crc kubenswrapper[4755]: I0317 02:05:11.193389 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ln495" Mar 17 02:05:11 crc kubenswrapper[4755]: I0317 02:05:11.233058 4755 scope.go:117] "RemoveContainer" containerID="17256325f9ab5a2e332d8255264a1c8e4c750c660554fe03d2a33293d961b120" Mar 17 02:05:11 crc kubenswrapper[4755]: I0317 02:05:11.257501 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ln495"] Mar 17 02:05:11 crc kubenswrapper[4755]: I0317 02:05:11.266318 4755 scope.go:117] "RemoveContainer" containerID="b4500f380ec42788f4f8d5eb7eca915ced3489d339497df6bcbed13bac9699c4" Mar 17 02:05:11 crc kubenswrapper[4755]: I0317 02:05:11.269173 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ln495"] Mar 17 02:05:11 crc kubenswrapper[4755]: I0317 02:05:11.332076 4755 scope.go:117] "RemoveContainer" containerID="85a49550c7ea4c94ed04be7405b5e59a7c28857d189c90085e8d1f2b8d5c538a" Mar 17 02:05:11 crc kubenswrapper[4755]: E0317 02:05:11.332603 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85a49550c7ea4c94ed04be7405b5e59a7c28857d189c90085e8d1f2b8d5c538a\": container with ID starting with 85a49550c7ea4c94ed04be7405b5e59a7c28857d189c90085e8d1f2b8d5c538a not found: ID does not exist" containerID="85a49550c7ea4c94ed04be7405b5e59a7c28857d189c90085e8d1f2b8d5c538a" Mar 17 02:05:11 crc kubenswrapper[4755]: I0317 02:05:11.332650 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85a49550c7ea4c94ed04be7405b5e59a7c28857d189c90085e8d1f2b8d5c538a"} err="failed to get container status 
\"85a49550c7ea4c94ed04be7405b5e59a7c28857d189c90085e8d1f2b8d5c538a\": rpc error: code = NotFound desc = could not find container \"85a49550c7ea4c94ed04be7405b5e59a7c28857d189c90085e8d1f2b8d5c538a\": container with ID starting with 85a49550c7ea4c94ed04be7405b5e59a7c28857d189c90085e8d1f2b8d5c538a not found: ID does not exist" Mar 17 02:05:11 crc kubenswrapper[4755]: I0317 02:05:11.332678 4755 scope.go:117] "RemoveContainer" containerID="17256325f9ab5a2e332d8255264a1c8e4c750c660554fe03d2a33293d961b120" Mar 17 02:05:11 crc kubenswrapper[4755]: E0317 02:05:11.333057 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17256325f9ab5a2e332d8255264a1c8e4c750c660554fe03d2a33293d961b120\": container with ID starting with 17256325f9ab5a2e332d8255264a1c8e4c750c660554fe03d2a33293d961b120 not found: ID does not exist" containerID="17256325f9ab5a2e332d8255264a1c8e4c750c660554fe03d2a33293d961b120" Mar 17 02:05:11 crc kubenswrapper[4755]: I0317 02:05:11.333131 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17256325f9ab5a2e332d8255264a1c8e4c750c660554fe03d2a33293d961b120"} err="failed to get container status \"17256325f9ab5a2e332d8255264a1c8e4c750c660554fe03d2a33293d961b120\": rpc error: code = NotFound desc = could not find container \"17256325f9ab5a2e332d8255264a1c8e4c750c660554fe03d2a33293d961b120\": container with ID starting with 17256325f9ab5a2e332d8255264a1c8e4c750c660554fe03d2a33293d961b120 not found: ID does not exist" Mar 17 02:05:11 crc kubenswrapper[4755]: I0317 02:05:11.333173 4755 scope.go:117] "RemoveContainer" containerID="b4500f380ec42788f4f8d5eb7eca915ced3489d339497df6bcbed13bac9699c4" Mar 17 02:05:11 crc kubenswrapper[4755]: E0317 02:05:11.333479 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b4500f380ec42788f4f8d5eb7eca915ced3489d339497df6bcbed13bac9699c4\": container with ID starting with b4500f380ec42788f4f8d5eb7eca915ced3489d339497df6bcbed13bac9699c4 not found: ID does not exist" containerID="b4500f380ec42788f4f8d5eb7eca915ced3489d339497df6bcbed13bac9699c4" Mar 17 02:05:11 crc kubenswrapper[4755]: I0317 02:05:11.333500 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4500f380ec42788f4f8d5eb7eca915ced3489d339497df6bcbed13bac9699c4"} err="failed to get container status \"b4500f380ec42788f4f8d5eb7eca915ced3489d339497df6bcbed13bac9699c4\": rpc error: code = NotFound desc = could not find container \"b4500f380ec42788f4f8d5eb7eca915ced3489d339497df6bcbed13bac9699c4\": container with ID starting with b4500f380ec42788f4f8d5eb7eca915ced3489d339497df6bcbed13bac9699c4 not found: ID does not exist" Mar 17 02:05:12 crc kubenswrapper[4755]: I0317 02:05:12.264378 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="711f51a1-99cb-4b6b-aa3e-28a0244bb70b" path="/var/lib/kubelet/pods/711f51a1-99cb-4b6b-aa3e-28a0244bb70b/volumes" Mar 17 02:05:14 crc kubenswrapper[4755]: I0317 02:05:14.248791 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:05:14 crc kubenswrapper[4755]: E0317 02:05:14.250505 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:05:26 crc kubenswrapper[4755]: I0317 02:05:26.265259 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:05:26 crc 
kubenswrapper[4755]: E0317 02:05:26.266553 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:05:37 crc kubenswrapper[4755]: I0317 02:05:37.248637 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:05:37 crc kubenswrapper[4755]: E0317 02:05:37.249776 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:05:51 crc kubenswrapper[4755]: I0317 02:05:51.248927 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:05:51 crc kubenswrapper[4755]: E0317 02:05:51.249732 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.171871 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561886-kr77g"] Mar 17 
02:06:00 crc kubenswrapper[4755]: E0317 02:06:00.172846 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711f51a1-99cb-4b6b-aa3e-28a0244bb70b" containerName="extract-content" Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.172861 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="711f51a1-99cb-4b6b-aa3e-28a0244bb70b" containerName="extract-content" Mar 17 02:06:00 crc kubenswrapper[4755]: E0317 02:06:00.172884 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40279090-267f-429e-b7d8-6b96b785302e" containerName="extract-content" Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.172894 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="40279090-267f-429e-b7d8-6b96b785302e" containerName="extract-content" Mar 17 02:06:00 crc kubenswrapper[4755]: E0317 02:06:00.172914 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40279090-267f-429e-b7d8-6b96b785302e" containerName="registry-server" Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.172922 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="40279090-267f-429e-b7d8-6b96b785302e" containerName="registry-server" Mar 17 02:06:00 crc kubenswrapper[4755]: E0317 02:06:00.172939 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40279090-267f-429e-b7d8-6b96b785302e" containerName="extract-utilities" Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.172947 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="40279090-267f-429e-b7d8-6b96b785302e" containerName="extract-utilities" Mar 17 02:06:00 crc kubenswrapper[4755]: E0317 02:06:00.172964 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711f51a1-99cb-4b6b-aa3e-28a0244bb70b" containerName="registry-server" Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.172972 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="711f51a1-99cb-4b6b-aa3e-28a0244bb70b" containerName="registry-server" Mar 17 02:06:00 
crc kubenswrapper[4755]: E0317 02:06:00.172982 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711f51a1-99cb-4b6b-aa3e-28a0244bb70b" containerName="extract-utilities" Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.172990 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="711f51a1-99cb-4b6b-aa3e-28a0244bb70b" containerName="extract-utilities" Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.173228 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="711f51a1-99cb-4b6b-aa3e-28a0244bb70b" containerName="registry-server" Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.173251 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="40279090-267f-429e-b7d8-6b96b785302e" containerName="registry-server" Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.174304 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561886-kr77g" Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.177524 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.177679 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.177910 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.184965 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561886-kr77g"] Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.233867 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7t8l\" (UniqueName: \"kubernetes.io/projected/20600ea8-196c-4d9b-9a44-69bbe24b80d6-kube-api-access-d7t8l\") pod 
\"auto-csr-approver-29561886-kr77g\" (UID: \"20600ea8-196c-4d9b-9a44-69bbe24b80d6\") " pod="openshift-infra/auto-csr-approver-29561886-kr77g" Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.336463 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7t8l\" (UniqueName: \"kubernetes.io/projected/20600ea8-196c-4d9b-9a44-69bbe24b80d6-kube-api-access-d7t8l\") pod \"auto-csr-approver-29561886-kr77g\" (UID: \"20600ea8-196c-4d9b-9a44-69bbe24b80d6\") " pod="openshift-infra/auto-csr-approver-29561886-kr77g" Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.379000 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7t8l\" (UniqueName: \"kubernetes.io/projected/20600ea8-196c-4d9b-9a44-69bbe24b80d6-kube-api-access-d7t8l\") pod \"auto-csr-approver-29561886-kr77g\" (UID: \"20600ea8-196c-4d9b-9a44-69bbe24b80d6\") " pod="openshift-infra/auto-csr-approver-29561886-kr77g" Mar 17 02:06:00 crc kubenswrapper[4755]: I0317 02:06:00.493188 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561886-kr77g" Mar 17 02:06:01 crc kubenswrapper[4755]: I0317 02:06:01.049983 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561886-kr77g"] Mar 17 02:06:01 crc kubenswrapper[4755]: W0317 02:06:01.061466 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20600ea8_196c_4d9b_9a44_69bbe24b80d6.slice/crio-2ba30a3aec48c4ab281ce97140977b0c0e1d2e0e68d1931cdf648a7b409313d7 WatchSource:0}: Error finding container 2ba30a3aec48c4ab281ce97140977b0c0e1d2e0e68d1931cdf648a7b409313d7: Status 404 returned error can't find the container with id 2ba30a3aec48c4ab281ce97140977b0c0e1d2e0e68d1931cdf648a7b409313d7 Mar 17 02:06:01 crc kubenswrapper[4755]: I0317 02:06:01.875200 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561886-kr77g" event={"ID":"20600ea8-196c-4d9b-9a44-69bbe24b80d6","Type":"ContainerStarted","Data":"2ba30a3aec48c4ab281ce97140977b0c0e1d2e0e68d1931cdf648a7b409313d7"} Mar 17 02:06:02 crc kubenswrapper[4755]: I0317 02:06:02.886777 4755 generic.go:334] "Generic (PLEG): container finished" podID="20600ea8-196c-4d9b-9a44-69bbe24b80d6" containerID="d085afa52cfd68068d6d537a2fa29d597ba880b03f7a11ea714bc0839f3a86e7" exitCode=0 Mar 17 02:06:02 crc kubenswrapper[4755]: I0317 02:06:02.887017 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561886-kr77g" event={"ID":"20600ea8-196c-4d9b-9a44-69bbe24b80d6","Type":"ContainerDied","Data":"d085afa52cfd68068d6d537a2fa29d597ba880b03f7a11ea714bc0839f3a86e7"} Mar 17 02:06:04 crc kubenswrapper[4755]: I0317 02:06:04.449816 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561886-kr77g" Mar 17 02:06:04 crc kubenswrapper[4755]: I0317 02:06:04.548855 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7t8l\" (UniqueName: \"kubernetes.io/projected/20600ea8-196c-4d9b-9a44-69bbe24b80d6-kube-api-access-d7t8l\") pod \"20600ea8-196c-4d9b-9a44-69bbe24b80d6\" (UID: \"20600ea8-196c-4d9b-9a44-69bbe24b80d6\") " Mar 17 02:06:04 crc kubenswrapper[4755]: I0317 02:06:04.557976 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20600ea8-196c-4d9b-9a44-69bbe24b80d6-kube-api-access-d7t8l" (OuterVolumeSpecName: "kube-api-access-d7t8l") pod "20600ea8-196c-4d9b-9a44-69bbe24b80d6" (UID: "20600ea8-196c-4d9b-9a44-69bbe24b80d6"). InnerVolumeSpecName "kube-api-access-d7t8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:06:04 crc kubenswrapper[4755]: I0317 02:06:04.652289 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7t8l\" (UniqueName: \"kubernetes.io/projected/20600ea8-196c-4d9b-9a44-69bbe24b80d6-kube-api-access-d7t8l\") on node \"crc\" DevicePath \"\"" Mar 17 02:06:04 crc kubenswrapper[4755]: I0317 02:06:04.916197 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561886-kr77g" event={"ID":"20600ea8-196c-4d9b-9a44-69bbe24b80d6","Type":"ContainerDied","Data":"2ba30a3aec48c4ab281ce97140977b0c0e1d2e0e68d1931cdf648a7b409313d7"} Mar 17 02:06:04 crc kubenswrapper[4755]: I0317 02:06:04.916284 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561886-kr77g" Mar 17 02:06:04 crc kubenswrapper[4755]: I0317 02:06:04.916301 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ba30a3aec48c4ab281ce97140977b0c0e1d2e0e68d1931cdf648a7b409313d7" Mar 17 02:06:05 crc kubenswrapper[4755]: I0317 02:06:05.546817 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561880-qc4ld"] Mar 17 02:06:05 crc kubenswrapper[4755]: I0317 02:06:05.556791 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561880-qc4ld"] Mar 17 02:06:06 crc kubenswrapper[4755]: I0317 02:06:06.255860 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:06:06 crc kubenswrapper[4755]: E0317 02:06:06.256388 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:06:06 crc kubenswrapper[4755]: I0317 02:06:06.272841 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2d16a08-f9f1-4643-ae69-4f51a6ddc316" path="/var/lib/kubelet/pods/c2d16a08-f9f1-4643-ae69-4f51a6ddc316/volumes" Mar 17 02:06:18 crc kubenswrapper[4755]: I0317 02:06:18.248772 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:06:18 crc kubenswrapper[4755]: E0317 02:06:18.249807 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:06:18 crc kubenswrapper[4755]: I0317 02:06:18.904090 4755 scope.go:117] "RemoveContainer" containerID="c28f5cbaf29665dbe4b1d572db9c796cee7d87059badd7207cc57a4503723093" Mar 17 02:06:33 crc kubenswrapper[4755]: I0317 02:06:33.249006 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:06:34 crc kubenswrapper[4755]: I0317 02:06:34.276770 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"154aabea7aae51dbcb66f68c199f22f0661e62f9ea3ce4a9745deed080c2af66"} Mar 17 02:08:00 crc kubenswrapper[4755]: I0317 02:08:00.152829 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561888-jtgqr"] Mar 17 02:08:00 crc kubenswrapper[4755]: E0317 02:08:00.154419 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20600ea8-196c-4d9b-9a44-69bbe24b80d6" containerName="oc" Mar 17 02:08:00 crc kubenswrapper[4755]: I0317 02:08:00.154493 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="20600ea8-196c-4d9b-9a44-69bbe24b80d6" containerName="oc" Mar 17 02:08:00 crc kubenswrapper[4755]: I0317 02:08:00.155052 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="20600ea8-196c-4d9b-9a44-69bbe24b80d6" containerName="oc" Mar 17 02:08:00 crc kubenswrapper[4755]: I0317 02:08:00.156646 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561888-jtgqr" Mar 17 02:08:00 crc kubenswrapper[4755]: I0317 02:08:00.189879 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:08:00 crc kubenswrapper[4755]: I0317 02:08:00.189990 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:08:00 crc kubenswrapper[4755]: I0317 02:08:00.189997 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:08:00 crc kubenswrapper[4755]: I0317 02:08:00.198547 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561888-jtgqr"] Mar 17 02:08:00 crc kubenswrapper[4755]: I0317 02:08:00.324634 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfndz\" (UniqueName: \"kubernetes.io/projected/ca93d3a0-f76a-46b1-a82a-8be8d6210d90-kube-api-access-rfndz\") pod \"auto-csr-approver-29561888-jtgqr\" (UID: \"ca93d3a0-f76a-46b1-a82a-8be8d6210d90\") " pod="openshift-infra/auto-csr-approver-29561888-jtgqr" Mar 17 02:08:00 crc kubenswrapper[4755]: I0317 02:08:00.427551 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfndz\" (UniqueName: \"kubernetes.io/projected/ca93d3a0-f76a-46b1-a82a-8be8d6210d90-kube-api-access-rfndz\") pod \"auto-csr-approver-29561888-jtgqr\" (UID: \"ca93d3a0-f76a-46b1-a82a-8be8d6210d90\") " pod="openshift-infra/auto-csr-approver-29561888-jtgqr" Mar 17 02:08:00 crc kubenswrapper[4755]: I0317 02:08:00.447765 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfndz\" (UniqueName: \"kubernetes.io/projected/ca93d3a0-f76a-46b1-a82a-8be8d6210d90-kube-api-access-rfndz\") pod \"auto-csr-approver-29561888-jtgqr\" (UID: \"ca93d3a0-f76a-46b1-a82a-8be8d6210d90\") " 
pod="openshift-infra/auto-csr-approver-29561888-jtgqr" Mar 17 02:08:00 crc kubenswrapper[4755]: I0317 02:08:00.506833 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561888-jtgqr" Mar 17 02:08:01 crc kubenswrapper[4755]: I0317 02:08:01.030241 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561888-jtgqr"] Mar 17 02:08:01 crc kubenswrapper[4755]: I0317 02:08:01.379195 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561888-jtgqr" event={"ID":"ca93d3a0-f76a-46b1-a82a-8be8d6210d90","Type":"ContainerStarted","Data":"c0f8bbf1fbf7f6aadbb44d4ad765a5634cb207b6144a847c7f585e96077db265"} Mar 17 02:08:03 crc kubenswrapper[4755]: I0317 02:08:03.415388 4755 generic.go:334] "Generic (PLEG): container finished" podID="ca93d3a0-f76a-46b1-a82a-8be8d6210d90" containerID="f8641fc3a2d6dd11c4ceced17e31d6a6e17a98a89a544ddd177fbb424fb02a62" exitCode=0 Mar 17 02:08:03 crc kubenswrapper[4755]: I0317 02:08:03.415903 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561888-jtgqr" event={"ID":"ca93d3a0-f76a-46b1-a82a-8be8d6210d90","Type":"ContainerDied","Data":"f8641fc3a2d6dd11c4ceced17e31d6a6e17a98a89a544ddd177fbb424fb02a62"} Mar 17 02:08:04 crc kubenswrapper[4755]: I0317 02:08:04.917626 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561888-jtgqr" Mar 17 02:08:04 crc kubenswrapper[4755]: I0317 02:08:04.963186 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfndz\" (UniqueName: \"kubernetes.io/projected/ca93d3a0-f76a-46b1-a82a-8be8d6210d90-kube-api-access-rfndz\") pod \"ca93d3a0-f76a-46b1-a82a-8be8d6210d90\" (UID: \"ca93d3a0-f76a-46b1-a82a-8be8d6210d90\") " Mar 17 02:08:04 crc kubenswrapper[4755]: I0317 02:08:04.981714 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca93d3a0-f76a-46b1-a82a-8be8d6210d90-kube-api-access-rfndz" (OuterVolumeSpecName: "kube-api-access-rfndz") pod "ca93d3a0-f76a-46b1-a82a-8be8d6210d90" (UID: "ca93d3a0-f76a-46b1-a82a-8be8d6210d90"). InnerVolumeSpecName "kube-api-access-rfndz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:08:05 crc kubenswrapper[4755]: I0317 02:08:05.066163 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfndz\" (UniqueName: \"kubernetes.io/projected/ca93d3a0-f76a-46b1-a82a-8be8d6210d90-kube-api-access-rfndz\") on node \"crc\" DevicePath \"\"" Mar 17 02:08:05 crc kubenswrapper[4755]: I0317 02:08:05.446102 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561888-jtgqr" event={"ID":"ca93d3a0-f76a-46b1-a82a-8be8d6210d90","Type":"ContainerDied","Data":"c0f8bbf1fbf7f6aadbb44d4ad765a5634cb207b6144a847c7f585e96077db265"} Mar 17 02:08:05 crc kubenswrapper[4755]: I0317 02:08:05.446149 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0f8bbf1fbf7f6aadbb44d4ad765a5634cb207b6144a847c7f585e96077db265" Mar 17 02:08:05 crc kubenswrapper[4755]: I0317 02:08:05.446179 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561888-jtgqr" Mar 17 02:08:05 crc kubenswrapper[4755]: I0317 02:08:05.997408 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561882-c7lxm"] Mar 17 02:08:06 crc kubenswrapper[4755]: I0317 02:08:06.009971 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561882-c7lxm"] Mar 17 02:08:06 crc kubenswrapper[4755]: I0317 02:08:06.294315 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6e5a5f5-5a38-4a3a-b882-a13045c27110" path="/var/lib/kubelet/pods/f6e5a5f5-5a38-4a3a-b882-a13045c27110/volumes" Mar 17 02:08:10 crc kubenswrapper[4755]: I0317 02:08:10.565481 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4xgxq"] Mar 17 02:08:10 crc kubenswrapper[4755]: E0317 02:08:10.567111 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca93d3a0-f76a-46b1-a82a-8be8d6210d90" containerName="oc" Mar 17 02:08:10 crc kubenswrapper[4755]: I0317 02:08:10.567129 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca93d3a0-f76a-46b1-a82a-8be8d6210d90" containerName="oc" Mar 17 02:08:10 crc kubenswrapper[4755]: I0317 02:08:10.567383 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca93d3a0-f76a-46b1-a82a-8be8d6210d90" containerName="oc" Mar 17 02:08:10 crc kubenswrapper[4755]: I0317 02:08:10.571213 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4xgxq" Mar 17 02:08:10 crc kubenswrapper[4755]: I0317 02:08:10.585916 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4xgxq"] Mar 17 02:08:10 crc kubenswrapper[4755]: I0317 02:08:10.615609 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8ddg\" (UniqueName: \"kubernetes.io/projected/30c7719a-2835-48a7-a0d7-6fc05b2f0e99-kube-api-access-d8ddg\") pod \"community-operators-4xgxq\" (UID: \"30c7719a-2835-48a7-a0d7-6fc05b2f0e99\") " pod="openshift-marketplace/community-operators-4xgxq" Mar 17 02:08:10 crc kubenswrapper[4755]: I0317 02:08:10.616060 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30c7719a-2835-48a7-a0d7-6fc05b2f0e99-utilities\") pod \"community-operators-4xgxq\" (UID: \"30c7719a-2835-48a7-a0d7-6fc05b2f0e99\") " pod="openshift-marketplace/community-operators-4xgxq" Mar 17 02:08:10 crc kubenswrapper[4755]: I0317 02:08:10.616313 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30c7719a-2835-48a7-a0d7-6fc05b2f0e99-catalog-content\") pod \"community-operators-4xgxq\" (UID: \"30c7719a-2835-48a7-a0d7-6fc05b2f0e99\") " pod="openshift-marketplace/community-operators-4xgxq" Mar 17 02:08:10 crc kubenswrapper[4755]: I0317 02:08:10.717829 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8ddg\" (UniqueName: \"kubernetes.io/projected/30c7719a-2835-48a7-a0d7-6fc05b2f0e99-kube-api-access-d8ddg\") pod \"community-operators-4xgxq\" (UID: \"30c7719a-2835-48a7-a0d7-6fc05b2f0e99\") " pod="openshift-marketplace/community-operators-4xgxq" Mar 17 02:08:10 crc kubenswrapper[4755]: I0317 02:08:10.717932 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30c7719a-2835-48a7-a0d7-6fc05b2f0e99-utilities\") pod \"community-operators-4xgxq\" (UID: \"30c7719a-2835-48a7-a0d7-6fc05b2f0e99\") " pod="openshift-marketplace/community-operators-4xgxq" Mar 17 02:08:10 crc kubenswrapper[4755]: I0317 02:08:10.718013 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30c7719a-2835-48a7-a0d7-6fc05b2f0e99-catalog-content\") pod \"community-operators-4xgxq\" (UID: \"30c7719a-2835-48a7-a0d7-6fc05b2f0e99\") " pod="openshift-marketplace/community-operators-4xgxq" Mar 17 02:08:10 crc kubenswrapper[4755]: I0317 02:08:10.718492 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30c7719a-2835-48a7-a0d7-6fc05b2f0e99-catalog-content\") pod \"community-operators-4xgxq\" (UID: \"30c7719a-2835-48a7-a0d7-6fc05b2f0e99\") " pod="openshift-marketplace/community-operators-4xgxq" Mar 17 02:08:10 crc kubenswrapper[4755]: I0317 02:08:10.718542 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30c7719a-2835-48a7-a0d7-6fc05b2f0e99-utilities\") pod \"community-operators-4xgxq\" (UID: \"30c7719a-2835-48a7-a0d7-6fc05b2f0e99\") " pod="openshift-marketplace/community-operators-4xgxq" Mar 17 02:08:10 crc kubenswrapper[4755]: I0317 02:08:10.736855 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8ddg\" (UniqueName: \"kubernetes.io/projected/30c7719a-2835-48a7-a0d7-6fc05b2f0e99-kube-api-access-d8ddg\") pod \"community-operators-4xgxq\" (UID: \"30c7719a-2835-48a7-a0d7-6fc05b2f0e99\") " pod="openshift-marketplace/community-operators-4xgxq" Mar 17 02:08:10 crc kubenswrapper[4755]: I0317 02:08:10.890262 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4xgxq" Mar 17 02:08:11 crc kubenswrapper[4755]: I0317 02:08:11.539622 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4xgxq"] Mar 17 02:08:12 crc kubenswrapper[4755]: I0317 02:08:12.516426 4755 generic.go:334] "Generic (PLEG): container finished" podID="30c7719a-2835-48a7-a0d7-6fc05b2f0e99" containerID="639863a9cc8e3b26c610cab00eac3530858d9d2afb918e706a3249a387a07e06" exitCode=0 Mar 17 02:08:12 crc kubenswrapper[4755]: I0317 02:08:12.517607 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xgxq" event={"ID":"30c7719a-2835-48a7-a0d7-6fc05b2f0e99","Type":"ContainerDied","Data":"639863a9cc8e3b26c610cab00eac3530858d9d2afb918e706a3249a387a07e06"} Mar 17 02:08:12 crc kubenswrapper[4755]: I0317 02:08:12.517994 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xgxq" event={"ID":"30c7719a-2835-48a7-a0d7-6fc05b2f0e99","Type":"ContainerStarted","Data":"9433da08eb8384537bb2f733e0e4b94c6799f6dbd229ce9176da82abcca5a98d"} Mar 17 02:08:19 crc kubenswrapper[4755]: I0317 02:08:19.048491 4755 scope.go:117] "RemoveContainer" containerID="48e8343704e8f9decd51d4fd1dcc371abcf1629189c6e7e8d04f6097c47f939f" Mar 17 02:08:19 crc kubenswrapper[4755]: I0317 02:08:19.620204 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xgxq" event={"ID":"30c7719a-2835-48a7-a0d7-6fc05b2f0e99","Type":"ContainerStarted","Data":"67d2fea89e699baf568b162fbe1e003390f13c7256a348c265ae19726e696f1a"} Mar 17 02:08:20 crc kubenswrapper[4755]: I0317 02:08:20.633512 4755 generic.go:334] "Generic (PLEG): container finished" podID="30c7719a-2835-48a7-a0d7-6fc05b2f0e99" containerID="67d2fea89e699baf568b162fbe1e003390f13c7256a348c265ae19726e696f1a" exitCode=0 Mar 17 02:08:20 crc kubenswrapper[4755]: I0317 02:08:20.633613 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xgxq" event={"ID":"30c7719a-2835-48a7-a0d7-6fc05b2f0e99","Type":"ContainerDied","Data":"67d2fea89e699baf568b162fbe1e003390f13c7256a348c265ae19726e696f1a"} Mar 17 02:08:21 crc kubenswrapper[4755]: I0317 02:08:21.651507 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xgxq" event={"ID":"30c7719a-2835-48a7-a0d7-6fc05b2f0e99","Type":"ContainerStarted","Data":"964d2a7017194a08a7a9afe099a25e07b1f92b881c93021f39716390f1a99b2d"} Mar 17 02:08:21 crc kubenswrapper[4755]: I0317 02:08:21.677459 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4xgxq" podStartSLOduration=2.997009376 podStartE2EDuration="11.677424996s" podCreationTimestamp="2026-03-17 02:08:10 +0000 UTC" firstStartedPulling="2026-03-17 02:08:12.519794752 +0000 UTC m=+6367.279247035" lastFinishedPulling="2026-03-17 02:08:21.200210372 +0000 UTC m=+6375.959662655" observedRunningTime="2026-03-17 02:08:21.675143635 +0000 UTC m=+6376.434595928" watchObservedRunningTime="2026-03-17 02:08:21.677424996 +0000 UTC m=+6376.436877289" Mar 17 02:08:30 crc kubenswrapper[4755]: I0317 02:08:30.891078 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4xgxq" Mar 17 02:08:30 crc kubenswrapper[4755]: I0317 02:08:30.893162 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4xgxq" Mar 17 02:08:30 crc kubenswrapper[4755]: I0317 02:08:30.966633 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4xgxq" Mar 17 02:08:31 crc kubenswrapper[4755]: I0317 02:08:31.833178 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4xgxq" Mar 17 02:08:31 crc kubenswrapper[4755]: 
I0317 02:08:31.930755 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4xgxq"] Mar 17 02:08:31 crc kubenswrapper[4755]: I0317 02:08:31.971181 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pv6h6"] Mar 17 02:08:31 crc kubenswrapper[4755]: I0317 02:08:31.971396 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pv6h6" podUID="858527df-ec56-4a19-b64f-735380803797" containerName="registry-server" containerID="cri-o://a0fd6a1b016965c1e76c9020787ef2cc07c6585fca45be99329b9be2a6e199e2" gracePeriod=2 Mar 17 02:08:32 crc kubenswrapper[4755]: I0317 02:08:32.783605 4755 generic.go:334] "Generic (PLEG): container finished" podID="858527df-ec56-4a19-b64f-735380803797" containerID="a0fd6a1b016965c1e76c9020787ef2cc07c6585fca45be99329b9be2a6e199e2" exitCode=0 Mar 17 02:08:32 crc kubenswrapper[4755]: I0317 02:08:32.783673 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pv6h6" event={"ID":"858527df-ec56-4a19-b64f-735380803797","Type":"ContainerDied","Data":"a0fd6a1b016965c1e76c9020787ef2cc07c6585fca45be99329b9be2a6e199e2"} Mar 17 02:08:32 crc kubenswrapper[4755]: I0317 02:08:32.783906 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pv6h6" event={"ID":"858527df-ec56-4a19-b64f-735380803797","Type":"ContainerDied","Data":"825472bcbf41b959569efe7f0fddcb91f50606fc529bf2c3ab808eb12dc2578c"} Mar 17 02:08:32 crc kubenswrapper[4755]: I0317 02:08:32.783932 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="825472bcbf41b959569efe7f0fddcb91f50606fc529bf2c3ab808eb12dc2578c" Mar 17 02:08:32 crc kubenswrapper[4755]: I0317 02:08:32.822022 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pv6h6" Mar 17 02:08:32 crc kubenswrapper[4755]: I0317 02:08:32.910907 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858527df-ec56-4a19-b64f-735380803797-catalog-content\") pod \"858527df-ec56-4a19-b64f-735380803797\" (UID: \"858527df-ec56-4a19-b64f-735380803797\") " Mar 17 02:08:32 crc kubenswrapper[4755]: I0317 02:08:32.911182 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858527df-ec56-4a19-b64f-735380803797-utilities\") pod \"858527df-ec56-4a19-b64f-735380803797\" (UID: \"858527df-ec56-4a19-b64f-735380803797\") " Mar 17 02:08:32 crc kubenswrapper[4755]: I0317 02:08:32.911366 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29jt2\" (UniqueName: \"kubernetes.io/projected/858527df-ec56-4a19-b64f-735380803797-kube-api-access-29jt2\") pod \"858527df-ec56-4a19-b64f-735380803797\" (UID: \"858527df-ec56-4a19-b64f-735380803797\") " Mar 17 02:08:32 crc kubenswrapper[4755]: I0317 02:08:32.913691 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/858527df-ec56-4a19-b64f-735380803797-utilities" (OuterVolumeSpecName: "utilities") pod "858527df-ec56-4a19-b64f-735380803797" (UID: "858527df-ec56-4a19-b64f-735380803797"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:08:32 crc kubenswrapper[4755]: I0317 02:08:32.932151 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/858527df-ec56-4a19-b64f-735380803797-kube-api-access-29jt2" (OuterVolumeSpecName: "kube-api-access-29jt2") pod "858527df-ec56-4a19-b64f-735380803797" (UID: "858527df-ec56-4a19-b64f-735380803797"). InnerVolumeSpecName "kube-api-access-29jt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:08:33 crc kubenswrapper[4755]: I0317 02:08:33.004825 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/858527df-ec56-4a19-b64f-735380803797-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "858527df-ec56-4a19-b64f-735380803797" (UID: "858527df-ec56-4a19-b64f-735380803797"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:08:33 crc kubenswrapper[4755]: I0317 02:08:33.015301 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858527df-ec56-4a19-b64f-735380803797-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:08:33 crc kubenswrapper[4755]: I0317 02:08:33.015331 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858527df-ec56-4a19-b64f-735380803797-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:08:33 crc kubenswrapper[4755]: I0317 02:08:33.015342 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29jt2\" (UniqueName: \"kubernetes.io/projected/858527df-ec56-4a19-b64f-735380803797-kube-api-access-29jt2\") on node \"crc\" DevicePath \"\"" Mar 17 02:08:33 crc kubenswrapper[4755]: I0317 02:08:33.791646 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pv6h6" Mar 17 02:08:33 crc kubenswrapper[4755]: I0317 02:08:33.824029 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pv6h6"] Mar 17 02:08:33 crc kubenswrapper[4755]: I0317 02:08:33.834232 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pv6h6"] Mar 17 02:08:34 crc kubenswrapper[4755]: I0317 02:08:34.271573 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="858527df-ec56-4a19-b64f-735380803797" path="/var/lib/kubelet/pods/858527df-ec56-4a19-b64f-735380803797/volumes" Mar 17 02:08:58 crc kubenswrapper[4755]: I0317 02:08:58.667616 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:08:58 crc kubenswrapper[4755]: I0317 02:08:58.671110 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:09:19 crc kubenswrapper[4755]: I0317 02:09:19.439748 4755 scope.go:117] "RemoveContainer" containerID="dc20771934207323bf31c040fdbae8287ebeab0bb54f0a383c658d1101722142" Mar 17 02:09:19 crc kubenswrapper[4755]: I0317 02:09:19.504105 4755 scope.go:117] "RemoveContainer" containerID="a0fd6a1b016965c1e76c9020787ef2cc07c6585fca45be99329b9be2a6e199e2" Mar 17 02:09:19 crc kubenswrapper[4755]: I0317 02:09:19.541859 4755 scope.go:117] "RemoveContainer" containerID="15df6ca71a2f72ab6fa9564ccae9569d55ae0c8b322d1076c492b29da09f3f7b" Mar 17 02:09:28 crc kubenswrapper[4755]: 
I0317 02:09:28.665678 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:09:28 crc kubenswrapper[4755]: I0317 02:09:28.666298 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:09:58 crc kubenswrapper[4755]: I0317 02:09:58.665745 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:09:58 crc kubenswrapper[4755]: I0317 02:09:58.666385 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:09:58 crc kubenswrapper[4755]: I0317 02:09:58.666475 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 02:09:58 crc kubenswrapper[4755]: I0317 02:09:58.668248 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"154aabea7aae51dbcb66f68c199f22f0661e62f9ea3ce4a9745deed080c2af66"} 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:09:58 crc kubenswrapper[4755]: I0317 02:09:58.668383 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://154aabea7aae51dbcb66f68c199f22f0661e62f9ea3ce4a9745deed080c2af66" gracePeriod=600 Mar 17 02:09:58 crc kubenswrapper[4755]: I0317 02:09:58.934147 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="154aabea7aae51dbcb66f68c199f22f0661e62f9ea3ce4a9745deed080c2af66" exitCode=0 Mar 17 02:09:58 crc kubenswrapper[4755]: I0317 02:09:58.934252 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"154aabea7aae51dbcb66f68c199f22f0661e62f9ea3ce4a9745deed080c2af66"} Mar 17 02:09:58 crc kubenswrapper[4755]: I0317 02:09:58.934458 4755 scope.go:117] "RemoveContainer" containerID="0c50fbd644561a5defa8dfd532c4518490b8868c21e4d855c3d2ea2ec3ba56e8" Mar 17 02:09:59 crc kubenswrapper[4755]: I0317 02:09:59.960672 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3"} Mar 17 02:10:00 crc kubenswrapper[4755]: I0317 02:10:00.152763 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561890-w2fqk"] Mar 17 02:10:00 crc kubenswrapper[4755]: E0317 02:10:00.153282 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="858527df-ec56-4a19-b64f-735380803797" containerName="extract-utilities" Mar 17 02:10:00 crc kubenswrapper[4755]: I0317 02:10:00.153303 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="858527df-ec56-4a19-b64f-735380803797" containerName="extract-utilities" Mar 17 02:10:00 crc kubenswrapper[4755]: E0317 02:10:00.153336 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858527df-ec56-4a19-b64f-735380803797" containerName="registry-server" Mar 17 02:10:00 crc kubenswrapper[4755]: I0317 02:10:00.153344 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="858527df-ec56-4a19-b64f-735380803797" containerName="registry-server" Mar 17 02:10:00 crc kubenswrapper[4755]: E0317 02:10:00.153387 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858527df-ec56-4a19-b64f-735380803797" containerName="extract-content" Mar 17 02:10:00 crc kubenswrapper[4755]: I0317 02:10:00.153395 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="858527df-ec56-4a19-b64f-735380803797" containerName="extract-content" Mar 17 02:10:00 crc kubenswrapper[4755]: I0317 02:10:00.153680 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="858527df-ec56-4a19-b64f-735380803797" containerName="registry-server" Mar 17 02:10:00 crc kubenswrapper[4755]: I0317 02:10:00.155762 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561890-w2fqk" Mar 17 02:10:00 crc kubenswrapper[4755]: I0317 02:10:00.157910 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:10:00 crc kubenswrapper[4755]: I0317 02:10:00.159538 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:10:00 crc kubenswrapper[4755]: I0317 02:10:00.160382 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:10:00 crc kubenswrapper[4755]: I0317 02:10:00.169451 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561890-w2fqk"] Mar 17 02:10:00 crc kubenswrapper[4755]: I0317 02:10:00.258054 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnc28\" (UniqueName: \"kubernetes.io/projected/83f0fb4d-f819-4bf2-993f-330ffcdad086-kube-api-access-hnc28\") pod \"auto-csr-approver-29561890-w2fqk\" (UID: \"83f0fb4d-f819-4bf2-993f-330ffcdad086\") " pod="openshift-infra/auto-csr-approver-29561890-w2fqk" Mar 17 02:10:00 crc kubenswrapper[4755]: I0317 02:10:00.362176 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnc28\" (UniqueName: \"kubernetes.io/projected/83f0fb4d-f819-4bf2-993f-330ffcdad086-kube-api-access-hnc28\") pod \"auto-csr-approver-29561890-w2fqk\" (UID: \"83f0fb4d-f819-4bf2-993f-330ffcdad086\") " pod="openshift-infra/auto-csr-approver-29561890-w2fqk" Mar 17 02:10:00 crc kubenswrapper[4755]: I0317 02:10:00.384729 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnc28\" (UniqueName: \"kubernetes.io/projected/83f0fb4d-f819-4bf2-993f-330ffcdad086-kube-api-access-hnc28\") pod \"auto-csr-approver-29561890-w2fqk\" (UID: \"83f0fb4d-f819-4bf2-993f-330ffcdad086\") " 
pod="openshift-infra/auto-csr-approver-29561890-w2fqk" Mar 17 02:10:00 crc kubenswrapper[4755]: I0317 02:10:00.485222 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561890-w2fqk" Mar 17 02:10:00 crc kubenswrapper[4755]: I0317 02:10:00.982406 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561890-w2fqk"] Mar 17 02:10:00 crc kubenswrapper[4755]: I0317 02:10:00.995990 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 02:10:01 crc kubenswrapper[4755]: I0317 02:10:01.982221 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561890-w2fqk" event={"ID":"83f0fb4d-f819-4bf2-993f-330ffcdad086","Type":"ContainerStarted","Data":"c3ffc0e21c700c9129a3b066d2d228ad8bfb0eeb0bf6779e54f595212af3d2ef"} Mar 17 02:10:03 crc kubenswrapper[4755]: I0317 02:10:03.009717 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561890-w2fqk" event={"ID":"83f0fb4d-f819-4bf2-993f-330ffcdad086","Type":"ContainerStarted","Data":"c7ed3698593aae5f609d8f6d38b402eafb6e89d50da2d707671e05719c6af4d0"} Mar 17 02:10:03 crc kubenswrapper[4755]: I0317 02:10:03.038153 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561890-w2fqk" podStartSLOduration=1.6913538909999999 podStartE2EDuration="3.038128173s" podCreationTimestamp="2026-03-17 02:10:00 +0000 UTC" firstStartedPulling="2026-03-17 02:10:00.99515714 +0000 UTC m=+6475.754609433" lastFinishedPulling="2026-03-17 02:10:02.341931412 +0000 UTC m=+6477.101383715" observedRunningTime="2026-03-17 02:10:03.027481296 +0000 UTC m=+6477.786933589" watchObservedRunningTime="2026-03-17 02:10:03.038128173 +0000 UTC m=+6477.797580466" Mar 17 02:10:04 crc kubenswrapper[4755]: I0317 02:10:04.025019 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="83f0fb4d-f819-4bf2-993f-330ffcdad086" containerID="c7ed3698593aae5f609d8f6d38b402eafb6e89d50da2d707671e05719c6af4d0" exitCode=0 Mar 17 02:10:04 crc kubenswrapper[4755]: I0317 02:10:04.025073 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561890-w2fqk" event={"ID":"83f0fb4d-f819-4bf2-993f-330ffcdad086","Type":"ContainerDied","Data":"c7ed3698593aae5f609d8f6d38b402eafb6e89d50da2d707671e05719c6af4d0"} Mar 17 02:10:05 crc kubenswrapper[4755]: I0317 02:10:05.601249 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561890-w2fqk" Mar 17 02:10:05 crc kubenswrapper[4755]: I0317 02:10:05.715664 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnc28\" (UniqueName: \"kubernetes.io/projected/83f0fb4d-f819-4bf2-993f-330ffcdad086-kube-api-access-hnc28\") pod \"83f0fb4d-f819-4bf2-993f-330ffcdad086\" (UID: \"83f0fb4d-f819-4bf2-993f-330ffcdad086\") " Mar 17 02:10:05 crc kubenswrapper[4755]: I0317 02:10:05.740833 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f0fb4d-f819-4bf2-993f-330ffcdad086-kube-api-access-hnc28" (OuterVolumeSpecName: "kube-api-access-hnc28") pod "83f0fb4d-f819-4bf2-993f-330ffcdad086" (UID: "83f0fb4d-f819-4bf2-993f-330ffcdad086"). InnerVolumeSpecName "kube-api-access-hnc28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:10:05 crc kubenswrapper[4755]: I0317 02:10:05.818954 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnc28\" (UniqueName: \"kubernetes.io/projected/83f0fb4d-f819-4bf2-993f-330ffcdad086-kube-api-access-hnc28\") on node \"crc\" DevicePath \"\"" Mar 17 02:10:06 crc kubenswrapper[4755]: I0317 02:10:06.062292 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561890-w2fqk" event={"ID":"83f0fb4d-f819-4bf2-993f-330ffcdad086","Type":"ContainerDied","Data":"c3ffc0e21c700c9129a3b066d2d228ad8bfb0eeb0bf6779e54f595212af3d2ef"} Mar 17 02:10:06 crc kubenswrapper[4755]: I0317 02:10:06.062349 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3ffc0e21c700c9129a3b066d2d228ad8bfb0eeb0bf6779e54f595212af3d2ef" Mar 17 02:10:06 crc kubenswrapper[4755]: I0317 02:10:06.062468 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561890-w2fqk" Mar 17 02:10:06 crc kubenswrapper[4755]: I0317 02:10:06.117521 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561884-kqm5w"] Mar 17 02:10:06 crc kubenswrapper[4755]: I0317 02:10:06.127832 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561884-kqm5w"] Mar 17 02:10:06 crc kubenswrapper[4755]: I0317 02:10:06.279031 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db6aa5b5-b2a0-49b9-bc43-96971c49e39e" path="/var/lib/kubelet/pods/db6aa5b5-b2a0-49b9-bc43-96971c49e39e/volumes" Mar 17 02:10:19 crc kubenswrapper[4755]: I0317 02:10:19.631926 4755 scope.go:117] "RemoveContainer" containerID="adf3874ac3881dc28eff51caf7e92876b6ca9d483ce37aef855615ef6a5b3ddf" Mar 17 02:11:16 crc kubenswrapper[4755]: I0317 02:11:16.688839 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-lcxkt"] Mar 17 02:11:16 crc kubenswrapper[4755]: E0317 02:11:16.692251 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f0fb4d-f819-4bf2-993f-330ffcdad086" containerName="oc" Mar 17 02:11:16 crc kubenswrapper[4755]: I0317 02:11:16.692425 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f0fb4d-f819-4bf2-993f-330ffcdad086" containerName="oc" Mar 17 02:11:16 crc kubenswrapper[4755]: I0317 02:11:16.693282 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="83f0fb4d-f819-4bf2-993f-330ffcdad086" containerName="oc" Mar 17 02:11:16 crc kubenswrapper[4755]: I0317 02:11:16.696346 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lcxkt" Mar 17 02:11:16 crc kubenswrapper[4755]: I0317 02:11:16.726544 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lcxkt"] Mar 17 02:11:16 crc kubenswrapper[4755]: I0317 02:11:16.768883 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-utilities\") pod \"redhat-operators-lcxkt\" (UID: \"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54\") " pod="openshift-marketplace/redhat-operators-lcxkt" Mar 17 02:11:16 crc kubenswrapper[4755]: I0317 02:11:16.769082 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-catalog-content\") pod \"redhat-operators-lcxkt\" (UID: \"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54\") " pod="openshift-marketplace/redhat-operators-lcxkt" Mar 17 02:11:16 crc kubenswrapper[4755]: I0317 02:11:16.769152 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjdsd\" (UniqueName: 
\"kubernetes.io/projected/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-kube-api-access-fjdsd\") pod \"redhat-operators-lcxkt\" (UID: \"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54\") " pod="openshift-marketplace/redhat-operators-lcxkt" Mar 17 02:11:16 crc kubenswrapper[4755]: I0317 02:11:16.871046 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-catalog-content\") pod \"redhat-operators-lcxkt\" (UID: \"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54\") " pod="openshift-marketplace/redhat-operators-lcxkt" Mar 17 02:11:16 crc kubenswrapper[4755]: I0317 02:11:16.871112 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjdsd\" (UniqueName: \"kubernetes.io/projected/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-kube-api-access-fjdsd\") pod \"redhat-operators-lcxkt\" (UID: \"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54\") " pod="openshift-marketplace/redhat-operators-lcxkt" Mar 17 02:11:16 crc kubenswrapper[4755]: I0317 02:11:16.871242 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-utilities\") pod \"redhat-operators-lcxkt\" (UID: \"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54\") " pod="openshift-marketplace/redhat-operators-lcxkt" Mar 17 02:11:16 crc kubenswrapper[4755]: I0317 02:11:16.871694 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-utilities\") pod \"redhat-operators-lcxkt\" (UID: \"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54\") " pod="openshift-marketplace/redhat-operators-lcxkt" Mar 17 02:11:16 crc kubenswrapper[4755]: I0317 02:11:16.871697 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-catalog-content\") pod \"redhat-operators-lcxkt\" (UID: \"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54\") " pod="openshift-marketplace/redhat-operators-lcxkt" Mar 17 02:11:16 crc kubenswrapper[4755]: I0317 02:11:16.891320 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjdsd\" (UniqueName: \"kubernetes.io/projected/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-kube-api-access-fjdsd\") pod \"redhat-operators-lcxkt\" (UID: \"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54\") " pod="openshift-marketplace/redhat-operators-lcxkt" Mar 17 02:11:17 crc kubenswrapper[4755]: I0317 02:11:17.035854 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lcxkt" Mar 17 02:11:17 crc kubenswrapper[4755]: I0317 02:11:17.558897 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lcxkt"] Mar 17 02:11:17 crc kubenswrapper[4755]: I0317 02:11:17.943509 4755 generic.go:334] "Generic (PLEG): container finished" podID="abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" containerID="8017f3abb9562a74271d38f95ce5bbd2277662cf5c89853862c0926804da8ad3" exitCode=0 Mar 17 02:11:17 crc kubenswrapper[4755]: I0317 02:11:17.943573 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcxkt" event={"ID":"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54","Type":"ContainerDied","Data":"8017f3abb9562a74271d38f95ce5bbd2277662cf5c89853862c0926804da8ad3"} Mar 17 02:11:17 crc kubenswrapper[4755]: I0317 02:11:17.943807 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcxkt" event={"ID":"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54","Type":"ContainerStarted","Data":"6912929f7c8350da7d2a68a8890112afd17ca2fd84840d863e9827bb8b720f79"} Mar 17 02:11:19 crc kubenswrapper[4755]: I0317 02:11:19.980547 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-lcxkt" event={"ID":"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54","Type":"ContainerStarted","Data":"a8a28b57edba402842d95309bc51bd00a9ca41277cd00d7458b6bcc2f1834fdd"} Mar 17 02:11:26 crc kubenswrapper[4755]: I0317 02:11:26.532265 4755 generic.go:334] "Generic (PLEG): container finished" podID="abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" containerID="a8a28b57edba402842d95309bc51bd00a9ca41277cd00d7458b6bcc2f1834fdd" exitCode=0 Mar 17 02:11:26 crc kubenswrapper[4755]: I0317 02:11:26.532525 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcxkt" event={"ID":"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54","Type":"ContainerDied","Data":"a8a28b57edba402842d95309bc51bd00a9ca41277cd00d7458b6bcc2f1834fdd"} Mar 17 02:11:27 crc kubenswrapper[4755]: I0317 02:11:27.551134 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcxkt" event={"ID":"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54","Type":"ContainerStarted","Data":"b6a67d287bf6087589b6dfddd449adcfc3aadafd02ed26e08f87ae7a02e22d70"} Mar 17 02:11:27 crc kubenswrapper[4755]: I0317 02:11:27.571953 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lcxkt" podStartSLOduration=2.335510631 podStartE2EDuration="11.571924021s" podCreationTimestamp="2026-03-17 02:11:16 +0000 UTC" firstStartedPulling="2026-03-17 02:11:17.947224027 +0000 UTC m=+6552.706676310" lastFinishedPulling="2026-03-17 02:11:27.183637417 +0000 UTC m=+6561.943089700" observedRunningTime="2026-03-17 02:11:27.568474929 +0000 UTC m=+6562.327927212" watchObservedRunningTime="2026-03-17 02:11:27.571924021 +0000 UTC m=+6562.331376334" Mar 17 02:11:37 crc kubenswrapper[4755]: I0317 02:11:37.035970 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lcxkt" Mar 17 02:11:37 crc kubenswrapper[4755]: I0317 02:11:37.036855 4755 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lcxkt" Mar 17 02:11:38 crc kubenswrapper[4755]: I0317 02:11:38.129295 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lcxkt" podUID="abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" containerName="registry-server" probeResult="failure" output=< Mar 17 02:11:38 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:11:38 crc kubenswrapper[4755]: > Mar 17 02:11:48 crc kubenswrapper[4755]: I0317 02:11:48.139118 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lcxkt" podUID="abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" containerName="registry-server" probeResult="failure" output=< Mar 17 02:11:48 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:11:48 crc kubenswrapper[4755]: > Mar 17 02:11:58 crc kubenswrapper[4755]: I0317 02:11:58.183997 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lcxkt" podUID="abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" containerName="registry-server" probeResult="failure" output=< Mar 17 02:11:58 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:11:58 crc kubenswrapper[4755]: > Mar 17 02:12:00 crc kubenswrapper[4755]: I0317 02:12:00.160826 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561892-vwwc8"] Mar 17 02:12:00 crc kubenswrapper[4755]: I0317 02:12:00.162500 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561892-vwwc8" Mar 17 02:12:00 crc kubenswrapper[4755]: I0317 02:12:00.177267 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561892-vwwc8"] Mar 17 02:12:00 crc kubenswrapper[4755]: I0317 02:12:00.196783 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:12:00 crc kubenswrapper[4755]: I0317 02:12:00.197074 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:12:00 crc kubenswrapper[4755]: I0317 02:12:00.197313 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:12:00 crc kubenswrapper[4755]: I0317 02:12:00.204798 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztj9x\" (UniqueName: \"kubernetes.io/projected/49078958-034c-4e30-bf7a-8a8d455564dc-kube-api-access-ztj9x\") pod \"auto-csr-approver-29561892-vwwc8\" (UID: \"49078958-034c-4e30-bf7a-8a8d455564dc\") " pod="openshift-infra/auto-csr-approver-29561892-vwwc8" Mar 17 02:12:00 crc kubenswrapper[4755]: I0317 02:12:00.307623 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztj9x\" (UniqueName: \"kubernetes.io/projected/49078958-034c-4e30-bf7a-8a8d455564dc-kube-api-access-ztj9x\") pod \"auto-csr-approver-29561892-vwwc8\" (UID: \"49078958-034c-4e30-bf7a-8a8d455564dc\") " pod="openshift-infra/auto-csr-approver-29561892-vwwc8" Mar 17 02:12:00 crc kubenswrapper[4755]: I0317 02:12:00.327719 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztj9x\" (UniqueName: \"kubernetes.io/projected/49078958-034c-4e30-bf7a-8a8d455564dc-kube-api-access-ztj9x\") pod \"auto-csr-approver-29561892-vwwc8\" (UID: \"49078958-034c-4e30-bf7a-8a8d455564dc\") " 
pod="openshift-infra/auto-csr-approver-29561892-vwwc8" Mar 17 02:12:00 crc kubenswrapper[4755]: I0317 02:12:00.510469 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561892-vwwc8" Mar 17 02:12:01 crc kubenswrapper[4755]: I0317 02:12:01.019147 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561892-vwwc8"] Mar 17 02:12:01 crc kubenswrapper[4755]: I0317 02:12:01.956322 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561892-vwwc8" event={"ID":"49078958-034c-4e30-bf7a-8a8d455564dc","Type":"ContainerStarted","Data":"3811d1655ecf1c07eaaeaff990d6feb8fe630fe91a8df324a324786cfbe1022c"} Mar 17 02:12:02 crc kubenswrapper[4755]: I0317 02:12:02.968471 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561892-vwwc8" event={"ID":"49078958-034c-4e30-bf7a-8a8d455564dc","Type":"ContainerDied","Data":"4686ccb3ff9344edd2ca51666db01c40d785711eb3cb2c000b7e4dbdc8ea2a77"} Mar 17 02:12:02 crc kubenswrapper[4755]: I0317 02:12:02.968477 4755 generic.go:334] "Generic (PLEG): container finished" podID="49078958-034c-4e30-bf7a-8a8d455564dc" containerID="4686ccb3ff9344edd2ca51666db01c40d785711eb3cb2c000b7e4dbdc8ea2a77" exitCode=0 Mar 17 02:12:04 crc kubenswrapper[4755]: I0317 02:12:04.403125 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561892-vwwc8" Mar 17 02:12:04 crc kubenswrapper[4755]: I0317 02:12:04.510535 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztj9x\" (UniqueName: \"kubernetes.io/projected/49078958-034c-4e30-bf7a-8a8d455564dc-kube-api-access-ztj9x\") pod \"49078958-034c-4e30-bf7a-8a8d455564dc\" (UID: \"49078958-034c-4e30-bf7a-8a8d455564dc\") " Mar 17 02:12:04 crc kubenswrapper[4755]: I0317 02:12:04.516793 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49078958-034c-4e30-bf7a-8a8d455564dc-kube-api-access-ztj9x" (OuterVolumeSpecName: "kube-api-access-ztj9x") pod "49078958-034c-4e30-bf7a-8a8d455564dc" (UID: "49078958-034c-4e30-bf7a-8a8d455564dc"). InnerVolumeSpecName "kube-api-access-ztj9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:12:04 crc kubenswrapper[4755]: I0317 02:12:04.615420 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztj9x\" (UniqueName: \"kubernetes.io/projected/49078958-034c-4e30-bf7a-8a8d455564dc-kube-api-access-ztj9x\") on node \"crc\" DevicePath \"\"" Mar 17 02:12:04 crc kubenswrapper[4755]: I0317 02:12:04.996420 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561892-vwwc8" event={"ID":"49078958-034c-4e30-bf7a-8a8d455564dc","Type":"ContainerDied","Data":"3811d1655ecf1c07eaaeaff990d6feb8fe630fe91a8df324a324786cfbe1022c"} Mar 17 02:12:04 crc kubenswrapper[4755]: I0317 02:12:04.996497 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3811d1655ecf1c07eaaeaff990d6feb8fe630fe91a8df324a324786cfbe1022c" Mar 17 02:12:04 crc kubenswrapper[4755]: I0317 02:12:04.996581 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561892-vwwc8" Mar 17 02:12:05 crc kubenswrapper[4755]: I0317 02:12:05.491803 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561886-kr77g"] Mar 17 02:12:05 crc kubenswrapper[4755]: I0317 02:12:05.502711 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561886-kr77g"] Mar 17 02:12:06 crc kubenswrapper[4755]: I0317 02:12:06.274256 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20600ea8-196c-4d9b-9a44-69bbe24b80d6" path="/var/lib/kubelet/pods/20600ea8-196c-4d9b-9a44-69bbe24b80d6/volumes" Mar 17 02:12:07 crc kubenswrapper[4755]: I0317 02:12:07.110628 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lcxkt" Mar 17 02:12:07 crc kubenswrapper[4755]: I0317 02:12:07.164124 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lcxkt" Mar 17 02:12:07 crc kubenswrapper[4755]: I0317 02:12:07.352399 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lcxkt"] Mar 17 02:12:09 crc kubenswrapper[4755]: I0317 02:12:09.035972 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lcxkt" podUID="abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" containerName="registry-server" containerID="cri-o://b6a67d287bf6087589b6dfddd449adcfc3aadafd02ed26e08f87ae7a02e22d70" gracePeriod=2 Mar 17 02:12:09 crc kubenswrapper[4755]: I0317 02:12:09.741578 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lcxkt" Mar 17 02:12:09 crc kubenswrapper[4755]: I0317 02:12:09.843299 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-catalog-content\") pod \"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54\" (UID: \"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54\") " Mar 17 02:12:09 crc kubenswrapper[4755]: I0317 02:12:09.843381 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-utilities\") pod \"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54\" (UID: \"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54\") " Mar 17 02:12:09 crc kubenswrapper[4755]: I0317 02:12:09.843405 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjdsd\" (UniqueName: \"kubernetes.io/projected/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-kube-api-access-fjdsd\") pod \"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54\" (UID: \"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54\") " Mar 17 02:12:09 crc kubenswrapper[4755]: I0317 02:12:09.846801 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-utilities" (OuterVolumeSpecName: "utilities") pod "abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" (UID: "abc952e2-8647-46d2-b8d1-a3f4dcdb2b54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:12:09 crc kubenswrapper[4755]: I0317 02:12:09.851695 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-kube-api-access-fjdsd" (OuterVolumeSpecName: "kube-api-access-fjdsd") pod "abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" (UID: "abc952e2-8647-46d2-b8d1-a3f4dcdb2b54"). InnerVolumeSpecName "kube-api-access-fjdsd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:12:09 crc kubenswrapper[4755]: I0317 02:12:09.947106 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:12:09 crc kubenswrapper[4755]: I0317 02:12:09.947302 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjdsd\" (UniqueName: \"kubernetes.io/projected/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-kube-api-access-fjdsd\") on node \"crc\" DevicePath \"\"" Mar 17 02:12:09 crc kubenswrapper[4755]: I0317 02:12:09.978177 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" (UID: "abc952e2-8647-46d2-b8d1-a3f4dcdb2b54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:12:10 crc kubenswrapper[4755]: I0317 02:12:10.049214 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:12:10 crc kubenswrapper[4755]: I0317 02:12:10.050668 4755 generic.go:334] "Generic (PLEG): container finished" podID="abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" containerID="b6a67d287bf6087589b6dfddd449adcfc3aadafd02ed26e08f87ae7a02e22d70" exitCode=0 Mar 17 02:12:10 crc kubenswrapper[4755]: I0317 02:12:10.050718 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcxkt" event={"ID":"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54","Type":"ContainerDied","Data":"b6a67d287bf6087589b6dfddd449adcfc3aadafd02ed26e08f87ae7a02e22d70"} Mar 17 02:12:10 crc kubenswrapper[4755]: I0317 02:12:10.050757 4755 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-lcxkt" event={"ID":"abc952e2-8647-46d2-b8d1-a3f4dcdb2b54","Type":"ContainerDied","Data":"6912929f7c8350da7d2a68a8890112afd17ca2fd84840d863e9827bb8b720f79"} Mar 17 02:12:10 crc kubenswrapper[4755]: I0317 02:12:10.050787 4755 scope.go:117] "RemoveContainer" containerID="b6a67d287bf6087589b6dfddd449adcfc3aadafd02ed26e08f87ae7a02e22d70" Mar 17 02:12:10 crc kubenswrapper[4755]: I0317 02:12:10.051036 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lcxkt" Mar 17 02:12:10 crc kubenswrapper[4755]: I0317 02:12:10.088392 4755 scope.go:117] "RemoveContainer" containerID="a8a28b57edba402842d95309bc51bd00a9ca41277cd00d7458b6bcc2f1834fdd" Mar 17 02:12:10 crc kubenswrapper[4755]: I0317 02:12:10.096010 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lcxkt"] Mar 17 02:12:10 crc kubenswrapper[4755]: I0317 02:12:10.107824 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lcxkt"] Mar 17 02:12:10 crc kubenswrapper[4755]: I0317 02:12:10.118360 4755 scope.go:117] "RemoveContainer" containerID="8017f3abb9562a74271d38f95ce5bbd2277662cf5c89853862c0926804da8ad3" Mar 17 02:12:10 crc kubenswrapper[4755]: I0317 02:12:10.170655 4755 scope.go:117] "RemoveContainer" containerID="b6a67d287bf6087589b6dfddd449adcfc3aadafd02ed26e08f87ae7a02e22d70" Mar 17 02:12:10 crc kubenswrapper[4755]: E0317 02:12:10.175347 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6a67d287bf6087589b6dfddd449adcfc3aadafd02ed26e08f87ae7a02e22d70\": container with ID starting with b6a67d287bf6087589b6dfddd449adcfc3aadafd02ed26e08f87ae7a02e22d70 not found: ID does not exist" containerID="b6a67d287bf6087589b6dfddd449adcfc3aadafd02ed26e08f87ae7a02e22d70" Mar 17 02:12:10 crc kubenswrapper[4755]: I0317 02:12:10.175403 4755 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6a67d287bf6087589b6dfddd449adcfc3aadafd02ed26e08f87ae7a02e22d70"} err="failed to get container status \"b6a67d287bf6087589b6dfddd449adcfc3aadafd02ed26e08f87ae7a02e22d70\": rpc error: code = NotFound desc = could not find container \"b6a67d287bf6087589b6dfddd449adcfc3aadafd02ed26e08f87ae7a02e22d70\": container with ID starting with b6a67d287bf6087589b6dfddd449adcfc3aadafd02ed26e08f87ae7a02e22d70 not found: ID does not exist" Mar 17 02:12:10 crc kubenswrapper[4755]: I0317 02:12:10.175467 4755 scope.go:117] "RemoveContainer" containerID="a8a28b57edba402842d95309bc51bd00a9ca41277cd00d7458b6bcc2f1834fdd" Mar 17 02:12:10 crc kubenswrapper[4755]: E0317 02:12:10.176017 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8a28b57edba402842d95309bc51bd00a9ca41277cd00d7458b6bcc2f1834fdd\": container with ID starting with a8a28b57edba402842d95309bc51bd00a9ca41277cd00d7458b6bcc2f1834fdd not found: ID does not exist" containerID="a8a28b57edba402842d95309bc51bd00a9ca41277cd00d7458b6bcc2f1834fdd" Mar 17 02:12:10 crc kubenswrapper[4755]: I0317 02:12:10.176195 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a28b57edba402842d95309bc51bd00a9ca41277cd00d7458b6bcc2f1834fdd"} err="failed to get container status \"a8a28b57edba402842d95309bc51bd00a9ca41277cd00d7458b6bcc2f1834fdd\": rpc error: code = NotFound desc = could not find container \"a8a28b57edba402842d95309bc51bd00a9ca41277cd00d7458b6bcc2f1834fdd\": container with ID starting with a8a28b57edba402842d95309bc51bd00a9ca41277cd00d7458b6bcc2f1834fdd not found: ID does not exist" Mar 17 02:12:10 crc kubenswrapper[4755]: I0317 02:12:10.176355 4755 scope.go:117] "RemoveContainer" containerID="8017f3abb9562a74271d38f95ce5bbd2277662cf5c89853862c0926804da8ad3" Mar 17 02:12:10 crc kubenswrapper[4755]: E0317 
02:12:10.176913 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8017f3abb9562a74271d38f95ce5bbd2277662cf5c89853862c0926804da8ad3\": container with ID starting with 8017f3abb9562a74271d38f95ce5bbd2277662cf5c89853862c0926804da8ad3 not found: ID does not exist" containerID="8017f3abb9562a74271d38f95ce5bbd2277662cf5c89853862c0926804da8ad3" Mar 17 02:12:10 crc kubenswrapper[4755]: I0317 02:12:10.176959 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8017f3abb9562a74271d38f95ce5bbd2277662cf5c89853862c0926804da8ad3"} err="failed to get container status \"8017f3abb9562a74271d38f95ce5bbd2277662cf5c89853862c0926804da8ad3\": rpc error: code = NotFound desc = could not find container \"8017f3abb9562a74271d38f95ce5bbd2277662cf5c89853862c0926804da8ad3\": container with ID starting with 8017f3abb9562a74271d38f95ce5bbd2277662cf5c89853862c0926804da8ad3 not found: ID does not exist" Mar 17 02:12:10 crc kubenswrapper[4755]: I0317 02:12:10.273722 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" path="/var/lib/kubelet/pods/abc952e2-8647-46d2-b8d1-a3f4dcdb2b54/volumes" Mar 17 02:12:19 crc kubenswrapper[4755]: I0317 02:12:19.771605 4755 scope.go:117] "RemoveContainer" containerID="d085afa52cfd68068d6d537a2fa29d597ba880b03f7a11ea714bc0839f3a86e7" Mar 17 02:12:28 crc kubenswrapper[4755]: I0317 02:12:28.664955 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:12:28 crc kubenswrapper[4755]: I0317 02:12:28.665597 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" 
podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:12:58 crc kubenswrapper[4755]: I0317 02:12:58.665152 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:12:58 crc kubenswrapper[4755]: I0317 02:12:58.665771 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:13:28 crc kubenswrapper[4755]: I0317 02:13:28.666041 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:13:28 crc kubenswrapper[4755]: I0317 02:13:28.666802 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:13:28 crc kubenswrapper[4755]: I0317 02:13:28.666871 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 02:13:28 crc kubenswrapper[4755]: I0317 02:13:28.668073 4755 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:13:28 crc kubenswrapper[4755]: I0317 02:13:28.668157 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" gracePeriod=600 Mar 17 02:13:28 crc kubenswrapper[4755]: E0317 02:13:28.799290 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:13:29 crc kubenswrapper[4755]: I0317 02:13:29.058920 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" exitCode=0 Mar 17 02:13:29 crc kubenswrapper[4755]: I0317 02:13:29.059009 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3"} Mar 17 02:13:29 crc kubenswrapper[4755]: I0317 02:13:29.059546 4755 scope.go:117] "RemoveContainer" 
containerID="154aabea7aae51dbcb66f68c199f22f0661e62f9ea3ce4a9745deed080c2af66" Mar 17 02:13:29 crc kubenswrapper[4755]: I0317 02:13:29.060448 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:13:29 crc kubenswrapper[4755]: E0317 02:13:29.061060 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:13:41 crc kubenswrapper[4755]: I0317 02:13:41.250303 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:13:41 crc kubenswrapper[4755]: E0317 02:13:41.251653 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:13:53 crc kubenswrapper[4755]: I0317 02:13:53.249471 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:13:53 crc kubenswrapper[4755]: E0317 02:13:53.250355 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:14:00 crc kubenswrapper[4755]: I0317 02:14:00.170887 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561894-wbwrx"] Mar 17 02:14:00 crc kubenswrapper[4755]: E0317 02:14:00.171753 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49078958-034c-4e30-bf7a-8a8d455564dc" containerName="oc" Mar 17 02:14:00 crc kubenswrapper[4755]: I0317 02:14:00.171765 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="49078958-034c-4e30-bf7a-8a8d455564dc" containerName="oc" Mar 17 02:14:00 crc kubenswrapper[4755]: E0317 02:14:00.171779 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" containerName="extract-utilities" Mar 17 02:14:00 crc kubenswrapper[4755]: I0317 02:14:00.171786 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" containerName="extract-utilities" Mar 17 02:14:00 crc kubenswrapper[4755]: E0317 02:14:00.171801 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" containerName="extract-content" Mar 17 02:14:00 crc kubenswrapper[4755]: I0317 02:14:00.171808 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" containerName="extract-content" Mar 17 02:14:00 crc kubenswrapper[4755]: E0317 02:14:00.171819 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" containerName="registry-server" Mar 17 02:14:00 crc kubenswrapper[4755]: I0317 02:14:00.171825 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" containerName="registry-server" Mar 17 02:14:00 crc kubenswrapper[4755]: I0317 02:14:00.172060 4755 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="49078958-034c-4e30-bf7a-8a8d455564dc" containerName="oc" Mar 17 02:14:00 crc kubenswrapper[4755]: I0317 02:14:00.172082 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="abc952e2-8647-46d2-b8d1-a3f4dcdb2b54" containerName="registry-server" Mar 17 02:14:00 crc kubenswrapper[4755]: I0317 02:14:00.173002 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561894-wbwrx" Mar 17 02:14:00 crc kubenswrapper[4755]: I0317 02:14:00.177117 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:14:00 crc kubenswrapper[4755]: I0317 02:14:00.177409 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:14:00 crc kubenswrapper[4755]: I0317 02:14:00.178590 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:14:00 crc kubenswrapper[4755]: I0317 02:14:00.198737 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561894-wbwrx"] Mar 17 02:14:00 crc kubenswrapper[4755]: I0317 02:14:00.327225 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clt7f\" (UniqueName: \"kubernetes.io/projected/07e2dbc4-2f19-4595-b789-827ae99a8fde-kube-api-access-clt7f\") pod \"auto-csr-approver-29561894-wbwrx\" (UID: \"07e2dbc4-2f19-4595-b789-827ae99a8fde\") " pod="openshift-infra/auto-csr-approver-29561894-wbwrx" Mar 17 02:14:00 crc kubenswrapper[4755]: I0317 02:14:00.430551 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clt7f\" (UniqueName: \"kubernetes.io/projected/07e2dbc4-2f19-4595-b789-827ae99a8fde-kube-api-access-clt7f\") pod \"auto-csr-approver-29561894-wbwrx\" (UID: \"07e2dbc4-2f19-4595-b789-827ae99a8fde\") " 
pod="openshift-infra/auto-csr-approver-29561894-wbwrx" Mar 17 02:14:00 crc kubenswrapper[4755]: I0317 02:14:00.466184 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clt7f\" (UniqueName: \"kubernetes.io/projected/07e2dbc4-2f19-4595-b789-827ae99a8fde-kube-api-access-clt7f\") pod \"auto-csr-approver-29561894-wbwrx\" (UID: \"07e2dbc4-2f19-4595-b789-827ae99a8fde\") " pod="openshift-infra/auto-csr-approver-29561894-wbwrx" Mar 17 02:14:00 crc kubenswrapper[4755]: I0317 02:14:00.526317 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561894-wbwrx" Mar 17 02:14:01 crc kubenswrapper[4755]: I0317 02:14:01.085519 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561894-wbwrx"] Mar 17 02:14:01 crc kubenswrapper[4755]: I0317 02:14:01.525100 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561894-wbwrx" event={"ID":"07e2dbc4-2f19-4595-b789-827ae99a8fde","Type":"ContainerStarted","Data":"b17312c5200082817e610565c74e077039d83ba476ca594201e7d19dbc096362"} Mar 17 02:14:03 crc kubenswrapper[4755]: I0317 02:14:03.576335 4755 generic.go:334] "Generic (PLEG): container finished" podID="07e2dbc4-2f19-4595-b789-827ae99a8fde" containerID="b28e2b6c968677e9d549a8719247ee8d18b1c5b2c672eca71369e8e332df21ea" exitCode=0 Mar 17 02:14:03 crc kubenswrapper[4755]: I0317 02:14:03.576400 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561894-wbwrx" event={"ID":"07e2dbc4-2f19-4595-b789-827ae99a8fde","Type":"ContainerDied","Data":"b28e2b6c968677e9d549a8719247ee8d18b1c5b2c672eca71369e8e332df21ea"} Mar 17 02:14:05 crc kubenswrapper[4755]: I0317 02:14:05.128149 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561894-wbwrx" Mar 17 02:14:05 crc kubenswrapper[4755]: I0317 02:14:05.297174 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clt7f\" (UniqueName: \"kubernetes.io/projected/07e2dbc4-2f19-4595-b789-827ae99a8fde-kube-api-access-clt7f\") pod \"07e2dbc4-2f19-4595-b789-827ae99a8fde\" (UID: \"07e2dbc4-2f19-4595-b789-827ae99a8fde\") " Mar 17 02:14:05 crc kubenswrapper[4755]: I0317 02:14:05.309832 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07e2dbc4-2f19-4595-b789-827ae99a8fde-kube-api-access-clt7f" (OuterVolumeSpecName: "kube-api-access-clt7f") pod "07e2dbc4-2f19-4595-b789-827ae99a8fde" (UID: "07e2dbc4-2f19-4595-b789-827ae99a8fde"). InnerVolumeSpecName "kube-api-access-clt7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:14:05 crc kubenswrapper[4755]: I0317 02:14:05.402115 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clt7f\" (UniqueName: \"kubernetes.io/projected/07e2dbc4-2f19-4595-b789-827ae99a8fde-kube-api-access-clt7f\") on node \"crc\" DevicePath \"\"" Mar 17 02:14:05 crc kubenswrapper[4755]: I0317 02:14:05.608820 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561894-wbwrx" event={"ID":"07e2dbc4-2f19-4595-b789-827ae99a8fde","Type":"ContainerDied","Data":"b17312c5200082817e610565c74e077039d83ba476ca594201e7d19dbc096362"} Mar 17 02:14:05 crc kubenswrapper[4755]: I0317 02:14:05.608884 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b17312c5200082817e610565c74e077039d83ba476ca594201e7d19dbc096362" Mar 17 02:14:05 crc kubenswrapper[4755]: I0317 02:14:05.609344 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561894-wbwrx" Mar 17 02:14:06 crc kubenswrapper[4755]: I0317 02:14:06.229752 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561888-jtgqr"] Mar 17 02:14:06 crc kubenswrapper[4755]: I0317 02:14:06.244407 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561888-jtgqr"] Mar 17 02:14:06 crc kubenswrapper[4755]: I0317 02:14:06.273108 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca93d3a0-f76a-46b1-a82a-8be8d6210d90" path="/var/lib/kubelet/pods/ca93d3a0-f76a-46b1-a82a-8be8d6210d90/volumes" Mar 17 02:14:07 crc kubenswrapper[4755]: I0317 02:14:07.251427 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:14:07 crc kubenswrapper[4755]: E0317 02:14:07.253600 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:14:19 crc kubenswrapper[4755]: I0317 02:14:19.249068 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:14:19 crc kubenswrapper[4755]: E0317 02:14:19.250297 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" 
podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:14:19 crc kubenswrapper[4755]: I0317 02:14:19.949809 4755 scope.go:117] "RemoveContainer" containerID="f8641fc3a2d6dd11c4ceced17e31d6a6e17a98a89a544ddd177fbb424fb02a62" Mar 17 02:14:33 crc kubenswrapper[4755]: I0317 02:14:33.249645 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:14:33 crc kubenswrapper[4755]: E0317 02:14:33.250922 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:14:48 crc kubenswrapper[4755]: I0317 02:14:48.249795 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:14:48 crc kubenswrapper[4755]: E0317 02:14:48.251123 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:14:59 crc kubenswrapper[4755]: I0317 02:14:59.032213 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m74s2"] Mar 17 02:14:59 crc kubenswrapper[4755]: E0317 02:14:59.033770 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07e2dbc4-2f19-4595-b789-827ae99a8fde" containerName="oc" Mar 17 02:14:59 crc kubenswrapper[4755]: I0317 02:14:59.033791 4755 
state_mem.go:107] "Deleted CPUSet assignment" podUID="07e2dbc4-2f19-4595-b789-827ae99a8fde" containerName="oc" Mar 17 02:14:59 crc kubenswrapper[4755]: I0317 02:14:59.034252 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="07e2dbc4-2f19-4595-b789-827ae99a8fde" containerName="oc" Mar 17 02:14:59 crc kubenswrapper[4755]: I0317 02:14:59.054040 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m74s2" Mar 17 02:14:59 crc kubenswrapper[4755]: I0317 02:14:59.060845 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m74s2"] Mar 17 02:14:59 crc kubenswrapper[4755]: I0317 02:14:59.133790 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqsx6\" (UniqueName: \"kubernetes.io/projected/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-kube-api-access-nqsx6\") pod \"certified-operators-m74s2\" (UID: \"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6\") " pod="openshift-marketplace/certified-operators-m74s2" Mar 17 02:14:59 crc kubenswrapper[4755]: I0317 02:14:59.133832 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-utilities\") pod \"certified-operators-m74s2\" (UID: \"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6\") " pod="openshift-marketplace/certified-operators-m74s2" Mar 17 02:14:59 crc kubenswrapper[4755]: I0317 02:14:59.134076 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-catalog-content\") pod \"certified-operators-m74s2\" (UID: \"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6\") " pod="openshift-marketplace/certified-operators-m74s2" Mar 17 02:14:59 crc kubenswrapper[4755]: I0317 02:14:59.240398 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqsx6\" (UniqueName: \"kubernetes.io/projected/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-kube-api-access-nqsx6\") pod \"certified-operators-m74s2\" (UID: \"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6\") " pod="openshift-marketplace/certified-operators-m74s2" Mar 17 02:14:59 crc kubenswrapper[4755]: I0317 02:14:59.240464 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-utilities\") pod \"certified-operators-m74s2\" (UID: \"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6\") " pod="openshift-marketplace/certified-operators-m74s2" Mar 17 02:14:59 crc kubenswrapper[4755]: I0317 02:14:59.240555 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-catalog-content\") pod \"certified-operators-m74s2\" (UID: \"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6\") " pod="openshift-marketplace/certified-operators-m74s2" Mar 17 02:14:59 crc kubenswrapper[4755]: I0317 02:14:59.241212 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-catalog-content\") pod \"certified-operators-m74s2\" (UID: \"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6\") " pod="openshift-marketplace/certified-operators-m74s2" Mar 17 02:14:59 crc kubenswrapper[4755]: I0317 02:14:59.246813 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-utilities\") pod \"certified-operators-m74s2\" (UID: \"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6\") " pod="openshift-marketplace/certified-operators-m74s2" Mar 17 02:14:59 crc kubenswrapper[4755]: I0317 02:14:59.249966 4755 scope.go:117] "RemoveContainer" 
containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:14:59 crc kubenswrapper[4755]: E0317 02:14:59.250350 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:14:59 crc kubenswrapper[4755]: I0317 02:14:59.267200 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqsx6\" (UniqueName: \"kubernetes.io/projected/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-kube-api-access-nqsx6\") pod \"certified-operators-m74s2\" (UID: \"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6\") " pod="openshift-marketplace/certified-operators-m74s2" Mar 17 02:14:59 crc kubenswrapper[4755]: I0317 02:14:59.380470 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m74s2" Mar 17 02:14:59 crc kubenswrapper[4755]: I0317 02:14:59.913189 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m74s2"] Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.155064 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f"] Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.156669 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.158301 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.159034 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.172255 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f"] Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.260733 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f208726-bc34-4d55-a4e4-986858773314-config-volume\") pod \"collect-profiles-29561895-f2v9f\" (UID: \"1f208726-bc34-4d55-a4e4-986858773314\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.260792 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f208726-bc34-4d55-a4e4-986858773314-secret-volume\") pod \"collect-profiles-29561895-f2v9f\" (UID: \"1f208726-bc34-4d55-a4e4-986858773314\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.260915 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccqds\" (UniqueName: \"kubernetes.io/projected/1f208726-bc34-4d55-a4e4-986858773314-kube-api-access-ccqds\") pod \"collect-profiles-29561895-f2v9f\" (UID: \"1f208726-bc34-4d55-a4e4-986858773314\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.347515 4755 generic.go:334] "Generic (PLEG): container finished" podID="1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6" containerID="e17e073164432a170a152ccef63ef2e9f4fe43a817630d0de4d0c48d7a3cf619" exitCode=0 Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.347555 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m74s2" event={"ID":"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6","Type":"ContainerDied","Data":"e17e073164432a170a152ccef63ef2e9f4fe43a817630d0de4d0c48d7a3cf619"} Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.347579 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m74s2" event={"ID":"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6","Type":"ContainerStarted","Data":"7213ddfe5c368f7c5f7794d94e0d2c6c9da98351f274bbc3f21a8cb94889b697"} Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.363578 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccqds\" (UniqueName: \"kubernetes.io/projected/1f208726-bc34-4d55-a4e4-986858773314-kube-api-access-ccqds\") pod \"collect-profiles-29561895-f2v9f\" (UID: \"1f208726-bc34-4d55-a4e4-986858773314\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.363836 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f208726-bc34-4d55-a4e4-986858773314-config-volume\") pod \"collect-profiles-29561895-f2v9f\" (UID: \"1f208726-bc34-4d55-a4e4-986858773314\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.363884 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/1f208726-bc34-4d55-a4e4-986858773314-secret-volume\") pod \"collect-profiles-29561895-f2v9f\" (UID: \"1f208726-bc34-4d55-a4e4-986858773314\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.365971 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f208726-bc34-4d55-a4e4-986858773314-config-volume\") pod \"collect-profiles-29561895-f2v9f\" (UID: \"1f208726-bc34-4d55-a4e4-986858773314\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.379184 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f208726-bc34-4d55-a4e4-986858773314-secret-volume\") pod \"collect-profiles-29561895-f2v9f\" (UID: \"1f208726-bc34-4d55-a4e4-986858773314\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.394271 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccqds\" (UniqueName: \"kubernetes.io/projected/1f208726-bc34-4d55-a4e4-986858773314-kube-api-access-ccqds\") pod \"collect-profiles-29561895-f2v9f\" (UID: \"1f208726-bc34-4d55-a4e4-986858773314\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" Mar 17 02:15:00 crc kubenswrapper[4755]: I0317 02:15:00.492810 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" Mar 17 02:15:01 crc kubenswrapper[4755]: I0317 02:15:01.067512 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f"] Mar 17 02:15:01 crc kubenswrapper[4755]: I0317 02:15:01.363001 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" event={"ID":"1f208726-bc34-4d55-a4e4-986858773314","Type":"ContainerStarted","Data":"276adc9f41fc541be47ebc2180541f32666b9b17f5da66e752bf39dbffc686df"} Mar 17 02:15:01 crc kubenswrapper[4755]: I0317 02:15:01.363373 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" event={"ID":"1f208726-bc34-4d55-a4e4-986858773314","Type":"ContainerStarted","Data":"535d91ad7dd6f9068ca739fe06567b1d0811c58908f64016c9809ac55b68f355"} Mar 17 02:15:01 crc kubenswrapper[4755]: I0317 02:15:01.366704 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m74s2" event={"ID":"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6","Type":"ContainerStarted","Data":"728e8bc394d5f592bb3dfeaec8be9c6de48cadbb71c60e92c9f42e93ad909144"} Mar 17 02:15:01 crc kubenswrapper[4755]: I0317 02:15:01.389580 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" podStartSLOduration=1.389557616 podStartE2EDuration="1.389557616s" podCreationTimestamp="2026-03-17 02:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 02:15:01.382914118 +0000 UTC m=+6776.142366431" watchObservedRunningTime="2026-03-17 02:15:01.389557616 +0000 UTC m=+6776.149009939" Mar 17 02:15:02 crc kubenswrapper[4755]: E0317 02:15:02.014665 4755 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f208726_bc34_4d55_a4e4_986858773314.slice/crio-276adc9f41fc541be47ebc2180541f32666b9b17f5da66e752bf39dbffc686df.scope\": RecentStats: unable to find data in memory cache]" Mar 17 02:15:02 crc kubenswrapper[4755]: I0317 02:15:02.382396 4755 generic.go:334] "Generic (PLEG): container finished" podID="1f208726-bc34-4d55-a4e4-986858773314" containerID="276adc9f41fc541be47ebc2180541f32666b9b17f5da66e752bf39dbffc686df" exitCode=0 Mar 17 02:15:02 crc kubenswrapper[4755]: I0317 02:15:02.384360 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" event={"ID":"1f208726-bc34-4d55-a4e4-986858773314","Type":"ContainerDied","Data":"276adc9f41fc541be47ebc2180541f32666b9b17f5da66e752bf39dbffc686df"} Mar 17 02:15:03 crc kubenswrapper[4755]: I0317 02:15:03.401563 4755 generic.go:334] "Generic (PLEG): container finished" podID="1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6" containerID="728e8bc394d5f592bb3dfeaec8be9c6de48cadbb71c60e92c9f42e93ad909144" exitCode=0 Mar 17 02:15:03 crc kubenswrapper[4755]: I0317 02:15:03.401693 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m74s2" event={"ID":"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6","Type":"ContainerDied","Data":"728e8bc394d5f592bb3dfeaec8be9c6de48cadbb71c60e92c9f42e93ad909144"} Mar 17 02:15:03 crc kubenswrapper[4755]: I0317 02:15:03.408953 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 02:15:03 crc kubenswrapper[4755]: I0317 02:15:03.895638 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" Mar 17 02:15:03 crc kubenswrapper[4755]: I0317 02:15:03.955418 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f208726-bc34-4d55-a4e4-986858773314-config-volume\") pod \"1f208726-bc34-4d55-a4e4-986858773314\" (UID: \"1f208726-bc34-4d55-a4e4-986858773314\") " Mar 17 02:15:03 crc kubenswrapper[4755]: I0317 02:15:03.955474 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccqds\" (UniqueName: \"kubernetes.io/projected/1f208726-bc34-4d55-a4e4-986858773314-kube-api-access-ccqds\") pod \"1f208726-bc34-4d55-a4e4-986858773314\" (UID: \"1f208726-bc34-4d55-a4e4-986858773314\") " Mar 17 02:15:03 crc kubenswrapper[4755]: I0317 02:15:03.955636 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f208726-bc34-4d55-a4e4-986858773314-secret-volume\") pod \"1f208726-bc34-4d55-a4e4-986858773314\" (UID: \"1f208726-bc34-4d55-a4e4-986858773314\") " Mar 17 02:15:03 crc kubenswrapper[4755]: I0317 02:15:03.956292 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f208726-bc34-4d55-a4e4-986858773314-config-volume" (OuterVolumeSpecName: "config-volume") pod "1f208726-bc34-4d55-a4e4-986858773314" (UID: "1f208726-bc34-4d55-a4e4-986858773314"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 02:15:03 crc kubenswrapper[4755]: I0317 02:15:03.970227 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f208726-bc34-4d55-a4e4-986858773314-kube-api-access-ccqds" (OuterVolumeSpecName: "kube-api-access-ccqds") pod "1f208726-bc34-4d55-a4e4-986858773314" (UID: "1f208726-bc34-4d55-a4e4-986858773314"). 
InnerVolumeSpecName "kube-api-access-ccqds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:15:03 crc kubenswrapper[4755]: I0317 02:15:03.970624 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f208726-bc34-4d55-a4e4-986858773314-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1f208726-bc34-4d55-a4e4-986858773314" (UID: "1f208726-bc34-4d55-a4e4-986858773314"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:15:04 crc kubenswrapper[4755]: I0317 02:15:04.058486 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f208726-bc34-4d55-a4e4-986858773314-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 02:15:04 crc kubenswrapper[4755]: I0317 02:15:04.058528 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccqds\" (UniqueName: \"kubernetes.io/projected/1f208726-bc34-4d55-a4e4-986858773314-kube-api-access-ccqds\") on node \"crc\" DevicePath \"\"" Mar 17 02:15:04 crc kubenswrapper[4755]: I0317 02:15:04.058562 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f208726-bc34-4d55-a4e4-986858773314-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 02:15:04 crc kubenswrapper[4755]: I0317 02:15:04.413686 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m74s2" event={"ID":"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6","Type":"ContainerStarted","Data":"a6801f2ed0b57fd37cc53a7e3a02b2c2b4ebee34f3a514e69882b8fc4cec004b"} Mar 17 02:15:04 crc kubenswrapper[4755]: I0317 02:15:04.415803 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" 
event={"ID":"1f208726-bc34-4d55-a4e4-986858773314","Type":"ContainerDied","Data":"535d91ad7dd6f9068ca739fe06567b1d0811c58908f64016c9809ac55b68f355"} Mar 17 02:15:04 crc kubenswrapper[4755]: I0317 02:15:04.415862 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="535d91ad7dd6f9068ca739fe06567b1d0811c58908f64016c9809ac55b68f355" Mar 17 02:15:04 crc kubenswrapper[4755]: I0317 02:15:04.415975 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561895-f2v9f" Mar 17 02:15:04 crc kubenswrapper[4755]: I0317 02:15:04.445798 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m74s2" podStartSLOduration=2.86476019 podStartE2EDuration="6.445776674s" podCreationTimestamp="2026-03-17 02:14:58 +0000 UTC" firstStartedPulling="2026-03-17 02:15:00.349823458 +0000 UTC m=+6775.109275741" lastFinishedPulling="2026-03-17 02:15:03.930839952 +0000 UTC m=+6778.690292225" observedRunningTime="2026-03-17 02:15:04.436343539 +0000 UTC m=+6779.195795832" watchObservedRunningTime="2026-03-17 02:15:04.445776674 +0000 UTC m=+6779.205228997" Mar 17 02:15:04 crc kubenswrapper[4755]: I0317 02:15:04.486567 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj"] Mar 17 02:15:04 crc kubenswrapper[4755]: I0317 02:15:04.499848 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561850-4x7zj"] Mar 17 02:15:06 crc kubenswrapper[4755]: I0317 02:15:06.264804 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f27dace0-fabc-4954-b257-8c7911349adc" path="/var/lib/kubelet/pods/f27dace0-fabc-4954-b257-8c7911349adc/volumes" Mar 17 02:15:07 crc kubenswrapper[4755]: I0317 02:15:07.517962 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-mkvvp"] Mar 17 02:15:07 crc kubenswrapper[4755]: E0317 02:15:07.519097 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f208726-bc34-4d55-a4e4-986858773314" containerName="collect-profiles" Mar 17 02:15:07 crc kubenswrapper[4755]: I0317 02:15:07.519112 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f208726-bc34-4d55-a4e4-986858773314" containerName="collect-profiles" Mar 17 02:15:07 crc kubenswrapper[4755]: I0317 02:15:07.519338 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f208726-bc34-4d55-a4e4-986858773314" containerName="collect-profiles" Mar 17 02:15:07 crc kubenswrapper[4755]: I0317 02:15:07.520809 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkvvp" Mar 17 02:15:07 crc kubenswrapper[4755]: I0317 02:15:07.537353 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkvvp"] Mar 17 02:15:07 crc kubenswrapper[4755]: I0317 02:15:07.569843 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca03540e-1ec8-4206-b4cf-349fac0bc672-catalog-content\") pod \"redhat-marketplace-mkvvp\" (UID: \"ca03540e-1ec8-4206-b4cf-349fac0bc672\") " pod="openshift-marketplace/redhat-marketplace-mkvvp" Mar 17 02:15:07 crc kubenswrapper[4755]: I0317 02:15:07.570524 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca03540e-1ec8-4206-b4cf-349fac0bc672-utilities\") pod \"redhat-marketplace-mkvvp\" (UID: \"ca03540e-1ec8-4206-b4cf-349fac0bc672\") " pod="openshift-marketplace/redhat-marketplace-mkvvp" Mar 17 02:15:07 crc kubenswrapper[4755]: I0317 02:15:07.570954 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-m92kj\" (UniqueName: \"kubernetes.io/projected/ca03540e-1ec8-4206-b4cf-349fac0bc672-kube-api-access-m92kj\") pod \"redhat-marketplace-mkvvp\" (UID: \"ca03540e-1ec8-4206-b4cf-349fac0bc672\") " pod="openshift-marketplace/redhat-marketplace-mkvvp" Mar 17 02:15:07 crc kubenswrapper[4755]: I0317 02:15:07.672845 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m92kj\" (UniqueName: \"kubernetes.io/projected/ca03540e-1ec8-4206-b4cf-349fac0bc672-kube-api-access-m92kj\") pod \"redhat-marketplace-mkvvp\" (UID: \"ca03540e-1ec8-4206-b4cf-349fac0bc672\") " pod="openshift-marketplace/redhat-marketplace-mkvvp" Mar 17 02:15:07 crc kubenswrapper[4755]: I0317 02:15:07.672910 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca03540e-1ec8-4206-b4cf-349fac0bc672-catalog-content\") pod \"redhat-marketplace-mkvvp\" (UID: \"ca03540e-1ec8-4206-b4cf-349fac0bc672\") " pod="openshift-marketplace/redhat-marketplace-mkvvp" Mar 17 02:15:07 crc kubenswrapper[4755]: I0317 02:15:07.673162 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca03540e-1ec8-4206-b4cf-349fac0bc672-utilities\") pod \"redhat-marketplace-mkvvp\" (UID: \"ca03540e-1ec8-4206-b4cf-349fac0bc672\") " pod="openshift-marketplace/redhat-marketplace-mkvvp" Mar 17 02:15:07 crc kubenswrapper[4755]: I0317 02:15:07.673630 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca03540e-1ec8-4206-b4cf-349fac0bc672-catalog-content\") pod \"redhat-marketplace-mkvvp\" (UID: \"ca03540e-1ec8-4206-b4cf-349fac0bc672\") " pod="openshift-marketplace/redhat-marketplace-mkvvp" Mar 17 02:15:07 crc kubenswrapper[4755]: I0317 02:15:07.673637 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/ca03540e-1ec8-4206-b4cf-349fac0bc672-utilities\") pod \"redhat-marketplace-mkvvp\" (UID: \"ca03540e-1ec8-4206-b4cf-349fac0bc672\") " pod="openshift-marketplace/redhat-marketplace-mkvvp" Mar 17 02:15:07 crc kubenswrapper[4755]: I0317 02:15:07.692762 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m92kj\" (UniqueName: \"kubernetes.io/projected/ca03540e-1ec8-4206-b4cf-349fac0bc672-kube-api-access-m92kj\") pod \"redhat-marketplace-mkvvp\" (UID: \"ca03540e-1ec8-4206-b4cf-349fac0bc672\") " pod="openshift-marketplace/redhat-marketplace-mkvvp" Mar 17 02:15:07 crc kubenswrapper[4755]: I0317 02:15:07.852370 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkvvp" Mar 17 02:15:08 crc kubenswrapper[4755]: W0317 02:15:08.397101 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca03540e_1ec8_4206_b4cf_349fac0bc672.slice/crio-0ddef0e3454c49911db10ca942727f766bb8125d61f331ec73d7dee096bab7ce WatchSource:0}: Error finding container 0ddef0e3454c49911db10ca942727f766bb8125d61f331ec73d7dee096bab7ce: Status 404 returned error can't find the container with id 0ddef0e3454c49911db10ca942727f766bb8125d61f331ec73d7dee096bab7ce Mar 17 02:15:08 crc kubenswrapper[4755]: I0317 02:15:08.397294 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkvvp"] Mar 17 02:15:08 crc kubenswrapper[4755]: I0317 02:15:08.497729 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkvvp" event={"ID":"ca03540e-1ec8-4206-b4cf-349fac0bc672","Type":"ContainerStarted","Data":"0ddef0e3454c49911db10ca942727f766bb8125d61f331ec73d7dee096bab7ce"} Mar 17 02:15:09 crc kubenswrapper[4755]: I0317 02:15:09.380729 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-m74s2" Mar 17 02:15:09 crc kubenswrapper[4755]: I0317 02:15:09.381166 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m74s2" Mar 17 02:15:09 crc kubenswrapper[4755]: I0317 02:15:09.467024 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m74s2" Mar 17 02:15:09 crc kubenswrapper[4755]: I0317 02:15:09.512140 4755 generic.go:334] "Generic (PLEG): container finished" podID="ca03540e-1ec8-4206-b4cf-349fac0bc672" containerID="fbb22fa8cb0cab12a885557b1f627412e08f8d8757ae627211ad99242d63c928" exitCode=0 Mar 17 02:15:09 crc kubenswrapper[4755]: I0317 02:15:09.512257 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkvvp" event={"ID":"ca03540e-1ec8-4206-b4cf-349fac0bc672","Type":"ContainerDied","Data":"fbb22fa8cb0cab12a885557b1f627412e08f8d8757ae627211ad99242d63c928"} Mar 17 02:15:09 crc kubenswrapper[4755]: I0317 02:15:09.604506 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m74s2" Mar 17 02:15:10 crc kubenswrapper[4755]: I0317 02:15:10.521568 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkvvp" event={"ID":"ca03540e-1ec8-4206-b4cf-349fac0bc672","Type":"ContainerStarted","Data":"055681fa37d2e56608852d59c2442c88f7f4d27d0c2b06baf6c94c8f6c60d81f"} Mar 17 02:15:11 crc kubenswrapper[4755]: I0317 02:15:11.878661 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m74s2"] Mar 17 02:15:11 crc kubenswrapper[4755]: I0317 02:15:11.879111 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m74s2" podUID="1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6" containerName="registry-server" 
containerID="cri-o://a6801f2ed0b57fd37cc53a7e3a02b2c2b4ebee34f3a514e69882b8fc4cec004b" gracePeriod=2 Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.252879 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:15:12 crc kubenswrapper[4755]: E0317 02:15:12.253971 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.433106 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m74s2" Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.494855 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-utilities\") pod \"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6\" (UID: \"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6\") " Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.495163 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-catalog-content\") pod \"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6\" (UID: \"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6\") " Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.495294 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqsx6\" (UniqueName: \"kubernetes.io/projected/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-kube-api-access-nqsx6\") pod \"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6\" (UID: 
\"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6\") " Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.500422 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-utilities" (OuterVolumeSpecName: "utilities") pod "1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6" (UID: "1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.505708 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-kube-api-access-nqsx6" (OuterVolumeSpecName: "kube-api-access-nqsx6") pod "1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6" (UID: "1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6"). InnerVolumeSpecName "kube-api-access-nqsx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.547004 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6" (UID: "1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.549805 4755 generic.go:334] "Generic (PLEG): container finished" podID="ca03540e-1ec8-4206-b4cf-349fac0bc672" containerID="055681fa37d2e56608852d59c2442c88f7f4d27d0c2b06baf6c94c8f6c60d81f" exitCode=0 Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.549879 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkvvp" event={"ID":"ca03540e-1ec8-4206-b4cf-349fac0bc672","Type":"ContainerDied","Data":"055681fa37d2e56608852d59c2442c88f7f4d27d0c2b06baf6c94c8f6c60d81f"} Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.553072 4755 generic.go:334] "Generic (PLEG): container finished" podID="1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6" containerID="a6801f2ed0b57fd37cc53a7e3a02b2c2b4ebee34f3a514e69882b8fc4cec004b" exitCode=0 Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.553114 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m74s2" event={"ID":"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6","Type":"ContainerDied","Data":"a6801f2ed0b57fd37cc53a7e3a02b2c2b4ebee34f3a514e69882b8fc4cec004b"} Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.553163 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m74s2" Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.553188 4755 scope.go:117] "RemoveContainer" containerID="a6801f2ed0b57fd37cc53a7e3a02b2c2b4ebee34f3a514e69882b8fc4cec004b" Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.553171 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m74s2" event={"ID":"1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6","Type":"ContainerDied","Data":"7213ddfe5c368f7c5f7794d94e0d2c6c9da98351f274bbc3f21a8cb94889b697"} Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.598515 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.598547 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.598559 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqsx6\" (UniqueName: \"kubernetes.io/projected/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6-kube-api-access-nqsx6\") on node \"crc\" DevicePath \"\"" Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.599094 4755 scope.go:117] "RemoveContainer" containerID="728e8bc394d5f592bb3dfeaec8be9c6de48cadbb71c60e92c9f42e93ad909144" Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.619460 4755 scope.go:117] "RemoveContainer" containerID="e17e073164432a170a152ccef63ef2e9f4fe43a817630d0de4d0c48d7a3cf619" Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.625630 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m74s2"] Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.645839 4755 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m74s2"] Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.676213 4755 scope.go:117] "RemoveContainer" containerID="a6801f2ed0b57fd37cc53a7e3a02b2c2b4ebee34f3a514e69882b8fc4cec004b" Mar 17 02:15:12 crc kubenswrapper[4755]: E0317 02:15:12.676933 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6801f2ed0b57fd37cc53a7e3a02b2c2b4ebee34f3a514e69882b8fc4cec004b\": container with ID starting with a6801f2ed0b57fd37cc53a7e3a02b2c2b4ebee34f3a514e69882b8fc4cec004b not found: ID does not exist" containerID="a6801f2ed0b57fd37cc53a7e3a02b2c2b4ebee34f3a514e69882b8fc4cec004b" Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.676974 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6801f2ed0b57fd37cc53a7e3a02b2c2b4ebee34f3a514e69882b8fc4cec004b"} err="failed to get container status \"a6801f2ed0b57fd37cc53a7e3a02b2c2b4ebee34f3a514e69882b8fc4cec004b\": rpc error: code = NotFound desc = could not find container \"a6801f2ed0b57fd37cc53a7e3a02b2c2b4ebee34f3a514e69882b8fc4cec004b\": container with ID starting with a6801f2ed0b57fd37cc53a7e3a02b2c2b4ebee34f3a514e69882b8fc4cec004b not found: ID does not exist" Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.677001 4755 scope.go:117] "RemoveContainer" containerID="728e8bc394d5f592bb3dfeaec8be9c6de48cadbb71c60e92c9f42e93ad909144" Mar 17 02:15:12 crc kubenswrapper[4755]: E0317 02:15:12.677416 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"728e8bc394d5f592bb3dfeaec8be9c6de48cadbb71c60e92c9f42e93ad909144\": container with ID starting with 728e8bc394d5f592bb3dfeaec8be9c6de48cadbb71c60e92c9f42e93ad909144 not found: ID does not exist" containerID="728e8bc394d5f592bb3dfeaec8be9c6de48cadbb71c60e92c9f42e93ad909144" Mar 17 
02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.677533 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"728e8bc394d5f592bb3dfeaec8be9c6de48cadbb71c60e92c9f42e93ad909144"} err="failed to get container status \"728e8bc394d5f592bb3dfeaec8be9c6de48cadbb71c60e92c9f42e93ad909144\": rpc error: code = NotFound desc = could not find container \"728e8bc394d5f592bb3dfeaec8be9c6de48cadbb71c60e92c9f42e93ad909144\": container with ID starting with 728e8bc394d5f592bb3dfeaec8be9c6de48cadbb71c60e92c9f42e93ad909144 not found: ID does not exist" Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.677556 4755 scope.go:117] "RemoveContainer" containerID="e17e073164432a170a152ccef63ef2e9f4fe43a817630d0de4d0c48d7a3cf619" Mar 17 02:15:12 crc kubenswrapper[4755]: E0317 02:15:12.677903 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e17e073164432a170a152ccef63ef2e9f4fe43a817630d0de4d0c48d7a3cf619\": container with ID starting with e17e073164432a170a152ccef63ef2e9f4fe43a817630d0de4d0c48d7a3cf619 not found: ID does not exist" containerID="e17e073164432a170a152ccef63ef2e9f4fe43a817630d0de4d0c48d7a3cf619" Mar 17 02:15:12 crc kubenswrapper[4755]: I0317 02:15:12.677954 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e17e073164432a170a152ccef63ef2e9f4fe43a817630d0de4d0c48d7a3cf619"} err="failed to get container status \"e17e073164432a170a152ccef63ef2e9f4fe43a817630d0de4d0c48d7a3cf619\": rpc error: code = NotFound desc = could not find container \"e17e073164432a170a152ccef63ef2e9f4fe43a817630d0de4d0c48d7a3cf619\": container with ID starting with e17e073164432a170a152ccef63ef2e9f4fe43a817630d0de4d0c48d7a3cf619 not found: ID does not exist" Mar 17 02:15:13 crc kubenswrapper[4755]: I0317 02:15:13.567595 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkvvp" 
event={"ID":"ca03540e-1ec8-4206-b4cf-349fac0bc672","Type":"ContainerStarted","Data":"853d5e8b4009d60a4ccef7171b615c5e7eb2a21a3ec5868c2868b07d4a2fb153"} Mar 17 02:15:13 crc kubenswrapper[4755]: I0317 02:15:13.613134 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mkvvp" podStartSLOduration=3.151329501 podStartE2EDuration="6.61311223s" podCreationTimestamp="2026-03-17 02:15:07 +0000 UTC" firstStartedPulling="2026-03-17 02:15:09.514584414 +0000 UTC m=+6784.274036707" lastFinishedPulling="2026-03-17 02:15:12.976367113 +0000 UTC m=+6787.735819436" observedRunningTime="2026-03-17 02:15:13.591163728 +0000 UTC m=+6788.350616021" watchObservedRunningTime="2026-03-17 02:15:13.61311223 +0000 UTC m=+6788.372564523" Mar 17 02:15:14 crc kubenswrapper[4755]: I0317 02:15:14.265565 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6" path="/var/lib/kubelet/pods/1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6/volumes" Mar 17 02:15:17 crc kubenswrapper[4755]: I0317 02:15:17.853639 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mkvvp" Mar 17 02:15:17 crc kubenswrapper[4755]: I0317 02:15:17.854314 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mkvvp" Mar 17 02:15:17 crc kubenswrapper[4755]: I0317 02:15:17.938746 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mkvvp" Mar 17 02:15:18 crc kubenswrapper[4755]: I0317 02:15:18.720094 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mkvvp" Mar 17 02:15:18 crc kubenswrapper[4755]: I0317 02:15:18.786758 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkvvp"] Mar 17 02:15:20 crc 
kubenswrapper[4755]: I0317 02:15:20.064071 4755 scope.go:117] "RemoveContainer" containerID="b4907f437e505aaf6fd8cb7800bc64ea926a4385da834c2b5d9e06b4f40803c0" Mar 17 02:15:20 crc kubenswrapper[4755]: I0317 02:15:20.666596 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mkvvp" podUID="ca03540e-1ec8-4206-b4cf-349fac0bc672" containerName="registry-server" containerID="cri-o://853d5e8b4009d60a4ccef7171b615c5e7eb2a21a3ec5868c2868b07d4a2fb153" gracePeriod=2 Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.276301 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkvvp" Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.356468 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca03540e-1ec8-4206-b4cf-349fac0bc672-utilities\") pod \"ca03540e-1ec8-4206-b4cf-349fac0bc672\" (UID: \"ca03540e-1ec8-4206-b4cf-349fac0bc672\") " Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.356531 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m92kj\" (UniqueName: \"kubernetes.io/projected/ca03540e-1ec8-4206-b4cf-349fac0bc672-kube-api-access-m92kj\") pod \"ca03540e-1ec8-4206-b4cf-349fac0bc672\" (UID: \"ca03540e-1ec8-4206-b4cf-349fac0bc672\") " Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.356656 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca03540e-1ec8-4206-b4cf-349fac0bc672-catalog-content\") pod \"ca03540e-1ec8-4206-b4cf-349fac0bc672\" (UID: \"ca03540e-1ec8-4206-b4cf-349fac0bc672\") " Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.357776 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ca03540e-1ec8-4206-b4cf-349fac0bc672-utilities" (OuterVolumeSpecName: "utilities") pod "ca03540e-1ec8-4206-b4cf-349fac0bc672" (UID: "ca03540e-1ec8-4206-b4cf-349fac0bc672"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.362495 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca03540e-1ec8-4206-b4cf-349fac0bc672-kube-api-access-m92kj" (OuterVolumeSpecName: "kube-api-access-m92kj") pod "ca03540e-1ec8-4206-b4cf-349fac0bc672" (UID: "ca03540e-1ec8-4206-b4cf-349fac0bc672"). InnerVolumeSpecName "kube-api-access-m92kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.409095 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca03540e-1ec8-4206-b4cf-349fac0bc672-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca03540e-1ec8-4206-b4cf-349fac0bc672" (UID: "ca03540e-1ec8-4206-b4cf-349fac0bc672"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.459201 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca03540e-1ec8-4206-b4cf-349fac0bc672-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.459244 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca03540e-1ec8-4206-b4cf-349fac0bc672-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.459258 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m92kj\" (UniqueName: \"kubernetes.io/projected/ca03540e-1ec8-4206-b4cf-349fac0bc672-kube-api-access-m92kj\") on node \"crc\" DevicePath \"\"" Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.681014 4755 generic.go:334] "Generic (PLEG): container finished" podID="ca03540e-1ec8-4206-b4cf-349fac0bc672" containerID="853d5e8b4009d60a4ccef7171b615c5e7eb2a21a3ec5868c2868b07d4a2fb153" exitCode=0 Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.681070 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkvvp" event={"ID":"ca03540e-1ec8-4206-b4cf-349fac0bc672","Type":"ContainerDied","Data":"853d5e8b4009d60a4ccef7171b615c5e7eb2a21a3ec5868c2868b07d4a2fb153"} Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.681101 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkvvp" event={"ID":"ca03540e-1ec8-4206-b4cf-349fac0bc672","Type":"ContainerDied","Data":"0ddef0e3454c49911db10ca942727f766bb8125d61f331ec73d7dee096bab7ce"} Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.681121 4755 scope.go:117] "RemoveContainer" containerID="853d5e8b4009d60a4ccef7171b615c5e7eb2a21a3ec5868c2868b07d4a2fb153" Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 
02:15:21.681292 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkvvp" Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.711472 4755 scope.go:117] "RemoveContainer" containerID="055681fa37d2e56608852d59c2442c88f7f4d27d0c2b06baf6c94c8f6c60d81f" Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.726253 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkvvp"] Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.741255 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkvvp"] Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.757709 4755 scope.go:117] "RemoveContainer" containerID="fbb22fa8cb0cab12a885557b1f627412e08f8d8757ae627211ad99242d63c928" Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.815681 4755 scope.go:117] "RemoveContainer" containerID="853d5e8b4009d60a4ccef7171b615c5e7eb2a21a3ec5868c2868b07d4a2fb153" Mar 17 02:15:21 crc kubenswrapper[4755]: E0317 02:15:21.816328 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"853d5e8b4009d60a4ccef7171b615c5e7eb2a21a3ec5868c2868b07d4a2fb153\": container with ID starting with 853d5e8b4009d60a4ccef7171b615c5e7eb2a21a3ec5868c2868b07d4a2fb153 not found: ID does not exist" containerID="853d5e8b4009d60a4ccef7171b615c5e7eb2a21a3ec5868c2868b07d4a2fb153" Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.816365 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"853d5e8b4009d60a4ccef7171b615c5e7eb2a21a3ec5868c2868b07d4a2fb153"} err="failed to get container status \"853d5e8b4009d60a4ccef7171b615c5e7eb2a21a3ec5868c2868b07d4a2fb153\": rpc error: code = NotFound desc = could not find container \"853d5e8b4009d60a4ccef7171b615c5e7eb2a21a3ec5868c2868b07d4a2fb153\": container with ID starting with 
853d5e8b4009d60a4ccef7171b615c5e7eb2a21a3ec5868c2868b07d4a2fb153 not found: ID does not exist" Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.816391 4755 scope.go:117] "RemoveContainer" containerID="055681fa37d2e56608852d59c2442c88f7f4d27d0c2b06baf6c94c8f6c60d81f" Mar 17 02:15:21 crc kubenswrapper[4755]: E0317 02:15:21.816846 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"055681fa37d2e56608852d59c2442c88f7f4d27d0c2b06baf6c94c8f6c60d81f\": container with ID starting with 055681fa37d2e56608852d59c2442c88f7f4d27d0c2b06baf6c94c8f6c60d81f not found: ID does not exist" containerID="055681fa37d2e56608852d59c2442c88f7f4d27d0c2b06baf6c94c8f6c60d81f" Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.816890 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055681fa37d2e56608852d59c2442c88f7f4d27d0c2b06baf6c94c8f6c60d81f"} err="failed to get container status \"055681fa37d2e56608852d59c2442c88f7f4d27d0c2b06baf6c94c8f6c60d81f\": rpc error: code = NotFound desc = could not find container \"055681fa37d2e56608852d59c2442c88f7f4d27d0c2b06baf6c94c8f6c60d81f\": container with ID starting with 055681fa37d2e56608852d59c2442c88f7f4d27d0c2b06baf6c94c8f6c60d81f not found: ID does not exist" Mar 17 02:15:21 crc kubenswrapper[4755]: I0317 02:15:21.816919 4755 scope.go:117] "RemoveContainer" containerID="fbb22fa8cb0cab12a885557b1f627412e08f8d8757ae627211ad99242d63c928" Mar 17 02:15:21 crc kubenswrapper[4755]: E0317 02:15:21.817191 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbb22fa8cb0cab12a885557b1f627412e08f8d8757ae627211ad99242d63c928\": container with ID starting with fbb22fa8cb0cab12a885557b1f627412e08f8d8757ae627211ad99242d63c928 not found: ID does not exist" containerID="fbb22fa8cb0cab12a885557b1f627412e08f8d8757ae627211ad99242d63c928" Mar 17 02:15:21 crc 
kubenswrapper[4755]: I0317 02:15:21.817223 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbb22fa8cb0cab12a885557b1f627412e08f8d8757ae627211ad99242d63c928"} err="failed to get container status \"fbb22fa8cb0cab12a885557b1f627412e08f8d8757ae627211ad99242d63c928\": rpc error: code = NotFound desc = could not find container \"fbb22fa8cb0cab12a885557b1f627412e08f8d8757ae627211ad99242d63c928\": container with ID starting with fbb22fa8cb0cab12a885557b1f627412e08f8d8757ae627211ad99242d63c928 not found: ID does not exist" Mar 17 02:15:22 crc kubenswrapper[4755]: I0317 02:15:22.259540 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca03540e-1ec8-4206-b4cf-349fac0bc672" path="/var/lib/kubelet/pods/ca03540e-1ec8-4206-b4cf-349fac0bc672/volumes" Mar 17 02:15:26 crc kubenswrapper[4755]: I0317 02:15:26.263601 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:15:26 crc kubenswrapper[4755]: E0317 02:15:26.264782 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:15:39 crc kubenswrapper[4755]: I0317 02:15:39.248758 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:15:39 crc kubenswrapper[4755]: E0317 02:15:39.249527 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:15:54 crc kubenswrapper[4755]: I0317 02:15:54.248922 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:15:54 crc kubenswrapper[4755]: E0317 02:15:54.250168 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.155957 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561896-z6wmk"] Mar 17 02:16:00 crc kubenswrapper[4755]: E0317 02:16:00.156947 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca03540e-1ec8-4206-b4cf-349fac0bc672" containerName="extract-content" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.156963 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca03540e-1ec8-4206-b4cf-349fac0bc672" containerName="extract-content" Mar 17 02:16:00 crc kubenswrapper[4755]: E0317 02:16:00.156983 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca03540e-1ec8-4206-b4cf-349fac0bc672" containerName="registry-server" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.156991 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca03540e-1ec8-4206-b4cf-349fac0bc672" containerName="registry-server" Mar 17 02:16:00 crc kubenswrapper[4755]: E0317 02:16:00.157008 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6" containerName="registry-server" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.157016 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6" containerName="registry-server" Mar 17 02:16:00 crc kubenswrapper[4755]: E0317 02:16:00.157042 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca03540e-1ec8-4206-b4cf-349fac0bc672" containerName="extract-utilities" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.157049 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca03540e-1ec8-4206-b4cf-349fac0bc672" containerName="extract-utilities" Mar 17 02:16:00 crc kubenswrapper[4755]: E0317 02:16:00.157063 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6" containerName="extract-content" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.157071 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6" containerName="extract-content" Mar 17 02:16:00 crc kubenswrapper[4755]: E0317 02:16:00.157096 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6" containerName="extract-utilities" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.157105 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6" containerName="extract-utilities" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.157350 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca03540e-1ec8-4206-b4cf-349fac0bc672" containerName="registry-server" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.157392 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef819a6-9f4b-4b41-bd62-e9c8c133a0d6" containerName="registry-server" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.158331 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561896-z6wmk" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.160885 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.160928 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.162476 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.176569 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561896-z6wmk"] Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.285133 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzqtr\" (UniqueName: \"kubernetes.io/projected/af1148c0-e5c9-4aab-b273-91bbdfeb062b-kube-api-access-kzqtr\") pod \"auto-csr-approver-29561896-z6wmk\" (UID: \"af1148c0-e5c9-4aab-b273-91bbdfeb062b\") " pod="openshift-infra/auto-csr-approver-29561896-z6wmk" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.388183 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzqtr\" (UniqueName: \"kubernetes.io/projected/af1148c0-e5c9-4aab-b273-91bbdfeb062b-kube-api-access-kzqtr\") pod \"auto-csr-approver-29561896-z6wmk\" (UID: \"af1148c0-e5c9-4aab-b273-91bbdfeb062b\") " pod="openshift-infra/auto-csr-approver-29561896-z6wmk" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.420918 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzqtr\" (UniqueName: \"kubernetes.io/projected/af1148c0-e5c9-4aab-b273-91bbdfeb062b-kube-api-access-kzqtr\") pod \"auto-csr-approver-29561896-z6wmk\" (UID: \"af1148c0-e5c9-4aab-b273-91bbdfeb062b\") " 
pod="openshift-infra/auto-csr-approver-29561896-z6wmk" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.483580 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561896-z6wmk" Mar 17 02:16:00 crc kubenswrapper[4755]: I0317 02:16:00.974061 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561896-z6wmk"] Mar 17 02:16:00 crc kubenswrapper[4755]: W0317 02:16:00.979302 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf1148c0_e5c9_4aab_b273_91bbdfeb062b.slice/crio-53fab39f8f95848ad142b54e7c74f024e5d90098cdbbb32e560acf79ea7db353 WatchSource:0}: Error finding container 53fab39f8f95848ad142b54e7c74f024e5d90098cdbbb32e560acf79ea7db353: Status 404 returned error can't find the container with id 53fab39f8f95848ad142b54e7c74f024e5d90098cdbbb32e560acf79ea7db353 Mar 17 02:16:01 crc kubenswrapper[4755]: I0317 02:16:01.224635 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561896-z6wmk" event={"ID":"af1148c0-e5c9-4aab-b273-91bbdfeb062b","Type":"ContainerStarted","Data":"53fab39f8f95848ad142b54e7c74f024e5d90098cdbbb32e560acf79ea7db353"} Mar 17 02:16:02 crc kubenswrapper[4755]: I0317 02:16:02.236174 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561896-z6wmk" event={"ID":"af1148c0-e5c9-4aab-b273-91bbdfeb062b","Type":"ContainerStarted","Data":"e21bdc07bd33d28639ecd6680f7f302cf666a91c88b3b9f96f8550df6d06ae90"} Mar 17 02:16:02 crc kubenswrapper[4755]: I0317 02:16:02.263705 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561896-z6wmk" podStartSLOduration=1.317699012 podStartE2EDuration="2.263619s" podCreationTimestamp="2026-03-17 02:16:00 +0000 UTC" firstStartedPulling="2026-03-17 02:16:00.986751814 +0000 UTC 
m=+6835.746204137" lastFinishedPulling="2026-03-17 02:16:01.932671842 +0000 UTC m=+6836.692124125" observedRunningTime="2026-03-17 02:16:02.253255641 +0000 UTC m=+6837.012707944" watchObservedRunningTime="2026-03-17 02:16:02.263619 +0000 UTC m=+6837.023071303" Mar 17 02:16:03 crc kubenswrapper[4755]: I0317 02:16:03.249871 4755 generic.go:334] "Generic (PLEG): container finished" podID="af1148c0-e5c9-4aab-b273-91bbdfeb062b" containerID="e21bdc07bd33d28639ecd6680f7f302cf666a91c88b3b9f96f8550df6d06ae90" exitCode=0 Mar 17 02:16:03 crc kubenswrapper[4755]: I0317 02:16:03.249947 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561896-z6wmk" event={"ID":"af1148c0-e5c9-4aab-b273-91bbdfeb062b","Type":"ContainerDied","Data":"e21bdc07bd33d28639ecd6680f7f302cf666a91c88b3b9f96f8550df6d06ae90"} Mar 17 02:16:04 crc kubenswrapper[4755]: I0317 02:16:04.769748 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561896-z6wmk" Mar 17 02:16:04 crc kubenswrapper[4755]: I0317 02:16:04.923225 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzqtr\" (UniqueName: \"kubernetes.io/projected/af1148c0-e5c9-4aab-b273-91bbdfeb062b-kube-api-access-kzqtr\") pod \"af1148c0-e5c9-4aab-b273-91bbdfeb062b\" (UID: \"af1148c0-e5c9-4aab-b273-91bbdfeb062b\") " Mar 17 02:16:04 crc kubenswrapper[4755]: I0317 02:16:04.929950 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1148c0-e5c9-4aab-b273-91bbdfeb062b-kube-api-access-kzqtr" (OuterVolumeSpecName: "kube-api-access-kzqtr") pod "af1148c0-e5c9-4aab-b273-91bbdfeb062b" (UID: "af1148c0-e5c9-4aab-b273-91bbdfeb062b"). InnerVolumeSpecName "kube-api-access-kzqtr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:16:05 crc kubenswrapper[4755]: I0317 02:16:05.025903 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzqtr\" (UniqueName: \"kubernetes.io/projected/af1148c0-e5c9-4aab-b273-91bbdfeb062b-kube-api-access-kzqtr\") on node \"crc\" DevicePath \"\"" Mar 17 02:16:05 crc kubenswrapper[4755]: I0317 02:16:05.275986 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561896-z6wmk" event={"ID":"af1148c0-e5c9-4aab-b273-91bbdfeb062b","Type":"ContainerDied","Data":"53fab39f8f95848ad142b54e7c74f024e5d90098cdbbb32e560acf79ea7db353"} Mar 17 02:16:05 crc kubenswrapper[4755]: I0317 02:16:05.276033 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53fab39f8f95848ad142b54e7c74f024e5d90098cdbbb32e560acf79ea7db353" Mar 17 02:16:05 crc kubenswrapper[4755]: I0317 02:16:05.276052 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561896-z6wmk" Mar 17 02:16:05 crc kubenswrapper[4755]: I0317 02:16:05.356088 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561890-w2fqk"] Mar 17 02:16:05 crc kubenswrapper[4755]: I0317 02:16:05.368066 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561890-w2fqk"] Mar 17 02:16:06 crc kubenswrapper[4755]: I0317 02:16:06.269304 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83f0fb4d-f819-4bf2-993f-330ffcdad086" path="/var/lib/kubelet/pods/83f0fb4d-f819-4bf2-993f-330ffcdad086/volumes" Mar 17 02:16:07 crc kubenswrapper[4755]: I0317 02:16:07.248866 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:16:07 crc kubenswrapper[4755]: E0317 02:16:07.249405 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:16:19 crc kubenswrapper[4755]: I0317 02:16:19.248887 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:16:19 crc kubenswrapper[4755]: E0317 02:16:19.250016 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:16:20 crc kubenswrapper[4755]: I0317 02:16:20.166099 4755 scope.go:117] "RemoveContainer" containerID="c7ed3698593aae5f609d8f6d38b402eafb6e89d50da2d707671e05719c6af4d0" Mar 17 02:16:34 crc kubenswrapper[4755]: I0317 02:16:34.248855 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:16:34 crc kubenswrapper[4755]: E0317 02:16:34.250122 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:16:49 crc kubenswrapper[4755]: I0317 02:16:49.248481 4755 scope.go:117] "RemoveContainer" 
containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:16:49 crc kubenswrapper[4755]: E0317 02:16:49.249188 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:17:00 crc kubenswrapper[4755]: I0317 02:17:00.251125 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:17:00 crc kubenswrapper[4755]: E0317 02:17:00.252254 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:17:12 crc kubenswrapper[4755]: I0317 02:17:12.249097 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:17:12 crc kubenswrapper[4755]: E0317 02:17:12.249954 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:17:26 crc kubenswrapper[4755]: I0317 02:17:26.264933 4755 scope.go:117] 
"RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:17:26 crc kubenswrapper[4755]: E0317 02:17:26.266290 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:17:38 crc kubenswrapper[4755]: I0317 02:17:38.248976 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:17:38 crc kubenswrapper[4755]: E0317 02:17:38.250216 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:17:51 crc kubenswrapper[4755]: I0317 02:17:51.248729 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:17:51 crc kubenswrapper[4755]: E0317 02:17:51.249391 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:18:00 crc kubenswrapper[4755]: I0317 02:18:00.176026 
4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561898-gg8mq"] Mar 17 02:18:00 crc kubenswrapper[4755]: E0317 02:18:00.177418 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1148c0-e5c9-4aab-b273-91bbdfeb062b" containerName="oc" Mar 17 02:18:00 crc kubenswrapper[4755]: I0317 02:18:00.177466 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1148c0-e5c9-4aab-b273-91bbdfeb062b" containerName="oc" Mar 17 02:18:00 crc kubenswrapper[4755]: I0317 02:18:00.177874 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1148c0-e5c9-4aab-b273-91bbdfeb062b" containerName="oc" Mar 17 02:18:00 crc kubenswrapper[4755]: I0317 02:18:00.179108 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561898-gg8mq" Mar 17 02:18:00 crc kubenswrapper[4755]: I0317 02:18:00.186554 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561898-gg8mq"] Mar 17 02:18:00 crc kubenswrapper[4755]: I0317 02:18:00.203622 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:18:00 crc kubenswrapper[4755]: I0317 02:18:00.203759 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:18:00 crc kubenswrapper[4755]: I0317 02:18:00.205028 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:18:00 crc kubenswrapper[4755]: I0317 02:18:00.292923 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8gtz\" (UniqueName: \"kubernetes.io/projected/6ff717d6-72df-477f-adb7-abb753ea464d-kube-api-access-f8gtz\") pod \"auto-csr-approver-29561898-gg8mq\" (UID: \"6ff717d6-72df-477f-adb7-abb753ea464d\") " 
pod="openshift-infra/auto-csr-approver-29561898-gg8mq" Mar 17 02:18:00 crc kubenswrapper[4755]: I0317 02:18:00.396061 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8gtz\" (UniqueName: \"kubernetes.io/projected/6ff717d6-72df-477f-adb7-abb753ea464d-kube-api-access-f8gtz\") pod \"auto-csr-approver-29561898-gg8mq\" (UID: \"6ff717d6-72df-477f-adb7-abb753ea464d\") " pod="openshift-infra/auto-csr-approver-29561898-gg8mq" Mar 17 02:18:00 crc kubenswrapper[4755]: I0317 02:18:00.431884 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8gtz\" (UniqueName: \"kubernetes.io/projected/6ff717d6-72df-477f-adb7-abb753ea464d-kube-api-access-f8gtz\") pod \"auto-csr-approver-29561898-gg8mq\" (UID: \"6ff717d6-72df-477f-adb7-abb753ea464d\") " pod="openshift-infra/auto-csr-approver-29561898-gg8mq" Mar 17 02:18:00 crc kubenswrapper[4755]: I0317 02:18:00.525990 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561898-gg8mq" Mar 17 02:18:01 crc kubenswrapper[4755]: I0317 02:18:01.098663 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561898-gg8mq"] Mar 17 02:18:01 crc kubenswrapper[4755]: I0317 02:18:01.710963 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561898-gg8mq" event={"ID":"6ff717d6-72df-477f-adb7-abb753ea464d","Type":"ContainerStarted","Data":"51ee270e3a27371a3bd58614faeba81dd7ce3eb6052f49edea9d22a701908d6e"} Mar 17 02:18:02 crc kubenswrapper[4755]: I0317 02:18:02.727112 4755 generic.go:334] "Generic (PLEG): container finished" podID="6ff717d6-72df-477f-adb7-abb753ea464d" containerID="1e3d6d49001395af1d061f1a5eff7da46b7452deb74a4ff41772cb32608331f9" exitCode=0 Mar 17 02:18:02 crc kubenswrapper[4755]: I0317 02:18:02.727240 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29561898-gg8mq" event={"ID":"6ff717d6-72df-477f-adb7-abb753ea464d","Type":"ContainerDied","Data":"1e3d6d49001395af1d061f1a5eff7da46b7452deb74a4ff41772cb32608331f9"} Mar 17 02:18:04 crc kubenswrapper[4755]: I0317 02:18:04.166278 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561898-gg8mq" Mar 17 02:18:04 crc kubenswrapper[4755]: I0317 02:18:04.299832 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8gtz\" (UniqueName: \"kubernetes.io/projected/6ff717d6-72df-477f-adb7-abb753ea464d-kube-api-access-f8gtz\") pod \"6ff717d6-72df-477f-adb7-abb753ea464d\" (UID: \"6ff717d6-72df-477f-adb7-abb753ea464d\") " Mar 17 02:18:04 crc kubenswrapper[4755]: I0317 02:18:04.306793 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ff717d6-72df-477f-adb7-abb753ea464d-kube-api-access-f8gtz" (OuterVolumeSpecName: "kube-api-access-f8gtz") pod "6ff717d6-72df-477f-adb7-abb753ea464d" (UID: "6ff717d6-72df-477f-adb7-abb753ea464d"). InnerVolumeSpecName "kube-api-access-f8gtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:18:04 crc kubenswrapper[4755]: I0317 02:18:04.403707 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8gtz\" (UniqueName: \"kubernetes.io/projected/6ff717d6-72df-477f-adb7-abb753ea464d-kube-api-access-f8gtz\") on node \"crc\" DevicePath \"\"" Mar 17 02:18:04 crc kubenswrapper[4755]: I0317 02:18:04.753713 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561898-gg8mq" event={"ID":"6ff717d6-72df-477f-adb7-abb753ea464d","Type":"ContainerDied","Data":"51ee270e3a27371a3bd58614faeba81dd7ce3eb6052f49edea9d22a701908d6e"} Mar 17 02:18:04 crc kubenswrapper[4755]: I0317 02:18:04.753756 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561898-gg8mq" Mar 17 02:18:04 crc kubenswrapper[4755]: I0317 02:18:04.753765 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51ee270e3a27371a3bd58614faeba81dd7ce3eb6052f49edea9d22a701908d6e" Mar 17 02:18:05 crc kubenswrapper[4755]: I0317 02:18:05.248884 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:18:05 crc kubenswrapper[4755]: E0317 02:18:05.249518 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:18:05 crc kubenswrapper[4755]: I0317 02:18:05.268657 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561892-vwwc8"] Mar 17 02:18:05 crc kubenswrapper[4755]: I0317 02:18:05.280713 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561892-vwwc8"] Mar 17 02:18:06 crc kubenswrapper[4755]: I0317 02:18:06.276384 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49078958-034c-4e30-bf7a-8a8d455564dc" path="/var/lib/kubelet/pods/49078958-034c-4e30-bf7a-8a8d455564dc/volumes" Mar 17 02:18:20 crc kubenswrapper[4755]: I0317 02:18:20.249665 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:18:20 crc kubenswrapper[4755]: E0317 02:18:20.250922 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:18:20 crc kubenswrapper[4755]: I0317 02:18:20.332545 4755 scope.go:117] "RemoveContainer" containerID="4686ccb3ff9344edd2ca51666db01c40d785711eb3cb2c000b7e4dbdc8ea2a77" Mar 17 02:18:32 crc kubenswrapper[4755]: I0317 02:18:32.251668 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:18:33 crc kubenswrapper[4755]: I0317 02:18:33.137194 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"46bcb3c470e5bd80eee0a58f81aa80080bfaa02fe9e8720b2e5734b89c809018"} Mar 17 02:20:00 crc kubenswrapper[4755]: I0317 02:20:00.152760 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561900-bn848"] Mar 17 02:20:00 crc kubenswrapper[4755]: E0317 02:20:00.153743 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff717d6-72df-477f-adb7-abb753ea464d" containerName="oc" Mar 17 02:20:00 crc kubenswrapper[4755]: I0317 02:20:00.153760 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff717d6-72df-477f-adb7-abb753ea464d" containerName="oc" Mar 17 02:20:00 crc kubenswrapper[4755]: I0317 02:20:00.153982 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff717d6-72df-477f-adb7-abb753ea464d" containerName="oc" Mar 17 02:20:00 crc kubenswrapper[4755]: I0317 02:20:00.155502 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561900-bn848" Mar 17 02:20:00 crc kubenswrapper[4755]: I0317 02:20:00.158743 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:20:00 crc kubenswrapper[4755]: I0317 02:20:00.159004 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:20:00 crc kubenswrapper[4755]: I0317 02:20:00.160434 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:20:00 crc kubenswrapper[4755]: I0317 02:20:00.190565 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561900-bn848"] Mar 17 02:20:00 crc kubenswrapper[4755]: I0317 02:20:00.223493 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns7sm\" (UniqueName: \"kubernetes.io/projected/b154988e-c167-408a-afc3-f18c6088686c-kube-api-access-ns7sm\") pod \"auto-csr-approver-29561900-bn848\" (UID: \"b154988e-c167-408a-afc3-f18c6088686c\") " pod="openshift-infra/auto-csr-approver-29561900-bn848" Mar 17 02:20:00 crc kubenswrapper[4755]: I0317 02:20:00.324688 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns7sm\" (UniqueName: \"kubernetes.io/projected/b154988e-c167-408a-afc3-f18c6088686c-kube-api-access-ns7sm\") pod \"auto-csr-approver-29561900-bn848\" (UID: \"b154988e-c167-408a-afc3-f18c6088686c\") " pod="openshift-infra/auto-csr-approver-29561900-bn848" Mar 17 02:20:00 crc kubenswrapper[4755]: I0317 02:20:00.351094 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns7sm\" (UniqueName: \"kubernetes.io/projected/b154988e-c167-408a-afc3-f18c6088686c-kube-api-access-ns7sm\") pod \"auto-csr-approver-29561900-bn848\" (UID: \"b154988e-c167-408a-afc3-f18c6088686c\") " 
pod="openshift-infra/auto-csr-approver-29561900-bn848" Mar 17 02:20:00 crc kubenswrapper[4755]: I0317 02:20:00.498193 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561900-bn848" Mar 17 02:20:01 crc kubenswrapper[4755]: I0317 02:20:01.065223 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561900-bn848"] Mar 17 02:20:01 crc kubenswrapper[4755]: I0317 02:20:01.328283 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561900-bn848" event={"ID":"b154988e-c167-408a-afc3-f18c6088686c","Type":"ContainerStarted","Data":"60ae3eb56da998473ca8f6f987814b1fc17ee684b69bb502af46ee38232fc0c3"} Mar 17 02:20:03 crc kubenswrapper[4755]: I0317 02:20:03.367955 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561900-bn848" event={"ID":"b154988e-c167-408a-afc3-f18c6088686c","Type":"ContainerDied","Data":"392a17e9dc4804241545415bcd4ee2db61b2210aeb9f26b515db978c4668df77"} Mar 17 02:20:03 crc kubenswrapper[4755]: I0317 02:20:03.367781 4755 generic.go:334] "Generic (PLEG): container finished" podID="b154988e-c167-408a-afc3-f18c6088686c" containerID="392a17e9dc4804241545415bcd4ee2db61b2210aeb9f26b515db978c4668df77" exitCode=0 Mar 17 02:20:04 crc kubenswrapper[4755]: I0317 02:20:04.915565 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561900-bn848" Mar 17 02:20:04 crc kubenswrapper[4755]: I0317 02:20:04.939935 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns7sm\" (UniqueName: \"kubernetes.io/projected/b154988e-c167-408a-afc3-f18c6088686c-kube-api-access-ns7sm\") pod \"b154988e-c167-408a-afc3-f18c6088686c\" (UID: \"b154988e-c167-408a-afc3-f18c6088686c\") " Mar 17 02:20:04 crc kubenswrapper[4755]: I0317 02:20:04.950746 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b154988e-c167-408a-afc3-f18c6088686c-kube-api-access-ns7sm" (OuterVolumeSpecName: "kube-api-access-ns7sm") pod "b154988e-c167-408a-afc3-f18c6088686c" (UID: "b154988e-c167-408a-afc3-f18c6088686c"). InnerVolumeSpecName "kube-api-access-ns7sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:20:05 crc kubenswrapper[4755]: I0317 02:20:05.043196 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns7sm\" (UniqueName: \"kubernetes.io/projected/b154988e-c167-408a-afc3-f18c6088686c-kube-api-access-ns7sm\") on node \"crc\" DevicePath \"\"" Mar 17 02:20:05 crc kubenswrapper[4755]: I0317 02:20:05.399655 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561900-bn848" event={"ID":"b154988e-c167-408a-afc3-f18c6088686c","Type":"ContainerDied","Data":"60ae3eb56da998473ca8f6f987814b1fc17ee684b69bb502af46ee38232fc0c3"} Mar 17 02:20:05 crc kubenswrapper[4755]: I0317 02:20:05.399720 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60ae3eb56da998473ca8f6f987814b1fc17ee684b69bb502af46ee38232fc0c3" Mar 17 02:20:05 crc kubenswrapper[4755]: I0317 02:20:05.399772 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561900-bn848" Mar 17 02:20:06 crc kubenswrapper[4755]: I0317 02:20:06.024594 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561894-wbwrx"] Mar 17 02:20:06 crc kubenswrapper[4755]: I0317 02:20:06.044082 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561894-wbwrx"] Mar 17 02:20:06 crc kubenswrapper[4755]: I0317 02:20:06.275726 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07e2dbc4-2f19-4595-b789-827ae99a8fde" path="/var/lib/kubelet/pods/07e2dbc4-2f19-4595-b789-827ae99a8fde/volumes" Mar 17 02:20:20 crc kubenswrapper[4755]: I0317 02:20:20.501580 4755 scope.go:117] "RemoveContainer" containerID="b28e2b6c968677e9d549a8719247ee8d18b1c5b2c672eca71369e8e332df21ea" Mar 17 02:20:46 crc kubenswrapper[4755]: I0317 02:20:46.206921 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j5r5z"] Mar 17 02:20:46 crc kubenswrapper[4755]: E0317 02:20:46.208080 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b154988e-c167-408a-afc3-f18c6088686c" containerName="oc" Mar 17 02:20:46 crc kubenswrapper[4755]: I0317 02:20:46.208098 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b154988e-c167-408a-afc3-f18c6088686c" containerName="oc" Mar 17 02:20:46 crc kubenswrapper[4755]: I0317 02:20:46.208383 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b154988e-c167-408a-afc3-f18c6088686c" containerName="oc" Mar 17 02:20:46 crc kubenswrapper[4755]: I0317 02:20:46.210500 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j5r5z" Mar 17 02:20:46 crc kubenswrapper[4755]: I0317 02:20:46.223773 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j5r5z"] Mar 17 02:20:46 crc kubenswrapper[4755]: I0317 02:20:46.364956 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8942b7b9-b3c7-460c-8885-0de90859fee2-utilities\") pod \"community-operators-j5r5z\" (UID: \"8942b7b9-b3c7-460c-8885-0de90859fee2\") " pod="openshift-marketplace/community-operators-j5r5z" Mar 17 02:20:46 crc kubenswrapper[4755]: I0317 02:20:46.365268 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8942b7b9-b3c7-460c-8885-0de90859fee2-catalog-content\") pod \"community-operators-j5r5z\" (UID: \"8942b7b9-b3c7-460c-8885-0de90859fee2\") " pod="openshift-marketplace/community-operators-j5r5z" Mar 17 02:20:46 crc kubenswrapper[4755]: I0317 02:20:46.365348 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft8fw\" (UniqueName: \"kubernetes.io/projected/8942b7b9-b3c7-460c-8885-0de90859fee2-kube-api-access-ft8fw\") pod \"community-operators-j5r5z\" (UID: \"8942b7b9-b3c7-460c-8885-0de90859fee2\") " pod="openshift-marketplace/community-operators-j5r5z" Mar 17 02:20:46 crc kubenswrapper[4755]: I0317 02:20:46.467160 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8942b7b9-b3c7-460c-8885-0de90859fee2-catalog-content\") pod \"community-operators-j5r5z\" (UID: \"8942b7b9-b3c7-460c-8885-0de90859fee2\") " pod="openshift-marketplace/community-operators-j5r5z" Mar 17 02:20:46 crc kubenswrapper[4755]: I0317 02:20:46.467214 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ft8fw\" (UniqueName: \"kubernetes.io/projected/8942b7b9-b3c7-460c-8885-0de90859fee2-kube-api-access-ft8fw\") pod \"community-operators-j5r5z\" (UID: \"8942b7b9-b3c7-460c-8885-0de90859fee2\") " pod="openshift-marketplace/community-operators-j5r5z" Mar 17 02:20:46 crc kubenswrapper[4755]: I0317 02:20:46.467392 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8942b7b9-b3c7-460c-8885-0de90859fee2-utilities\") pod \"community-operators-j5r5z\" (UID: \"8942b7b9-b3c7-460c-8885-0de90859fee2\") " pod="openshift-marketplace/community-operators-j5r5z" Mar 17 02:20:46 crc kubenswrapper[4755]: I0317 02:20:46.467835 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8942b7b9-b3c7-460c-8885-0de90859fee2-utilities\") pod \"community-operators-j5r5z\" (UID: \"8942b7b9-b3c7-460c-8885-0de90859fee2\") " pod="openshift-marketplace/community-operators-j5r5z" Mar 17 02:20:46 crc kubenswrapper[4755]: I0317 02:20:46.468062 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8942b7b9-b3c7-460c-8885-0de90859fee2-catalog-content\") pod \"community-operators-j5r5z\" (UID: \"8942b7b9-b3c7-460c-8885-0de90859fee2\") " pod="openshift-marketplace/community-operators-j5r5z" Mar 17 02:20:46 crc kubenswrapper[4755]: I0317 02:20:46.488801 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft8fw\" (UniqueName: \"kubernetes.io/projected/8942b7b9-b3c7-460c-8885-0de90859fee2-kube-api-access-ft8fw\") pod \"community-operators-j5r5z\" (UID: \"8942b7b9-b3c7-460c-8885-0de90859fee2\") " pod="openshift-marketplace/community-operators-j5r5z" Mar 17 02:20:46 crc kubenswrapper[4755]: I0317 02:20:46.581502 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j5r5z" Mar 17 02:20:47 crc kubenswrapper[4755]: I0317 02:20:47.170323 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j5r5z"] Mar 17 02:20:47 crc kubenswrapper[4755]: I0317 02:20:47.952626 4755 generic.go:334] "Generic (PLEG): container finished" podID="8942b7b9-b3c7-460c-8885-0de90859fee2" containerID="972ecd5d462355e164739edb301c53e07d54c2ef1db3ce68c92cc3785772ea37" exitCode=0 Mar 17 02:20:47 crc kubenswrapper[4755]: I0317 02:20:47.952715 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5r5z" event={"ID":"8942b7b9-b3c7-460c-8885-0de90859fee2","Type":"ContainerDied","Data":"972ecd5d462355e164739edb301c53e07d54c2ef1db3ce68c92cc3785772ea37"} Mar 17 02:20:47 crc kubenswrapper[4755]: I0317 02:20:47.952881 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5r5z" event={"ID":"8942b7b9-b3c7-460c-8885-0de90859fee2","Type":"ContainerStarted","Data":"c9e72c267cc0537a511448ce674f7af38568b994a02eeeba64e09f62112780ad"} Mar 17 02:20:47 crc kubenswrapper[4755]: I0317 02:20:47.955647 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 02:20:48 crc kubenswrapper[4755]: I0317 02:20:48.965585 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5r5z" event={"ID":"8942b7b9-b3c7-460c-8885-0de90859fee2","Type":"ContainerStarted","Data":"69de0d7531878d1d9a16e23cdb990244533af9af746af463285270323bd2a561"} Mar 17 02:20:50 crc kubenswrapper[4755]: I0317 02:20:50.989398 4755 generic.go:334] "Generic (PLEG): container finished" podID="8942b7b9-b3c7-460c-8885-0de90859fee2" containerID="69de0d7531878d1d9a16e23cdb990244533af9af746af463285270323bd2a561" exitCode=0 Mar 17 02:20:50 crc kubenswrapper[4755]: I0317 02:20:50.989574 4755 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-j5r5z" event={"ID":"8942b7b9-b3c7-460c-8885-0de90859fee2","Type":"ContainerDied","Data":"69de0d7531878d1d9a16e23cdb990244533af9af746af463285270323bd2a561"} Mar 17 02:20:52 crc kubenswrapper[4755]: I0317 02:20:52.006296 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5r5z" event={"ID":"8942b7b9-b3c7-460c-8885-0de90859fee2","Type":"ContainerStarted","Data":"fa92ae6cbde014e7c3317277014c932ece521a861e4e0c6006bfa3fad82679ba"} Mar 17 02:20:52 crc kubenswrapper[4755]: I0317 02:20:52.043119 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j5r5z" podStartSLOduration=2.560995777 podStartE2EDuration="6.043095193s" podCreationTimestamp="2026-03-17 02:20:46 +0000 UTC" firstStartedPulling="2026-03-17 02:20:47.955244806 +0000 UTC m=+7122.714697089" lastFinishedPulling="2026-03-17 02:20:51.437344182 +0000 UTC m=+7126.196796505" observedRunningTime="2026-03-17 02:20:52.034776948 +0000 UTC m=+7126.794229261" watchObservedRunningTime="2026-03-17 02:20:52.043095193 +0000 UTC m=+7126.802547486" Mar 17 02:20:56 crc kubenswrapper[4755]: I0317 02:20:56.582137 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j5r5z" Mar 17 02:20:56 crc kubenswrapper[4755]: I0317 02:20:56.582920 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j5r5z" Mar 17 02:20:57 crc kubenswrapper[4755]: I0317 02:20:57.651771 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-j5r5z" podUID="8942b7b9-b3c7-460c-8885-0de90859fee2" containerName="registry-server" probeResult="failure" output=< Mar 17 02:20:57 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:20:57 crc kubenswrapper[4755]: > Mar 17 02:20:58 crc 
kubenswrapper[4755]: I0317 02:20:58.665635 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:20:58 crc kubenswrapper[4755]: I0317 02:20:58.665700 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:21:06 crc kubenswrapper[4755]: I0317 02:21:06.648691 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j5r5z" Mar 17 02:21:06 crc kubenswrapper[4755]: I0317 02:21:06.719131 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j5r5z" Mar 17 02:21:06 crc kubenswrapper[4755]: I0317 02:21:06.893954 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j5r5z"] Mar 17 02:21:08 crc kubenswrapper[4755]: I0317 02:21:08.200657 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j5r5z" podUID="8942b7b9-b3c7-460c-8885-0de90859fee2" containerName="registry-server" containerID="cri-o://fa92ae6cbde014e7c3317277014c932ece521a861e4e0c6006bfa3fad82679ba" gracePeriod=2 Mar 17 02:21:08 crc kubenswrapper[4755]: I0317 02:21:08.825670 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j5r5z" Mar 17 02:21:08 crc kubenswrapper[4755]: I0317 02:21:08.845168 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft8fw\" (UniqueName: \"kubernetes.io/projected/8942b7b9-b3c7-460c-8885-0de90859fee2-kube-api-access-ft8fw\") pod \"8942b7b9-b3c7-460c-8885-0de90859fee2\" (UID: \"8942b7b9-b3c7-460c-8885-0de90859fee2\") " Mar 17 02:21:08 crc kubenswrapper[4755]: I0317 02:21:08.845387 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8942b7b9-b3c7-460c-8885-0de90859fee2-catalog-content\") pod \"8942b7b9-b3c7-460c-8885-0de90859fee2\" (UID: \"8942b7b9-b3c7-460c-8885-0de90859fee2\") " Mar 17 02:21:08 crc kubenswrapper[4755]: I0317 02:21:08.845462 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8942b7b9-b3c7-460c-8885-0de90859fee2-utilities\") pod \"8942b7b9-b3c7-460c-8885-0de90859fee2\" (UID: \"8942b7b9-b3c7-460c-8885-0de90859fee2\") " Mar 17 02:21:08 crc kubenswrapper[4755]: I0317 02:21:08.846867 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8942b7b9-b3c7-460c-8885-0de90859fee2-utilities" (OuterVolumeSpecName: "utilities") pod "8942b7b9-b3c7-460c-8885-0de90859fee2" (UID: "8942b7b9-b3c7-460c-8885-0de90859fee2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:21:08 crc kubenswrapper[4755]: I0317 02:21:08.854283 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8942b7b9-b3c7-460c-8885-0de90859fee2-kube-api-access-ft8fw" (OuterVolumeSpecName: "kube-api-access-ft8fw") pod "8942b7b9-b3c7-460c-8885-0de90859fee2" (UID: "8942b7b9-b3c7-460c-8885-0de90859fee2"). InnerVolumeSpecName "kube-api-access-ft8fw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:21:08 crc kubenswrapper[4755]: I0317 02:21:08.922746 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8942b7b9-b3c7-460c-8885-0de90859fee2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8942b7b9-b3c7-460c-8885-0de90859fee2" (UID: "8942b7b9-b3c7-460c-8885-0de90859fee2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:21:08 crc kubenswrapper[4755]: I0317 02:21:08.947003 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8942b7b9-b3c7-460c-8885-0de90859fee2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:21:08 crc kubenswrapper[4755]: I0317 02:21:08.947039 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8942b7b9-b3c7-460c-8885-0de90859fee2-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:21:08 crc kubenswrapper[4755]: I0317 02:21:08.947051 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft8fw\" (UniqueName: \"kubernetes.io/projected/8942b7b9-b3c7-460c-8885-0de90859fee2-kube-api-access-ft8fw\") on node \"crc\" DevicePath \"\"" Mar 17 02:21:09 crc kubenswrapper[4755]: I0317 02:21:09.212481 4755 generic.go:334] "Generic (PLEG): container finished" podID="8942b7b9-b3c7-460c-8885-0de90859fee2" containerID="fa92ae6cbde014e7c3317277014c932ece521a861e4e0c6006bfa3fad82679ba" exitCode=0 Mar 17 02:21:09 crc kubenswrapper[4755]: I0317 02:21:09.212525 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5r5z" event={"ID":"8942b7b9-b3c7-460c-8885-0de90859fee2","Type":"ContainerDied","Data":"fa92ae6cbde014e7c3317277014c932ece521a861e4e0c6006bfa3fad82679ba"} Mar 17 02:21:09 crc kubenswrapper[4755]: I0317 02:21:09.212557 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-j5r5z" event={"ID":"8942b7b9-b3c7-460c-8885-0de90859fee2","Type":"ContainerDied","Data":"c9e72c267cc0537a511448ce674f7af38568b994a02eeeba64e09f62112780ad"} Mar 17 02:21:09 crc kubenswrapper[4755]: I0317 02:21:09.212579 4755 scope.go:117] "RemoveContainer" containerID="fa92ae6cbde014e7c3317277014c932ece521a861e4e0c6006bfa3fad82679ba" Mar 17 02:21:09 crc kubenswrapper[4755]: I0317 02:21:09.212587 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j5r5z" Mar 17 02:21:09 crc kubenswrapper[4755]: I0317 02:21:09.245367 4755 scope.go:117] "RemoveContainer" containerID="69de0d7531878d1d9a16e23cdb990244533af9af746af463285270323bd2a561" Mar 17 02:21:09 crc kubenswrapper[4755]: I0317 02:21:09.267679 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j5r5z"] Mar 17 02:21:09 crc kubenswrapper[4755]: I0317 02:21:09.273825 4755 scope.go:117] "RemoveContainer" containerID="972ecd5d462355e164739edb301c53e07d54c2ef1db3ce68c92cc3785772ea37" Mar 17 02:21:09 crc kubenswrapper[4755]: I0317 02:21:09.287616 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j5r5z"] Mar 17 02:21:09 crc kubenswrapper[4755]: I0317 02:21:09.342189 4755 scope.go:117] "RemoveContainer" containerID="fa92ae6cbde014e7c3317277014c932ece521a861e4e0c6006bfa3fad82679ba" Mar 17 02:21:09 crc kubenswrapper[4755]: E0317 02:21:09.342553 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa92ae6cbde014e7c3317277014c932ece521a861e4e0c6006bfa3fad82679ba\": container with ID starting with fa92ae6cbde014e7c3317277014c932ece521a861e4e0c6006bfa3fad82679ba not found: ID does not exist" containerID="fa92ae6cbde014e7c3317277014c932ece521a861e4e0c6006bfa3fad82679ba" Mar 17 02:21:09 crc kubenswrapper[4755]: I0317 
02:21:09.342589 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa92ae6cbde014e7c3317277014c932ece521a861e4e0c6006bfa3fad82679ba"} err="failed to get container status \"fa92ae6cbde014e7c3317277014c932ece521a861e4e0c6006bfa3fad82679ba\": rpc error: code = NotFound desc = could not find container \"fa92ae6cbde014e7c3317277014c932ece521a861e4e0c6006bfa3fad82679ba\": container with ID starting with fa92ae6cbde014e7c3317277014c932ece521a861e4e0c6006bfa3fad82679ba not found: ID does not exist" Mar 17 02:21:09 crc kubenswrapper[4755]: I0317 02:21:09.342612 4755 scope.go:117] "RemoveContainer" containerID="69de0d7531878d1d9a16e23cdb990244533af9af746af463285270323bd2a561" Mar 17 02:21:09 crc kubenswrapper[4755]: E0317 02:21:09.342857 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69de0d7531878d1d9a16e23cdb990244533af9af746af463285270323bd2a561\": container with ID starting with 69de0d7531878d1d9a16e23cdb990244533af9af746af463285270323bd2a561 not found: ID does not exist" containerID="69de0d7531878d1d9a16e23cdb990244533af9af746af463285270323bd2a561" Mar 17 02:21:09 crc kubenswrapper[4755]: I0317 02:21:09.342878 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69de0d7531878d1d9a16e23cdb990244533af9af746af463285270323bd2a561"} err="failed to get container status \"69de0d7531878d1d9a16e23cdb990244533af9af746af463285270323bd2a561\": rpc error: code = NotFound desc = could not find container \"69de0d7531878d1d9a16e23cdb990244533af9af746af463285270323bd2a561\": container with ID starting with 69de0d7531878d1d9a16e23cdb990244533af9af746af463285270323bd2a561 not found: ID does not exist" Mar 17 02:21:09 crc kubenswrapper[4755]: I0317 02:21:09.342894 4755 scope.go:117] "RemoveContainer" containerID="972ecd5d462355e164739edb301c53e07d54c2ef1db3ce68c92cc3785772ea37" Mar 17 02:21:09 crc 
kubenswrapper[4755]: E0317 02:21:09.343124 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"972ecd5d462355e164739edb301c53e07d54c2ef1db3ce68c92cc3785772ea37\": container with ID starting with 972ecd5d462355e164739edb301c53e07d54c2ef1db3ce68c92cc3785772ea37 not found: ID does not exist" containerID="972ecd5d462355e164739edb301c53e07d54c2ef1db3ce68c92cc3785772ea37" Mar 17 02:21:09 crc kubenswrapper[4755]: I0317 02:21:09.343151 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"972ecd5d462355e164739edb301c53e07d54c2ef1db3ce68c92cc3785772ea37"} err="failed to get container status \"972ecd5d462355e164739edb301c53e07d54c2ef1db3ce68c92cc3785772ea37\": rpc error: code = NotFound desc = could not find container \"972ecd5d462355e164739edb301c53e07d54c2ef1db3ce68c92cc3785772ea37\": container with ID starting with 972ecd5d462355e164739edb301c53e07d54c2ef1db3ce68c92cc3785772ea37 not found: ID does not exist" Mar 17 02:21:10 crc kubenswrapper[4755]: I0317 02:21:10.266160 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8942b7b9-b3c7-460c-8885-0de90859fee2" path="/var/lib/kubelet/pods/8942b7b9-b3c7-460c-8885-0de90859fee2/volumes" Mar 17 02:21:27 crc kubenswrapper[4755]: I0317 02:21:27.634867 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qzkmq"] Mar 17 02:21:27 crc kubenswrapper[4755]: E0317 02:21:27.636423 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8942b7b9-b3c7-460c-8885-0de90859fee2" containerName="registry-server" Mar 17 02:21:27 crc kubenswrapper[4755]: I0317 02:21:27.636485 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8942b7b9-b3c7-460c-8885-0de90859fee2" containerName="registry-server" Mar 17 02:21:27 crc kubenswrapper[4755]: E0317 02:21:27.636551 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8942b7b9-b3c7-460c-8885-0de90859fee2" containerName="extract-utilities" Mar 17 02:21:27 crc kubenswrapper[4755]: I0317 02:21:27.636565 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8942b7b9-b3c7-460c-8885-0de90859fee2" containerName="extract-utilities" Mar 17 02:21:27 crc kubenswrapper[4755]: E0317 02:21:27.636590 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8942b7b9-b3c7-460c-8885-0de90859fee2" containerName="extract-content" Mar 17 02:21:27 crc kubenswrapper[4755]: I0317 02:21:27.636605 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8942b7b9-b3c7-460c-8885-0de90859fee2" containerName="extract-content" Mar 17 02:21:27 crc kubenswrapper[4755]: I0317 02:21:27.637003 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8942b7b9-b3c7-460c-8885-0de90859fee2" containerName="registry-server" Mar 17 02:21:27 crc kubenswrapper[4755]: I0317 02:21:27.639910 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qzkmq" Mar 17 02:21:27 crc kubenswrapper[4755]: I0317 02:21:27.648867 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qzkmq"] Mar 17 02:21:27 crc kubenswrapper[4755]: I0317 02:21:27.729151 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t2bw\" (UniqueName: \"kubernetes.io/projected/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-kube-api-access-4t2bw\") pod \"redhat-operators-qzkmq\" (UID: \"f04565e2-494f-4cb0-83f8-ab6dc5c5df95\") " pod="openshift-marketplace/redhat-operators-qzkmq" Mar 17 02:21:27 crc kubenswrapper[4755]: I0317 02:21:27.729662 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-utilities\") pod \"redhat-operators-qzkmq\" (UID: 
\"f04565e2-494f-4cb0-83f8-ab6dc5c5df95\") " pod="openshift-marketplace/redhat-operators-qzkmq" Mar 17 02:21:27 crc kubenswrapper[4755]: I0317 02:21:27.729874 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-catalog-content\") pod \"redhat-operators-qzkmq\" (UID: \"f04565e2-494f-4cb0-83f8-ab6dc5c5df95\") " pod="openshift-marketplace/redhat-operators-qzkmq" Mar 17 02:21:27 crc kubenswrapper[4755]: I0317 02:21:27.831927 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-utilities\") pod \"redhat-operators-qzkmq\" (UID: \"f04565e2-494f-4cb0-83f8-ab6dc5c5df95\") " pod="openshift-marketplace/redhat-operators-qzkmq" Mar 17 02:21:27 crc kubenswrapper[4755]: I0317 02:21:27.832010 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-catalog-content\") pod \"redhat-operators-qzkmq\" (UID: \"f04565e2-494f-4cb0-83f8-ab6dc5c5df95\") " pod="openshift-marketplace/redhat-operators-qzkmq" Mar 17 02:21:27 crc kubenswrapper[4755]: I0317 02:21:27.832344 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t2bw\" (UniqueName: \"kubernetes.io/projected/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-kube-api-access-4t2bw\") pod \"redhat-operators-qzkmq\" (UID: \"f04565e2-494f-4cb0-83f8-ab6dc5c5df95\") " pod="openshift-marketplace/redhat-operators-qzkmq" Mar 17 02:21:27 crc kubenswrapper[4755]: I0317 02:21:27.832837 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-catalog-content\") pod \"redhat-operators-qzkmq\" (UID: 
\"f04565e2-494f-4cb0-83f8-ab6dc5c5df95\") " pod="openshift-marketplace/redhat-operators-qzkmq" Mar 17 02:21:27 crc kubenswrapper[4755]: I0317 02:21:27.832833 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-utilities\") pod \"redhat-operators-qzkmq\" (UID: \"f04565e2-494f-4cb0-83f8-ab6dc5c5df95\") " pod="openshift-marketplace/redhat-operators-qzkmq" Mar 17 02:21:27 crc kubenswrapper[4755]: I0317 02:21:27.860531 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t2bw\" (UniqueName: \"kubernetes.io/projected/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-kube-api-access-4t2bw\") pod \"redhat-operators-qzkmq\" (UID: \"f04565e2-494f-4cb0-83f8-ab6dc5c5df95\") " pod="openshift-marketplace/redhat-operators-qzkmq" Mar 17 02:21:27 crc kubenswrapper[4755]: I0317 02:21:27.959682 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qzkmq" Mar 17 02:21:28 crc kubenswrapper[4755]: I0317 02:21:28.459320 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qzkmq"] Mar 17 02:21:28 crc kubenswrapper[4755]: I0317 02:21:28.665759 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:21:28 crc kubenswrapper[4755]: I0317 02:21:28.666092 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:21:29 crc 
kubenswrapper[4755]: I0317 02:21:29.464873 4755 generic.go:334] "Generic (PLEG): container finished" podID="f04565e2-494f-4cb0-83f8-ab6dc5c5df95" containerID="3ec12b3e8f58176a8e44d362d7f6bf831e2ad70121d2bc2073f73f8b2262528b" exitCode=0 Mar 17 02:21:29 crc kubenswrapper[4755]: I0317 02:21:29.464950 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzkmq" event={"ID":"f04565e2-494f-4cb0-83f8-ab6dc5c5df95","Type":"ContainerDied","Data":"3ec12b3e8f58176a8e44d362d7f6bf831e2ad70121d2bc2073f73f8b2262528b"} Mar 17 02:21:29 crc kubenswrapper[4755]: I0317 02:21:29.465226 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzkmq" event={"ID":"f04565e2-494f-4cb0-83f8-ab6dc5c5df95","Type":"ContainerStarted","Data":"74aadbf916224f44a77f9fb31643d1696fe485f6559e67094815f2bdd390ff22"} Mar 17 02:21:30 crc kubenswrapper[4755]: I0317 02:21:30.478825 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzkmq" event={"ID":"f04565e2-494f-4cb0-83f8-ab6dc5c5df95","Type":"ContainerStarted","Data":"13795d7175a9e5512ed1d252fb9ee44af78b3a8a18b7f7989071d1599da07f14"} Mar 17 02:21:35 crc kubenswrapper[4755]: I0317 02:21:35.548920 4755 generic.go:334] "Generic (PLEG): container finished" podID="f04565e2-494f-4cb0-83f8-ab6dc5c5df95" containerID="13795d7175a9e5512ed1d252fb9ee44af78b3a8a18b7f7989071d1599da07f14" exitCode=0 Mar 17 02:21:35 crc kubenswrapper[4755]: I0317 02:21:35.548993 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzkmq" event={"ID":"f04565e2-494f-4cb0-83f8-ab6dc5c5df95","Type":"ContainerDied","Data":"13795d7175a9e5512ed1d252fb9ee44af78b3a8a18b7f7989071d1599da07f14"} Mar 17 02:21:36 crc kubenswrapper[4755]: I0317 02:21:36.561623 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzkmq" 
event={"ID":"f04565e2-494f-4cb0-83f8-ab6dc5c5df95","Type":"ContainerStarted","Data":"a5b6ae8ebdcaa6c5bbffcb5090a86f33002d83eb0ab0f2556baff49bc3a62bdb"} Mar 17 02:21:36 crc kubenswrapper[4755]: I0317 02:21:36.586360 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qzkmq" podStartSLOduration=3.113374584 podStartE2EDuration="9.586344866s" podCreationTimestamp="2026-03-17 02:21:27 +0000 UTC" firstStartedPulling="2026-03-17 02:21:29.467284774 +0000 UTC m=+7164.226737057" lastFinishedPulling="2026-03-17 02:21:35.940255056 +0000 UTC m=+7170.699707339" observedRunningTime="2026-03-17 02:21:36.583841758 +0000 UTC m=+7171.343294041" watchObservedRunningTime="2026-03-17 02:21:36.586344866 +0000 UTC m=+7171.345797139" Mar 17 02:21:37 crc kubenswrapper[4755]: I0317 02:21:37.960749 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qzkmq" Mar 17 02:21:37 crc kubenswrapper[4755]: I0317 02:21:37.961362 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qzkmq" Mar 17 02:21:39 crc kubenswrapper[4755]: I0317 02:21:39.048465 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qzkmq" podUID="f04565e2-494f-4cb0-83f8-ab6dc5c5df95" containerName="registry-server" probeResult="failure" output=< Mar 17 02:21:39 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:21:39 crc kubenswrapper[4755]: > Mar 17 02:21:49 crc kubenswrapper[4755]: I0317 02:21:49.005145 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qzkmq" podUID="f04565e2-494f-4cb0-83f8-ab6dc5c5df95" containerName="registry-server" probeResult="failure" output=< Mar 17 02:21:49 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:21:49 crc kubenswrapper[4755]: > Mar 17 
02:21:58 crc kubenswrapper[4755]: I0317 02:21:58.665111 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:21:58 crc kubenswrapper[4755]: I0317 02:21:58.665814 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:21:58 crc kubenswrapper[4755]: I0317 02:21:58.665877 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 02:21:58 crc kubenswrapper[4755]: I0317 02:21:58.667678 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"46bcb3c470e5bd80eee0a58f81aa80080bfaa02fe9e8720b2e5734b89c809018"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:21:58 crc kubenswrapper[4755]: I0317 02:21:58.667793 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://46bcb3c470e5bd80eee0a58f81aa80080bfaa02fe9e8720b2e5734b89c809018" gracePeriod=600 Mar 17 02:21:58 crc kubenswrapper[4755]: I0317 02:21:58.850101 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" 
containerID="46bcb3c470e5bd80eee0a58f81aa80080bfaa02fe9e8720b2e5734b89c809018" exitCode=0 Mar 17 02:21:58 crc kubenswrapper[4755]: I0317 02:21:58.850181 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"46bcb3c470e5bd80eee0a58f81aa80080bfaa02fe9e8720b2e5734b89c809018"} Mar 17 02:21:58 crc kubenswrapper[4755]: I0317 02:21:58.850281 4755 scope.go:117] "RemoveContainer" containerID="5a565ab627a29beb217d89f984d97734753912fb973bcf888ddd746b996cf8f3" Mar 17 02:21:59 crc kubenswrapper[4755]: I0317 02:21:59.037130 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qzkmq" podUID="f04565e2-494f-4cb0-83f8-ab6dc5c5df95" containerName="registry-server" probeResult="failure" output=< Mar 17 02:21:59 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:21:59 crc kubenswrapper[4755]: > Mar 17 02:21:59 crc kubenswrapper[4755]: I0317 02:21:59.863878 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943"} Mar 17 02:22:00 crc kubenswrapper[4755]: I0317 02:22:00.145543 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561902-5t5jh"] Mar 17 02:22:00 crc kubenswrapper[4755]: I0317 02:22:00.147760 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561902-5t5jh" Mar 17 02:22:00 crc kubenswrapper[4755]: I0317 02:22:00.150644 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:22:00 crc kubenswrapper[4755]: I0317 02:22:00.150644 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:22:00 crc kubenswrapper[4755]: I0317 02:22:00.151328 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:22:00 crc kubenswrapper[4755]: I0317 02:22:00.155880 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561902-5t5jh"] Mar 17 02:22:00 crc kubenswrapper[4755]: I0317 02:22:00.336122 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vhvt\" (UniqueName: \"kubernetes.io/projected/e4733fbe-af6f-4e6c-89ea-afef79ac1ad0-kube-api-access-8vhvt\") pod \"auto-csr-approver-29561902-5t5jh\" (UID: \"e4733fbe-af6f-4e6c-89ea-afef79ac1ad0\") " pod="openshift-infra/auto-csr-approver-29561902-5t5jh" Mar 17 02:22:00 crc kubenswrapper[4755]: I0317 02:22:00.438518 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vhvt\" (UniqueName: \"kubernetes.io/projected/e4733fbe-af6f-4e6c-89ea-afef79ac1ad0-kube-api-access-8vhvt\") pod \"auto-csr-approver-29561902-5t5jh\" (UID: \"e4733fbe-af6f-4e6c-89ea-afef79ac1ad0\") " pod="openshift-infra/auto-csr-approver-29561902-5t5jh" Mar 17 02:22:00 crc kubenswrapper[4755]: I0317 02:22:00.473986 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vhvt\" (UniqueName: \"kubernetes.io/projected/e4733fbe-af6f-4e6c-89ea-afef79ac1ad0-kube-api-access-8vhvt\") pod \"auto-csr-approver-29561902-5t5jh\" (UID: \"e4733fbe-af6f-4e6c-89ea-afef79ac1ad0\") " 
pod="openshift-infra/auto-csr-approver-29561902-5t5jh" Mar 17 02:22:00 crc kubenswrapper[4755]: I0317 02:22:00.483227 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561902-5t5jh" Mar 17 02:22:01 crc kubenswrapper[4755]: W0317 02:22:01.050863 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4733fbe_af6f_4e6c_89ea_afef79ac1ad0.slice/crio-ed4a01a7896258aa8be4159c27fe374c4e5c6a21dcb999d5be73327da7ad6e18 WatchSource:0}: Error finding container ed4a01a7896258aa8be4159c27fe374c4e5c6a21dcb999d5be73327da7ad6e18: Status 404 returned error can't find the container with id ed4a01a7896258aa8be4159c27fe374c4e5c6a21dcb999d5be73327da7ad6e18 Mar 17 02:22:01 crc kubenswrapper[4755]: I0317 02:22:01.052481 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561902-5t5jh"] Mar 17 02:22:01 crc kubenswrapper[4755]: I0317 02:22:01.887546 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561902-5t5jh" event={"ID":"e4733fbe-af6f-4e6c-89ea-afef79ac1ad0","Type":"ContainerStarted","Data":"ed4a01a7896258aa8be4159c27fe374c4e5c6a21dcb999d5be73327da7ad6e18"} Mar 17 02:22:02 crc kubenswrapper[4755]: I0317 02:22:02.902875 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561902-5t5jh" event={"ID":"e4733fbe-af6f-4e6c-89ea-afef79ac1ad0","Type":"ContainerStarted","Data":"da4d76c7154954e1faee31e6ea0b1a9a080ad2557ab5e0f2a201dc64e077d522"} Mar 17 02:22:02 crc kubenswrapper[4755]: I0317 02:22:02.925109 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561902-5t5jh" podStartSLOduration=2.042740719 podStartE2EDuration="2.925087102s" podCreationTimestamp="2026-03-17 02:22:00 +0000 UTC" firstStartedPulling="2026-03-17 02:22:01.053278667 +0000 UTC 
m=+7195.812730950" lastFinishedPulling="2026-03-17 02:22:01.93562502 +0000 UTC m=+7196.695077333" observedRunningTime="2026-03-17 02:22:02.918127055 +0000 UTC m=+7197.677579378" watchObservedRunningTime="2026-03-17 02:22:02.925087102 +0000 UTC m=+7197.684539385" Mar 17 02:22:03 crc kubenswrapper[4755]: I0317 02:22:03.926790 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561902-5t5jh" event={"ID":"e4733fbe-af6f-4e6c-89ea-afef79ac1ad0","Type":"ContainerDied","Data":"da4d76c7154954e1faee31e6ea0b1a9a080ad2557ab5e0f2a201dc64e077d522"} Mar 17 02:22:03 crc kubenswrapper[4755]: I0317 02:22:03.926598 4755 generic.go:334] "Generic (PLEG): container finished" podID="e4733fbe-af6f-4e6c-89ea-afef79ac1ad0" containerID="da4d76c7154954e1faee31e6ea0b1a9a080ad2557ab5e0f2a201dc64e077d522" exitCode=0 Mar 17 02:22:05 crc kubenswrapper[4755]: I0317 02:22:05.428367 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561902-5t5jh" Mar 17 02:22:05 crc kubenswrapper[4755]: I0317 02:22:05.489537 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vhvt\" (UniqueName: \"kubernetes.io/projected/e4733fbe-af6f-4e6c-89ea-afef79ac1ad0-kube-api-access-8vhvt\") pod \"e4733fbe-af6f-4e6c-89ea-afef79ac1ad0\" (UID: \"e4733fbe-af6f-4e6c-89ea-afef79ac1ad0\") " Mar 17 02:22:05 crc kubenswrapper[4755]: I0317 02:22:05.499501 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4733fbe-af6f-4e6c-89ea-afef79ac1ad0-kube-api-access-8vhvt" (OuterVolumeSpecName: "kube-api-access-8vhvt") pod "e4733fbe-af6f-4e6c-89ea-afef79ac1ad0" (UID: "e4733fbe-af6f-4e6c-89ea-afef79ac1ad0"). InnerVolumeSpecName "kube-api-access-8vhvt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:22:05 crc kubenswrapper[4755]: I0317 02:22:05.594195 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vhvt\" (UniqueName: \"kubernetes.io/projected/e4733fbe-af6f-4e6c-89ea-afef79ac1ad0-kube-api-access-8vhvt\") on node \"crc\" DevicePath \"\"" Mar 17 02:22:05 crc kubenswrapper[4755]: I0317 02:22:05.952550 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561902-5t5jh" event={"ID":"e4733fbe-af6f-4e6c-89ea-afef79ac1ad0","Type":"ContainerDied","Data":"ed4a01a7896258aa8be4159c27fe374c4e5c6a21dcb999d5be73327da7ad6e18"} Mar 17 02:22:05 crc kubenswrapper[4755]: I0317 02:22:05.952600 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed4a01a7896258aa8be4159c27fe374c4e5c6a21dcb999d5be73327da7ad6e18" Mar 17 02:22:05 crc kubenswrapper[4755]: I0317 02:22:05.952602 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561902-5t5jh" Mar 17 02:22:06 crc kubenswrapper[4755]: I0317 02:22:06.007549 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561896-z6wmk"] Mar 17 02:22:06 crc kubenswrapper[4755]: I0317 02:22:06.018967 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561896-z6wmk"] Mar 17 02:22:06 crc kubenswrapper[4755]: I0317 02:22:06.268659 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1148c0-e5c9-4aab-b273-91bbdfeb062b" path="/var/lib/kubelet/pods/af1148c0-e5c9-4aab-b273-91bbdfeb062b/volumes" Mar 17 02:22:08 crc kubenswrapper[4755]: I0317 02:22:08.051358 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qzkmq" Mar 17 02:22:08 crc kubenswrapper[4755]: I0317 02:22:08.128121 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-qzkmq" Mar 17 02:22:08 crc kubenswrapper[4755]: I0317 02:22:08.311394 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qzkmq"] Mar 17 02:22:10 crc kubenswrapper[4755]: I0317 02:22:10.024237 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qzkmq" podUID="f04565e2-494f-4cb0-83f8-ab6dc5c5df95" containerName="registry-server" containerID="cri-o://a5b6ae8ebdcaa6c5bbffcb5090a86f33002d83eb0ab0f2556baff49bc3a62bdb" gracePeriod=2 Mar 17 02:22:10 crc kubenswrapper[4755]: I0317 02:22:10.762588 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qzkmq" Mar 17 02:22:10 crc kubenswrapper[4755]: I0317 02:22:10.846783 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t2bw\" (UniqueName: \"kubernetes.io/projected/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-kube-api-access-4t2bw\") pod \"f04565e2-494f-4cb0-83f8-ab6dc5c5df95\" (UID: \"f04565e2-494f-4cb0-83f8-ab6dc5c5df95\") " Mar 17 02:22:10 crc kubenswrapper[4755]: I0317 02:22:10.847024 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-catalog-content\") pod \"f04565e2-494f-4cb0-83f8-ab6dc5c5df95\" (UID: \"f04565e2-494f-4cb0-83f8-ab6dc5c5df95\") " Mar 17 02:22:10 crc kubenswrapper[4755]: I0317 02:22:10.847106 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-utilities\") pod \"f04565e2-494f-4cb0-83f8-ab6dc5c5df95\" (UID: \"f04565e2-494f-4cb0-83f8-ab6dc5c5df95\") " Mar 17 02:22:10 crc kubenswrapper[4755]: I0317 02:22:10.851697 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-utilities" (OuterVolumeSpecName: "utilities") pod "f04565e2-494f-4cb0-83f8-ab6dc5c5df95" (UID: "f04565e2-494f-4cb0-83f8-ab6dc5c5df95"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:22:10 crc kubenswrapper[4755]: I0317 02:22:10.861816 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-kube-api-access-4t2bw" (OuterVolumeSpecName: "kube-api-access-4t2bw") pod "f04565e2-494f-4cb0-83f8-ab6dc5c5df95" (UID: "f04565e2-494f-4cb0-83f8-ab6dc5c5df95"). InnerVolumeSpecName "kube-api-access-4t2bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:22:10 crc kubenswrapper[4755]: I0317 02:22:10.951062 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:22:10 crc kubenswrapper[4755]: I0317 02:22:10.951098 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t2bw\" (UniqueName: \"kubernetes.io/projected/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-kube-api-access-4t2bw\") on node \"crc\" DevicePath \"\"" Mar 17 02:22:11 crc kubenswrapper[4755]: I0317 02:22:11.034377 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f04565e2-494f-4cb0-83f8-ab6dc5c5df95" (UID: "f04565e2-494f-4cb0-83f8-ab6dc5c5df95"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:22:11 crc kubenswrapper[4755]: I0317 02:22:11.036736 4755 generic.go:334] "Generic (PLEG): container finished" podID="f04565e2-494f-4cb0-83f8-ab6dc5c5df95" containerID="a5b6ae8ebdcaa6c5bbffcb5090a86f33002d83eb0ab0f2556baff49bc3a62bdb" exitCode=0 Mar 17 02:22:11 crc kubenswrapper[4755]: I0317 02:22:11.036787 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzkmq" event={"ID":"f04565e2-494f-4cb0-83f8-ab6dc5c5df95","Type":"ContainerDied","Data":"a5b6ae8ebdcaa6c5bbffcb5090a86f33002d83eb0ab0f2556baff49bc3a62bdb"} Mar 17 02:22:11 crc kubenswrapper[4755]: I0317 02:22:11.036841 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qzkmq" Mar 17 02:22:11 crc kubenswrapper[4755]: I0317 02:22:11.037151 4755 scope.go:117] "RemoveContainer" containerID="a5b6ae8ebdcaa6c5bbffcb5090a86f33002d83eb0ab0f2556baff49bc3a62bdb" Mar 17 02:22:11 crc kubenswrapper[4755]: I0317 02:22:11.037060 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzkmq" event={"ID":"f04565e2-494f-4cb0-83f8-ab6dc5c5df95","Type":"ContainerDied","Data":"74aadbf916224f44a77f9fb31643d1696fe485f6559e67094815f2bdd390ff22"} Mar 17 02:22:11 crc kubenswrapper[4755]: I0317 02:22:11.053033 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04565e2-494f-4cb0-83f8-ab6dc5c5df95-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:22:11 crc kubenswrapper[4755]: I0317 02:22:11.081649 4755 scope.go:117] "RemoveContainer" containerID="13795d7175a9e5512ed1d252fb9ee44af78b3a8a18b7f7989071d1599da07f14" Mar 17 02:22:11 crc kubenswrapper[4755]: I0317 02:22:11.093656 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qzkmq"] Mar 17 02:22:11 crc kubenswrapper[4755]: I0317 
02:22:11.107347 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qzkmq"] Mar 17 02:22:11 crc kubenswrapper[4755]: I0317 02:22:11.128332 4755 scope.go:117] "RemoveContainer" containerID="3ec12b3e8f58176a8e44d362d7f6bf831e2ad70121d2bc2073f73f8b2262528b" Mar 17 02:22:11 crc kubenswrapper[4755]: I0317 02:22:11.194481 4755 scope.go:117] "RemoveContainer" containerID="a5b6ae8ebdcaa6c5bbffcb5090a86f33002d83eb0ab0f2556baff49bc3a62bdb" Mar 17 02:22:11 crc kubenswrapper[4755]: E0317 02:22:11.195072 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5b6ae8ebdcaa6c5bbffcb5090a86f33002d83eb0ab0f2556baff49bc3a62bdb\": container with ID starting with a5b6ae8ebdcaa6c5bbffcb5090a86f33002d83eb0ab0f2556baff49bc3a62bdb not found: ID does not exist" containerID="a5b6ae8ebdcaa6c5bbffcb5090a86f33002d83eb0ab0f2556baff49bc3a62bdb" Mar 17 02:22:11 crc kubenswrapper[4755]: I0317 02:22:11.195171 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b6ae8ebdcaa6c5bbffcb5090a86f33002d83eb0ab0f2556baff49bc3a62bdb"} err="failed to get container status \"a5b6ae8ebdcaa6c5bbffcb5090a86f33002d83eb0ab0f2556baff49bc3a62bdb\": rpc error: code = NotFound desc = could not find container \"a5b6ae8ebdcaa6c5bbffcb5090a86f33002d83eb0ab0f2556baff49bc3a62bdb\": container with ID starting with a5b6ae8ebdcaa6c5bbffcb5090a86f33002d83eb0ab0f2556baff49bc3a62bdb not found: ID does not exist" Mar 17 02:22:11 crc kubenswrapper[4755]: I0317 02:22:11.195215 4755 scope.go:117] "RemoveContainer" containerID="13795d7175a9e5512ed1d252fb9ee44af78b3a8a18b7f7989071d1599da07f14" Mar 17 02:22:11 crc kubenswrapper[4755]: E0317 02:22:11.195720 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13795d7175a9e5512ed1d252fb9ee44af78b3a8a18b7f7989071d1599da07f14\": container with ID 
starting with 13795d7175a9e5512ed1d252fb9ee44af78b3a8a18b7f7989071d1599da07f14 not found: ID does not exist" containerID="13795d7175a9e5512ed1d252fb9ee44af78b3a8a18b7f7989071d1599da07f14" Mar 17 02:22:11 crc kubenswrapper[4755]: I0317 02:22:11.195843 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13795d7175a9e5512ed1d252fb9ee44af78b3a8a18b7f7989071d1599da07f14"} err="failed to get container status \"13795d7175a9e5512ed1d252fb9ee44af78b3a8a18b7f7989071d1599da07f14\": rpc error: code = NotFound desc = could not find container \"13795d7175a9e5512ed1d252fb9ee44af78b3a8a18b7f7989071d1599da07f14\": container with ID starting with 13795d7175a9e5512ed1d252fb9ee44af78b3a8a18b7f7989071d1599da07f14 not found: ID does not exist" Mar 17 02:22:11 crc kubenswrapper[4755]: I0317 02:22:11.195959 4755 scope.go:117] "RemoveContainer" containerID="3ec12b3e8f58176a8e44d362d7f6bf831e2ad70121d2bc2073f73f8b2262528b" Mar 17 02:22:11 crc kubenswrapper[4755]: E0317 02:22:11.196631 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ec12b3e8f58176a8e44d362d7f6bf831e2ad70121d2bc2073f73f8b2262528b\": container with ID starting with 3ec12b3e8f58176a8e44d362d7f6bf831e2ad70121d2bc2073f73f8b2262528b not found: ID does not exist" containerID="3ec12b3e8f58176a8e44d362d7f6bf831e2ad70121d2bc2073f73f8b2262528b" Mar 17 02:22:11 crc kubenswrapper[4755]: I0317 02:22:11.196669 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ec12b3e8f58176a8e44d362d7f6bf831e2ad70121d2bc2073f73f8b2262528b"} err="failed to get container status \"3ec12b3e8f58176a8e44d362d7f6bf831e2ad70121d2bc2073f73f8b2262528b\": rpc error: code = NotFound desc = could not find container \"3ec12b3e8f58176a8e44d362d7f6bf831e2ad70121d2bc2073f73f8b2262528b\": container with ID starting with 3ec12b3e8f58176a8e44d362d7f6bf831e2ad70121d2bc2073f73f8b2262528b not found: 
ID does not exist" Mar 17 02:22:12 crc kubenswrapper[4755]: I0317 02:22:12.266631 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f04565e2-494f-4cb0-83f8-ab6dc5c5df95" path="/var/lib/kubelet/pods/f04565e2-494f-4cb0-83f8-ab6dc5c5df95/volumes" Mar 17 02:22:20 crc kubenswrapper[4755]: I0317 02:22:20.642557 4755 scope.go:117] "RemoveContainer" containerID="e21bdc07bd33d28639ecd6680f7f302cf666a91c88b3b9f96f8550df6d06ae90" Mar 17 02:22:53 crc kubenswrapper[4755]: I0317 02:22:53.585509 4755 generic.go:334] "Generic (PLEG): container finished" podID="d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d" containerID="db20154150d812ff9e979bf4b4105a2a23f652443d68ed255fa3cfc56e3220a9" exitCode=0 Mar 17 02:22:53 crc kubenswrapper[4755]: I0317 02:22:53.585671 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d","Type":"ContainerDied","Data":"db20154150d812ff9e979bf4b4105a2a23f652443d68ed255fa3cfc56e3220a9"} Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.119909 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.245972 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-ca-certs\") pod \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.246011 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-openstack-config-secret\") pod \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.246146 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-test-operator-ephemeral-temporary\") pod \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.246241 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.246296 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-test-operator-ephemeral-workdir\") pod \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.246334 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-ssh-key\") pod \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.246362 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwxzg\" (UniqueName: \"kubernetes.io/projected/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-kube-api-access-zwxzg\") pod \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.247013 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d" (UID: "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.247110 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-openstack-config\") pod \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.247166 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-config-data\") pod \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\" (UID: \"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d\") " Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.248751 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-config-data" (OuterVolumeSpecName: "config-data") pod "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d" (UID: "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.248881 4755 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.255188 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d" (UID: "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.256003 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d" (UID: "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.256245 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-kube-api-access-zwxzg" (OuterVolumeSpecName: "kube-api-access-zwxzg") pod "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d" (UID: "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d"). InnerVolumeSpecName "kube-api-access-zwxzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.283397 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d" (UID: "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.286622 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d" (UID: "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.299942 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d" (UID: "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.320209 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d" (UID: "d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.351683 4755 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.351763 4755 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.351784 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwxzg\" (UniqueName: \"kubernetes.io/projected/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-kube-api-access-zwxzg\") on node \"crc\" DevicePath \"\"" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.351804 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-openstack-config\") on node 
\"crc\" DevicePath \"\"" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.351820 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.351835 4755 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.351852 4755 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.352611 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.380171 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.454548 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.614613 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d","Type":"ContainerDied","Data":"215a855e3220f481f185e5b5417ab2f0954c5853d9fe6ef7b7fc477c8b3acc07"} Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.614653 4755 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="215a855e3220f481f185e5b5417ab2f0954c5853d9fe6ef7b7fc477c8b3acc07" Mar 17 02:22:55 crc kubenswrapper[4755]: I0317 02:22:55.614710 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.151888 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 17 02:23:04 crc kubenswrapper[4755]: E0317 02:23:04.153007 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4733fbe-af6f-4e6c-89ea-afef79ac1ad0" containerName="oc" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.153023 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4733fbe-af6f-4e6c-89ea-afef79ac1ad0" containerName="oc" Mar 17 02:23:04 crc kubenswrapper[4755]: E0317 02:23:04.153034 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04565e2-494f-4cb0-83f8-ab6dc5c5df95" containerName="extract-utilities" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.153043 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04565e2-494f-4cb0-83f8-ab6dc5c5df95" containerName="extract-utilities" Mar 17 02:23:04 crc kubenswrapper[4755]: E0317 02:23:04.153061 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04565e2-494f-4cb0-83f8-ab6dc5c5df95" containerName="extract-content" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.153073 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04565e2-494f-4cb0-83f8-ab6dc5c5df95" containerName="extract-content" Mar 17 02:23:04 crc kubenswrapper[4755]: E0317 02:23:04.153096 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04565e2-494f-4cb0-83f8-ab6dc5c5df95" containerName="registry-server" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.153106 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04565e2-494f-4cb0-83f8-ab6dc5c5df95" 
containerName="registry-server" Mar 17 02:23:04 crc kubenswrapper[4755]: E0317 02:23:04.153145 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d" containerName="tempest-tests-tempest-tests-runner" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.153153 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d" containerName="tempest-tests-tempest-tests-runner" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.153409 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4733fbe-af6f-4e6c-89ea-afef79ac1ad0" containerName="oc" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.153426 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04565e2-494f-4cb0-83f8-ab6dc5c5df95" containerName="registry-server" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.153471 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d" containerName="tempest-tests-tempest-tests-runner" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.154334 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.157661 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-lcx6m" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.172680 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.277100 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8pts\" (UniqueName: \"kubernetes.io/projected/b612dcdd-df72-4c24-827f-44d916531556-kube-api-access-z8pts\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b612dcdd-df72-4c24-827f-44d916531556\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.277232 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b612dcdd-df72-4c24-827f-44d916531556\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.379514 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b612dcdd-df72-4c24-827f-44d916531556\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.380217 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8pts\" (UniqueName: 
\"kubernetes.io/projected/b612dcdd-df72-4c24-827f-44d916531556-kube-api-access-z8pts\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b612dcdd-df72-4c24-827f-44d916531556\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.381547 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b612dcdd-df72-4c24-827f-44d916531556\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.414845 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8pts\" (UniqueName: \"kubernetes.io/projected/b612dcdd-df72-4c24-827f-44d916531556-kube-api-access-z8pts\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b612dcdd-df72-4c24-827f-44d916531556\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.425306 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b612dcdd-df72-4c24-827f-44d916531556\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 02:23:04 crc kubenswrapper[4755]: I0317 02:23:04.489804 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 02:23:05 crc kubenswrapper[4755]: I0317 02:23:05.063424 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 17 02:23:05 crc kubenswrapper[4755]: I0317 02:23:05.768001 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"b612dcdd-df72-4c24-827f-44d916531556","Type":"ContainerStarted","Data":"bf320f9fc12e0705e7d785ea0262ca11d22e69383a980433c84734260ccf93de"} Mar 17 02:23:06 crc kubenswrapper[4755]: I0317 02:23:06.780590 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"b612dcdd-df72-4c24-827f-44d916531556","Type":"ContainerStarted","Data":"f399f93ad9871c33c45569b251a806d4f0aa219d922f2c29f5952ce89faab421"} Mar 17 02:23:33 crc kubenswrapper[4755]: I0317 02:23:33.927885 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=28.844335585 podStartE2EDuration="29.927866165s" podCreationTimestamp="2026-03-17 02:23:04 +0000 UTC" firstStartedPulling="2026-03-17 02:23:05.069193868 +0000 UTC m=+7259.828646181" lastFinishedPulling="2026-03-17 02:23:06.152724478 +0000 UTC m=+7260.912176761" observedRunningTime="2026-03-17 02:23:06.802775804 +0000 UTC m=+7261.562228097" watchObservedRunningTime="2026-03-17 02:23:33.927866165 +0000 UTC m=+7288.687318448" Mar 17 02:23:33 crc kubenswrapper[4755]: I0317 02:23:33.929899 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n8w6v/must-gather-m5qx9"] Mar 17 02:23:33 crc kubenswrapper[4755]: I0317 02:23:33.933714 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n8w6v/must-gather-m5qx9" Mar 17 02:23:33 crc kubenswrapper[4755]: I0317 02:23:33.938014 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n8w6v"/"kube-root-ca.crt" Mar 17 02:23:33 crc kubenswrapper[4755]: I0317 02:23:33.938242 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n8w6v"/"openshift-service-ca.crt" Mar 17 02:23:33 crc kubenswrapper[4755]: I0317 02:23:33.938360 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-n8w6v"/"default-dockercfg-p9xxv" Mar 17 02:23:33 crc kubenswrapper[4755]: I0317 02:23:33.996414 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q7tr\" (UniqueName: \"kubernetes.io/projected/e13d9991-5993-4f00-9918-5ec7ad366a9f-kube-api-access-2q7tr\") pod \"must-gather-m5qx9\" (UID: \"e13d9991-5993-4f00-9918-5ec7ad366a9f\") " pod="openshift-must-gather-n8w6v/must-gather-m5qx9" Mar 17 02:23:33 crc kubenswrapper[4755]: I0317 02:23:33.996522 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e13d9991-5993-4f00-9918-5ec7ad366a9f-must-gather-output\") pod \"must-gather-m5qx9\" (UID: \"e13d9991-5993-4f00-9918-5ec7ad366a9f\") " pod="openshift-must-gather-n8w6v/must-gather-m5qx9" Mar 17 02:23:34 crc kubenswrapper[4755]: I0317 02:23:34.098170 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e13d9991-5993-4f00-9918-5ec7ad366a9f-must-gather-output\") pod \"must-gather-m5qx9\" (UID: \"e13d9991-5993-4f00-9918-5ec7ad366a9f\") " pod="openshift-must-gather-n8w6v/must-gather-m5qx9" Mar 17 02:23:34 crc kubenswrapper[4755]: I0317 02:23:34.098479 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-2q7tr\" (UniqueName: \"kubernetes.io/projected/e13d9991-5993-4f00-9918-5ec7ad366a9f-kube-api-access-2q7tr\") pod \"must-gather-m5qx9\" (UID: \"e13d9991-5993-4f00-9918-5ec7ad366a9f\") " pod="openshift-must-gather-n8w6v/must-gather-m5qx9" Mar 17 02:23:34 crc kubenswrapper[4755]: I0317 02:23:34.098681 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e13d9991-5993-4f00-9918-5ec7ad366a9f-must-gather-output\") pod \"must-gather-m5qx9\" (UID: \"e13d9991-5993-4f00-9918-5ec7ad366a9f\") " pod="openshift-must-gather-n8w6v/must-gather-m5qx9" Mar 17 02:23:34 crc kubenswrapper[4755]: I0317 02:23:34.120099 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q7tr\" (UniqueName: \"kubernetes.io/projected/e13d9991-5993-4f00-9918-5ec7ad366a9f-kube-api-access-2q7tr\") pod \"must-gather-m5qx9\" (UID: \"e13d9991-5993-4f00-9918-5ec7ad366a9f\") " pod="openshift-must-gather-n8w6v/must-gather-m5qx9" Mar 17 02:23:34 crc kubenswrapper[4755]: I0317 02:23:34.133244 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n8w6v/must-gather-m5qx9"] Mar 17 02:23:34 crc kubenswrapper[4755]: I0317 02:23:34.256298 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n8w6v/must-gather-m5qx9" Mar 17 02:23:34 crc kubenswrapper[4755]: I0317 02:23:34.787527 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n8w6v/must-gather-m5qx9"] Mar 17 02:23:35 crc kubenswrapper[4755]: I0317 02:23:35.191963 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n8w6v/must-gather-m5qx9" event={"ID":"e13d9991-5993-4f00-9918-5ec7ad366a9f","Type":"ContainerStarted","Data":"a9e5ccbcb0b7f43fb144c891423681e73d56e5b48e0057b585c650e4cc93154c"} Mar 17 02:23:42 crc kubenswrapper[4755]: I0317 02:23:42.277886 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n8w6v/must-gather-m5qx9" event={"ID":"e13d9991-5993-4f00-9918-5ec7ad366a9f","Type":"ContainerStarted","Data":"eac08d5f39ffe41a82694570c2964058d5e5e6a6d3864bc60b6fd305ab1bc1ec"} Mar 17 02:23:43 crc kubenswrapper[4755]: I0317 02:23:43.292308 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n8w6v/must-gather-m5qx9" event={"ID":"e13d9991-5993-4f00-9918-5ec7ad366a9f","Type":"ContainerStarted","Data":"ab28e43539fc149171fd63bde026497e6cdc722455cedcce9a5fd5ad1e821fe5"} Mar 17 02:23:43 crc kubenswrapper[4755]: I0317 02:23:43.330616 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n8w6v/must-gather-m5qx9" podStartSLOduration=3.269774183 podStartE2EDuration="10.330598802s" podCreationTimestamp="2026-03-17 02:23:33 +0000 UTC" firstStartedPulling="2026-03-17 02:23:34.790146007 +0000 UTC m=+7289.549598300" lastFinishedPulling="2026-03-17 02:23:41.850970626 +0000 UTC m=+7296.610422919" observedRunningTime="2026-03-17 02:23:43.322189125 +0000 UTC m=+7298.081641408" watchObservedRunningTime="2026-03-17 02:23:43.330598802 +0000 UTC m=+7298.090051085" Mar 17 02:23:49 crc kubenswrapper[4755]: I0317 02:23:49.972037 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-n8w6v/crc-debug-hng84"] Mar 17 02:23:49 crc kubenswrapper[4755]: I0317 02:23:49.974195 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n8w6v/crc-debug-hng84" Mar 17 02:23:50 crc kubenswrapper[4755]: I0317 02:23:50.001271 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e93408b7-a465-4ade-a595-58e4b83ae3b9-host\") pod \"crc-debug-hng84\" (UID: \"e93408b7-a465-4ade-a595-58e4b83ae3b9\") " pod="openshift-must-gather-n8w6v/crc-debug-hng84" Mar 17 02:23:50 crc kubenswrapper[4755]: I0317 02:23:50.001432 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r62vg\" (UniqueName: \"kubernetes.io/projected/e93408b7-a465-4ade-a595-58e4b83ae3b9-kube-api-access-r62vg\") pod \"crc-debug-hng84\" (UID: \"e93408b7-a465-4ade-a595-58e4b83ae3b9\") " pod="openshift-must-gather-n8w6v/crc-debug-hng84" Mar 17 02:23:50 crc kubenswrapper[4755]: I0317 02:23:50.104496 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r62vg\" (UniqueName: \"kubernetes.io/projected/e93408b7-a465-4ade-a595-58e4b83ae3b9-kube-api-access-r62vg\") pod \"crc-debug-hng84\" (UID: \"e93408b7-a465-4ade-a595-58e4b83ae3b9\") " pod="openshift-must-gather-n8w6v/crc-debug-hng84" Mar 17 02:23:50 crc kubenswrapper[4755]: I0317 02:23:50.104660 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e93408b7-a465-4ade-a595-58e4b83ae3b9-host\") pod \"crc-debug-hng84\" (UID: \"e93408b7-a465-4ade-a595-58e4b83ae3b9\") " pod="openshift-must-gather-n8w6v/crc-debug-hng84" Mar 17 02:23:50 crc kubenswrapper[4755]: I0317 02:23:50.106835 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/e93408b7-a465-4ade-a595-58e4b83ae3b9-host\") pod \"crc-debug-hng84\" (UID: \"e93408b7-a465-4ade-a595-58e4b83ae3b9\") " pod="openshift-must-gather-n8w6v/crc-debug-hng84" Mar 17 02:23:50 crc kubenswrapper[4755]: I0317 02:23:50.138408 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r62vg\" (UniqueName: \"kubernetes.io/projected/e93408b7-a465-4ade-a595-58e4b83ae3b9-kube-api-access-r62vg\") pod \"crc-debug-hng84\" (UID: \"e93408b7-a465-4ade-a595-58e4b83ae3b9\") " pod="openshift-must-gather-n8w6v/crc-debug-hng84" Mar 17 02:23:50 crc kubenswrapper[4755]: I0317 02:23:50.308489 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n8w6v/crc-debug-hng84" Mar 17 02:23:50 crc kubenswrapper[4755]: I0317 02:23:50.369358 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n8w6v/crc-debug-hng84" event={"ID":"e93408b7-a465-4ade-a595-58e4b83ae3b9","Type":"ContainerStarted","Data":"229086abc4015f860bc9ac72836ddff48e199f9ec803792ee5c438986e85b85d"} Mar 17 02:24:00 crc kubenswrapper[4755]: I0317 02:24:00.139797 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561904-859gk"] Mar 17 02:24:00 crc kubenswrapper[4755]: I0317 02:24:00.141673 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561904-859gk" Mar 17 02:24:00 crc kubenswrapper[4755]: I0317 02:24:00.144455 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:24:00 crc kubenswrapper[4755]: I0317 02:24:00.144697 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:24:00 crc kubenswrapper[4755]: I0317 02:24:00.144824 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:24:00 crc kubenswrapper[4755]: I0317 02:24:00.163665 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561904-859gk"] Mar 17 02:24:00 crc kubenswrapper[4755]: I0317 02:24:00.259868 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gbj4\" (UniqueName: \"kubernetes.io/projected/3a322944-18dc-41dc-9b4b-115d30dd0fc7-kube-api-access-4gbj4\") pod \"auto-csr-approver-29561904-859gk\" (UID: \"3a322944-18dc-41dc-9b4b-115d30dd0fc7\") " pod="openshift-infra/auto-csr-approver-29561904-859gk" Mar 17 02:24:00 crc kubenswrapper[4755]: I0317 02:24:00.362417 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gbj4\" (UniqueName: \"kubernetes.io/projected/3a322944-18dc-41dc-9b4b-115d30dd0fc7-kube-api-access-4gbj4\") pod \"auto-csr-approver-29561904-859gk\" (UID: \"3a322944-18dc-41dc-9b4b-115d30dd0fc7\") " pod="openshift-infra/auto-csr-approver-29561904-859gk" Mar 17 02:24:00 crc kubenswrapper[4755]: I0317 02:24:00.390236 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gbj4\" (UniqueName: \"kubernetes.io/projected/3a322944-18dc-41dc-9b4b-115d30dd0fc7-kube-api-access-4gbj4\") pod \"auto-csr-approver-29561904-859gk\" (UID: \"3a322944-18dc-41dc-9b4b-115d30dd0fc7\") " 
pod="openshift-infra/auto-csr-approver-29561904-859gk" Mar 17 02:24:00 crc kubenswrapper[4755]: I0317 02:24:00.464732 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561904-859gk" Mar 17 02:24:03 crc kubenswrapper[4755]: I0317 02:24:03.494167 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n8w6v/crc-debug-hng84" event={"ID":"e93408b7-a465-4ade-a595-58e4b83ae3b9","Type":"ContainerStarted","Data":"f86872095970fd4975c81cd671468253c1c7665556cfe904f349ed7ee92c6325"} Mar 17 02:24:03 crc kubenswrapper[4755]: I0317 02:24:03.508500 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561904-859gk"] Mar 17 02:24:03 crc kubenswrapper[4755]: I0317 02:24:03.522114 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n8w6v/crc-debug-hng84" podStartSLOduration=1.882163239 podStartE2EDuration="14.522099275s" podCreationTimestamp="2026-03-17 02:23:49 +0000 UTC" firstStartedPulling="2026-03-17 02:23:50.348812071 +0000 UTC m=+7305.108264354" lastFinishedPulling="2026-03-17 02:24:02.988748117 +0000 UTC m=+7317.748200390" observedRunningTime="2026-03-17 02:24:03.51155602 +0000 UTC m=+7318.271008303" watchObservedRunningTime="2026-03-17 02:24:03.522099275 +0000 UTC m=+7318.281551558" Mar 17 02:24:03 crc kubenswrapper[4755]: W0317 02:24:03.526423 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a322944_18dc_41dc_9b4b_115d30dd0fc7.slice/crio-b56ffc876cd41d6d9a1f2afaa779f6d718f4f6e0e713f9a2701ec7ffabeca3a0 WatchSource:0}: Error finding container b56ffc876cd41d6d9a1f2afaa779f6d718f4f6e0e713f9a2701ec7ffabeca3a0: Status 404 returned error can't find the container with id b56ffc876cd41d6d9a1f2afaa779f6d718f4f6e0e713f9a2701ec7ffabeca3a0 Mar 17 02:24:04 crc kubenswrapper[4755]: I0317 02:24:04.507715 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561904-859gk" event={"ID":"3a322944-18dc-41dc-9b4b-115d30dd0fc7","Type":"ContainerStarted","Data":"b56ffc876cd41d6d9a1f2afaa779f6d718f4f6e0e713f9a2701ec7ffabeca3a0"} Mar 17 02:24:05 crc kubenswrapper[4755]: I0317 02:24:05.572133 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561904-859gk" event={"ID":"3a322944-18dc-41dc-9b4b-115d30dd0fc7","Type":"ContainerStarted","Data":"4a6d4d43229f1d3661f835d3d5494da2d6f45954775efa3466491f7663e478e2"} Mar 17 02:24:05 crc kubenswrapper[4755]: I0317 02:24:05.630586 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561904-859gk" podStartSLOduration=4.581250978 podStartE2EDuration="5.630569525s" podCreationTimestamp="2026-03-17 02:24:00 +0000 UTC" firstStartedPulling="2026-03-17 02:24:03.53301985 +0000 UTC m=+7318.292472133" lastFinishedPulling="2026-03-17 02:24:04.582338377 +0000 UTC m=+7319.341790680" observedRunningTime="2026-03-17 02:24:05.620175894 +0000 UTC m=+7320.379628177" watchObservedRunningTime="2026-03-17 02:24:05.630569525 +0000 UTC m=+7320.390021808" Mar 17 02:24:06 crc kubenswrapper[4755]: I0317 02:24:06.583348 4755 generic.go:334] "Generic (PLEG): container finished" podID="3a322944-18dc-41dc-9b4b-115d30dd0fc7" containerID="4a6d4d43229f1d3661f835d3d5494da2d6f45954775efa3466491f7663e478e2" exitCode=0 Mar 17 02:24:06 crc kubenswrapper[4755]: I0317 02:24:06.583414 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561904-859gk" event={"ID":"3a322944-18dc-41dc-9b4b-115d30dd0fc7","Type":"ContainerDied","Data":"4a6d4d43229f1d3661f835d3d5494da2d6f45954775efa3466491f7663e478e2"} Mar 17 02:24:08 crc kubenswrapper[4755]: I0317 02:24:08.003843 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561904-859gk" Mar 17 02:24:08 crc kubenswrapper[4755]: I0317 02:24:08.185427 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gbj4\" (UniqueName: \"kubernetes.io/projected/3a322944-18dc-41dc-9b4b-115d30dd0fc7-kube-api-access-4gbj4\") pod \"3a322944-18dc-41dc-9b4b-115d30dd0fc7\" (UID: \"3a322944-18dc-41dc-9b4b-115d30dd0fc7\") " Mar 17 02:24:08 crc kubenswrapper[4755]: I0317 02:24:08.195697 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a322944-18dc-41dc-9b4b-115d30dd0fc7-kube-api-access-4gbj4" (OuterVolumeSpecName: "kube-api-access-4gbj4") pod "3a322944-18dc-41dc-9b4b-115d30dd0fc7" (UID: "3a322944-18dc-41dc-9b4b-115d30dd0fc7"). InnerVolumeSpecName "kube-api-access-4gbj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:24:08 crc kubenswrapper[4755]: I0317 02:24:08.293855 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gbj4\" (UniqueName: \"kubernetes.io/projected/3a322944-18dc-41dc-9b4b-115d30dd0fc7-kube-api-access-4gbj4\") on node \"crc\" DevicePath \"\"" Mar 17 02:24:08 crc kubenswrapper[4755]: I0317 02:24:08.603683 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561904-859gk" event={"ID":"3a322944-18dc-41dc-9b4b-115d30dd0fc7","Type":"ContainerDied","Data":"b56ffc876cd41d6d9a1f2afaa779f6d718f4f6e0e713f9a2701ec7ffabeca3a0"} Mar 17 02:24:08 crc kubenswrapper[4755]: I0317 02:24:08.604092 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b56ffc876cd41d6d9a1f2afaa779f6d718f4f6e0e713f9a2701ec7ffabeca3a0" Mar 17 02:24:08 crc kubenswrapper[4755]: I0317 02:24:08.603903 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561904-859gk" Mar 17 02:24:09 crc kubenswrapper[4755]: I0317 02:24:09.091402 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561898-gg8mq"] Mar 17 02:24:09 crc kubenswrapper[4755]: I0317 02:24:09.100901 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561898-gg8mq"] Mar 17 02:24:10 crc kubenswrapper[4755]: I0317 02:24:10.259160 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ff717d6-72df-477f-adb7-abb753ea464d" path="/var/lib/kubelet/pods/6ff717d6-72df-477f-adb7-abb753ea464d/volumes" Mar 17 02:24:20 crc kubenswrapper[4755]: I0317 02:24:20.868073 4755 scope.go:117] "RemoveContainer" containerID="1e3d6d49001395af1d061f1a5eff7da46b7452deb74a4ff41772cb32608331f9" Mar 17 02:24:28 crc kubenswrapper[4755]: I0317 02:24:28.665721 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:24:28 crc kubenswrapper[4755]: I0317 02:24:28.666154 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:24:52 crc kubenswrapper[4755]: I0317 02:24:52.113110 4755 generic.go:334] "Generic (PLEG): container finished" podID="e93408b7-a465-4ade-a595-58e4b83ae3b9" containerID="f86872095970fd4975c81cd671468253c1c7665556cfe904f349ed7ee92c6325" exitCode=0 Mar 17 02:24:52 crc kubenswrapper[4755]: I0317 02:24:52.113749 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-n8w6v/crc-debug-hng84" event={"ID":"e93408b7-a465-4ade-a595-58e4b83ae3b9","Type":"ContainerDied","Data":"f86872095970fd4975c81cd671468253c1c7665556cfe904f349ed7ee92c6325"} Mar 17 02:24:53 crc kubenswrapper[4755]: I0317 02:24:53.277422 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n8w6v/crc-debug-hng84" Mar 17 02:24:53 crc kubenswrapper[4755]: I0317 02:24:53.337195 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n8w6v/crc-debug-hng84"] Mar 17 02:24:53 crc kubenswrapper[4755]: I0317 02:24:53.349644 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n8w6v/crc-debug-hng84"] Mar 17 02:24:53 crc kubenswrapper[4755]: I0317 02:24:53.445842 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e93408b7-a465-4ade-a595-58e4b83ae3b9-host\") pod \"e93408b7-a465-4ade-a595-58e4b83ae3b9\" (UID: \"e93408b7-a465-4ade-a595-58e4b83ae3b9\") " Mar 17 02:24:53 crc kubenswrapper[4755]: I0317 02:24:53.445935 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e93408b7-a465-4ade-a595-58e4b83ae3b9-host" (OuterVolumeSpecName: "host") pod "e93408b7-a465-4ade-a595-58e4b83ae3b9" (UID: "e93408b7-a465-4ade-a595-58e4b83ae3b9"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 02:24:53 crc kubenswrapper[4755]: I0317 02:24:53.446019 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r62vg\" (UniqueName: \"kubernetes.io/projected/e93408b7-a465-4ade-a595-58e4b83ae3b9-kube-api-access-r62vg\") pod \"e93408b7-a465-4ade-a595-58e4b83ae3b9\" (UID: \"e93408b7-a465-4ade-a595-58e4b83ae3b9\") " Mar 17 02:24:53 crc kubenswrapper[4755]: I0317 02:24:53.446946 4755 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e93408b7-a465-4ade-a595-58e4b83ae3b9-host\") on node \"crc\" DevicePath \"\"" Mar 17 02:24:53 crc kubenswrapper[4755]: I0317 02:24:53.454103 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e93408b7-a465-4ade-a595-58e4b83ae3b9-kube-api-access-r62vg" (OuterVolumeSpecName: "kube-api-access-r62vg") pod "e93408b7-a465-4ade-a595-58e4b83ae3b9" (UID: "e93408b7-a465-4ade-a595-58e4b83ae3b9"). InnerVolumeSpecName "kube-api-access-r62vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:24:53 crc kubenswrapper[4755]: I0317 02:24:53.549163 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r62vg\" (UniqueName: \"kubernetes.io/projected/e93408b7-a465-4ade-a595-58e4b83ae3b9-kube-api-access-r62vg\") on node \"crc\" DevicePath \"\"" Mar 17 02:24:54 crc kubenswrapper[4755]: I0317 02:24:54.141530 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="229086abc4015f860bc9ac72836ddff48e199f9ec803792ee5c438986e85b85d" Mar 17 02:24:54 crc kubenswrapper[4755]: I0317 02:24:54.141585 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n8w6v/crc-debug-hng84" Mar 17 02:24:54 crc kubenswrapper[4755]: I0317 02:24:54.263476 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e93408b7-a465-4ade-a595-58e4b83ae3b9" path="/var/lib/kubelet/pods/e93408b7-a465-4ade-a595-58e4b83ae3b9/volumes" Mar 17 02:24:54 crc kubenswrapper[4755]: I0317 02:24:54.608029 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n8w6v/crc-debug-n2fwr"] Mar 17 02:24:54 crc kubenswrapper[4755]: E0317 02:24:54.608571 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a322944-18dc-41dc-9b4b-115d30dd0fc7" containerName="oc" Mar 17 02:24:54 crc kubenswrapper[4755]: I0317 02:24:54.608588 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a322944-18dc-41dc-9b4b-115d30dd0fc7" containerName="oc" Mar 17 02:24:54 crc kubenswrapper[4755]: E0317 02:24:54.608618 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93408b7-a465-4ade-a595-58e4b83ae3b9" containerName="container-00" Mar 17 02:24:54 crc kubenswrapper[4755]: I0317 02:24:54.608627 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93408b7-a465-4ade-a595-58e4b83ae3b9" containerName="container-00" Mar 17 02:24:54 crc kubenswrapper[4755]: I0317 02:24:54.609144 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a322944-18dc-41dc-9b4b-115d30dd0fc7" containerName="oc" Mar 17 02:24:54 crc kubenswrapper[4755]: I0317 02:24:54.609186 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e93408b7-a465-4ade-a595-58e4b83ae3b9" containerName="container-00" Mar 17 02:24:54 crc kubenswrapper[4755]: I0317 02:24:54.610135 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n8w6v/crc-debug-n2fwr" Mar 17 02:24:54 crc kubenswrapper[4755]: I0317 02:24:54.778168 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rhvg\" (UniqueName: \"kubernetes.io/projected/2ac06da0-52ad-46d0-8930-e1aa48a88f8b-kube-api-access-6rhvg\") pod \"crc-debug-n2fwr\" (UID: \"2ac06da0-52ad-46d0-8930-e1aa48a88f8b\") " pod="openshift-must-gather-n8w6v/crc-debug-n2fwr" Mar 17 02:24:54 crc kubenswrapper[4755]: I0317 02:24:54.778501 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ac06da0-52ad-46d0-8930-e1aa48a88f8b-host\") pod \"crc-debug-n2fwr\" (UID: \"2ac06da0-52ad-46d0-8930-e1aa48a88f8b\") " pod="openshift-must-gather-n8w6v/crc-debug-n2fwr" Mar 17 02:24:54 crc kubenswrapper[4755]: I0317 02:24:54.880455 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rhvg\" (UniqueName: \"kubernetes.io/projected/2ac06da0-52ad-46d0-8930-e1aa48a88f8b-kube-api-access-6rhvg\") pod \"crc-debug-n2fwr\" (UID: \"2ac06da0-52ad-46d0-8930-e1aa48a88f8b\") " pod="openshift-must-gather-n8w6v/crc-debug-n2fwr" Mar 17 02:24:54 crc kubenswrapper[4755]: I0317 02:24:54.880524 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ac06da0-52ad-46d0-8930-e1aa48a88f8b-host\") pod \"crc-debug-n2fwr\" (UID: \"2ac06da0-52ad-46d0-8930-e1aa48a88f8b\") " pod="openshift-must-gather-n8w6v/crc-debug-n2fwr" Mar 17 02:24:54 crc kubenswrapper[4755]: I0317 02:24:54.880861 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ac06da0-52ad-46d0-8930-e1aa48a88f8b-host\") pod \"crc-debug-n2fwr\" (UID: \"2ac06da0-52ad-46d0-8930-e1aa48a88f8b\") " pod="openshift-must-gather-n8w6v/crc-debug-n2fwr" Mar 17 02:24:54 crc 
kubenswrapper[4755]: I0317 02:24:54.903904 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rhvg\" (UniqueName: \"kubernetes.io/projected/2ac06da0-52ad-46d0-8930-e1aa48a88f8b-kube-api-access-6rhvg\") pod \"crc-debug-n2fwr\" (UID: \"2ac06da0-52ad-46d0-8930-e1aa48a88f8b\") " pod="openshift-must-gather-n8w6v/crc-debug-n2fwr" Mar 17 02:24:54 crc kubenswrapper[4755]: I0317 02:24:54.932878 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n8w6v/crc-debug-n2fwr" Mar 17 02:24:55 crc kubenswrapper[4755]: I0317 02:24:55.159748 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n8w6v/crc-debug-n2fwr" event={"ID":"2ac06da0-52ad-46d0-8930-e1aa48a88f8b","Type":"ContainerStarted","Data":"4cb44a8b95d2170063373730cdcc6acc779e3a333cb1108f0a21b02773e18cbf"} Mar 17 02:24:56 crc kubenswrapper[4755]: I0317 02:24:56.176338 4755 generic.go:334] "Generic (PLEG): container finished" podID="2ac06da0-52ad-46d0-8930-e1aa48a88f8b" containerID="ba6407a05d31d2cbd1a93d73e4cedfae291b24bbb44cfe315d33fba42d8ade6b" exitCode=0 Mar 17 02:24:56 crc kubenswrapper[4755]: I0317 02:24:56.178854 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n8w6v/crc-debug-n2fwr" event={"ID":"2ac06da0-52ad-46d0-8930-e1aa48a88f8b","Type":"ContainerDied","Data":"ba6407a05d31d2cbd1a93d73e4cedfae291b24bbb44cfe315d33fba42d8ade6b"} Mar 17 02:24:57 crc kubenswrapper[4755]: I0317 02:24:57.307978 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n8w6v/crc-debug-n2fwr" Mar 17 02:24:57 crc kubenswrapper[4755]: I0317 02:24:57.346591 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ac06da0-52ad-46d0-8930-e1aa48a88f8b-host\") pod \"2ac06da0-52ad-46d0-8930-e1aa48a88f8b\" (UID: \"2ac06da0-52ad-46d0-8930-e1aa48a88f8b\") " Mar 17 02:24:57 crc kubenswrapper[4755]: I0317 02:24:57.346706 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rhvg\" (UniqueName: \"kubernetes.io/projected/2ac06da0-52ad-46d0-8930-e1aa48a88f8b-kube-api-access-6rhvg\") pod \"2ac06da0-52ad-46d0-8930-e1aa48a88f8b\" (UID: \"2ac06da0-52ad-46d0-8930-e1aa48a88f8b\") " Mar 17 02:24:57 crc kubenswrapper[4755]: I0317 02:24:57.347213 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ac06da0-52ad-46d0-8930-e1aa48a88f8b-host" (OuterVolumeSpecName: "host") pod "2ac06da0-52ad-46d0-8930-e1aa48a88f8b" (UID: "2ac06da0-52ad-46d0-8930-e1aa48a88f8b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 02:24:57 crc kubenswrapper[4755]: I0317 02:24:57.360946 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ac06da0-52ad-46d0-8930-e1aa48a88f8b-kube-api-access-6rhvg" (OuterVolumeSpecName: "kube-api-access-6rhvg") pod "2ac06da0-52ad-46d0-8930-e1aa48a88f8b" (UID: "2ac06da0-52ad-46d0-8930-e1aa48a88f8b"). InnerVolumeSpecName "kube-api-access-6rhvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:24:57 crc kubenswrapper[4755]: I0317 02:24:57.448045 4755 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ac06da0-52ad-46d0-8930-e1aa48a88f8b-host\") on node \"crc\" DevicePath \"\"" Mar 17 02:24:57 crc kubenswrapper[4755]: I0317 02:24:57.448359 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rhvg\" (UniqueName: \"kubernetes.io/projected/2ac06da0-52ad-46d0-8930-e1aa48a88f8b-kube-api-access-6rhvg\") on node \"crc\" DevicePath \"\"" Mar 17 02:24:58 crc kubenswrapper[4755]: I0317 02:24:58.202033 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n8w6v/crc-debug-n2fwr" event={"ID":"2ac06da0-52ad-46d0-8930-e1aa48a88f8b","Type":"ContainerDied","Data":"4cb44a8b95d2170063373730cdcc6acc779e3a333cb1108f0a21b02773e18cbf"} Mar 17 02:24:58 crc kubenswrapper[4755]: I0317 02:24:58.202071 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cb44a8b95d2170063373730cdcc6acc779e3a333cb1108f0a21b02773e18cbf" Mar 17 02:24:58 crc kubenswrapper[4755]: I0317 02:24:58.202146 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n8w6v/crc-debug-n2fwr" Mar 17 02:24:58 crc kubenswrapper[4755]: I0317 02:24:58.387642 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n8w6v/crc-debug-n2fwr"] Mar 17 02:24:58 crc kubenswrapper[4755]: I0317 02:24:58.399534 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n8w6v/crc-debug-n2fwr"] Mar 17 02:24:58 crc kubenswrapper[4755]: I0317 02:24:58.665659 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:24:58 crc kubenswrapper[4755]: I0317 02:24:58.665748 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:24:59 crc kubenswrapper[4755]: I0317 02:24:59.724394 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n8w6v/crc-debug-hplzd"] Mar 17 02:24:59 crc kubenswrapper[4755]: E0317 02:24:59.724879 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac06da0-52ad-46d0-8930-e1aa48a88f8b" containerName="container-00" Mar 17 02:24:59 crc kubenswrapper[4755]: I0317 02:24:59.724891 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac06da0-52ad-46d0-8930-e1aa48a88f8b" containerName="container-00" Mar 17 02:24:59 crc kubenswrapper[4755]: I0317 02:24:59.725123 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ac06da0-52ad-46d0-8930-e1aa48a88f8b" containerName="container-00" Mar 17 02:24:59 crc kubenswrapper[4755]: I0317 02:24:59.725933 4755 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n8w6v/crc-debug-hplzd" Mar 17 02:24:59 crc kubenswrapper[4755]: I0317 02:24:59.910899 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26c9cb2c-8dc4-441b-a796-feae3a2420ac-host\") pod \"crc-debug-hplzd\" (UID: \"26c9cb2c-8dc4-441b-a796-feae3a2420ac\") " pod="openshift-must-gather-n8w6v/crc-debug-hplzd" Mar 17 02:24:59 crc kubenswrapper[4755]: I0317 02:24:59.910971 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5ctj\" (UniqueName: \"kubernetes.io/projected/26c9cb2c-8dc4-441b-a796-feae3a2420ac-kube-api-access-v5ctj\") pod \"crc-debug-hplzd\" (UID: \"26c9cb2c-8dc4-441b-a796-feae3a2420ac\") " pod="openshift-must-gather-n8w6v/crc-debug-hplzd" Mar 17 02:25:00 crc kubenswrapper[4755]: I0317 02:25:00.013545 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26c9cb2c-8dc4-441b-a796-feae3a2420ac-host\") pod \"crc-debug-hplzd\" (UID: \"26c9cb2c-8dc4-441b-a796-feae3a2420ac\") " pod="openshift-must-gather-n8w6v/crc-debug-hplzd" Mar 17 02:25:00 crc kubenswrapper[4755]: I0317 02:25:00.013610 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5ctj\" (UniqueName: \"kubernetes.io/projected/26c9cb2c-8dc4-441b-a796-feae3a2420ac-kube-api-access-v5ctj\") pod \"crc-debug-hplzd\" (UID: \"26c9cb2c-8dc4-441b-a796-feae3a2420ac\") " pod="openshift-must-gather-n8w6v/crc-debug-hplzd" Mar 17 02:25:00 crc kubenswrapper[4755]: I0317 02:25:00.014073 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26c9cb2c-8dc4-441b-a796-feae3a2420ac-host\") pod \"crc-debug-hplzd\" (UID: \"26c9cb2c-8dc4-441b-a796-feae3a2420ac\") " 
pod="openshift-must-gather-n8w6v/crc-debug-hplzd" Mar 17 02:25:00 crc kubenswrapper[4755]: I0317 02:25:00.039357 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5ctj\" (UniqueName: \"kubernetes.io/projected/26c9cb2c-8dc4-441b-a796-feae3a2420ac-kube-api-access-v5ctj\") pod \"crc-debug-hplzd\" (UID: \"26c9cb2c-8dc4-441b-a796-feae3a2420ac\") " pod="openshift-must-gather-n8w6v/crc-debug-hplzd" Mar 17 02:25:00 crc kubenswrapper[4755]: I0317 02:25:00.041007 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n8w6v/crc-debug-hplzd" Mar 17 02:25:00 crc kubenswrapper[4755]: W0317 02:25:00.073752 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26c9cb2c_8dc4_441b_a796_feae3a2420ac.slice/crio-9f50d8b2dbfa292e71e66af8b3081db49e296fc50930e0dd00bcb52113372f6a WatchSource:0}: Error finding container 9f50d8b2dbfa292e71e66af8b3081db49e296fc50930e0dd00bcb52113372f6a: Status 404 returned error can't find the container with id 9f50d8b2dbfa292e71e66af8b3081db49e296fc50930e0dd00bcb52113372f6a Mar 17 02:25:00 crc kubenswrapper[4755]: I0317 02:25:00.220033 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n8w6v/crc-debug-hplzd" event={"ID":"26c9cb2c-8dc4-441b-a796-feae3a2420ac","Type":"ContainerStarted","Data":"9f50d8b2dbfa292e71e66af8b3081db49e296fc50930e0dd00bcb52113372f6a"} Mar 17 02:25:00 crc kubenswrapper[4755]: I0317 02:25:00.258577 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ac06da0-52ad-46d0-8930-e1aa48a88f8b" path="/var/lib/kubelet/pods/2ac06da0-52ad-46d0-8930-e1aa48a88f8b/volumes" Mar 17 02:25:01 crc kubenswrapper[4755]: I0317 02:25:01.233565 4755 generic.go:334] "Generic (PLEG): container finished" podID="26c9cb2c-8dc4-441b-a796-feae3a2420ac" containerID="14fa8bfbfd55f1dee14ff15b0ac52a5f4985e432daa23fd7d8e96995faacccd7" exitCode=0 Mar 17 
02:25:01 crc kubenswrapper[4755]: I0317 02:25:01.233613 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n8w6v/crc-debug-hplzd" event={"ID":"26c9cb2c-8dc4-441b-a796-feae3a2420ac","Type":"ContainerDied","Data":"14fa8bfbfd55f1dee14ff15b0ac52a5f4985e432daa23fd7d8e96995faacccd7"} Mar 17 02:25:01 crc kubenswrapper[4755]: I0317 02:25:01.294145 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n8w6v/crc-debug-hplzd"] Mar 17 02:25:01 crc kubenswrapper[4755]: I0317 02:25:01.312354 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n8w6v/crc-debug-hplzd"] Mar 17 02:25:02 crc kubenswrapper[4755]: I0317 02:25:02.354548 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n8w6v/crc-debug-hplzd" Mar 17 02:25:02 crc kubenswrapper[4755]: I0317 02:25:02.468750 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5ctj\" (UniqueName: \"kubernetes.io/projected/26c9cb2c-8dc4-441b-a796-feae3a2420ac-kube-api-access-v5ctj\") pod \"26c9cb2c-8dc4-441b-a796-feae3a2420ac\" (UID: \"26c9cb2c-8dc4-441b-a796-feae3a2420ac\") " Mar 17 02:25:02 crc kubenswrapper[4755]: I0317 02:25:02.468859 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26c9cb2c-8dc4-441b-a796-feae3a2420ac-host\") pod \"26c9cb2c-8dc4-441b-a796-feae3a2420ac\" (UID: \"26c9cb2c-8dc4-441b-a796-feae3a2420ac\") " Mar 17 02:25:02 crc kubenswrapper[4755]: I0317 02:25:02.469176 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26c9cb2c-8dc4-441b-a796-feae3a2420ac-host" (OuterVolumeSpecName: "host") pod "26c9cb2c-8dc4-441b-a796-feae3a2420ac" (UID: "26c9cb2c-8dc4-441b-a796-feae3a2420ac"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 02:25:02 crc kubenswrapper[4755]: I0317 02:25:02.469703 4755 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26c9cb2c-8dc4-441b-a796-feae3a2420ac-host\") on node \"crc\" DevicePath \"\"" Mar 17 02:25:02 crc kubenswrapper[4755]: I0317 02:25:02.474242 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c9cb2c-8dc4-441b-a796-feae3a2420ac-kube-api-access-v5ctj" (OuterVolumeSpecName: "kube-api-access-v5ctj") pod "26c9cb2c-8dc4-441b-a796-feae3a2420ac" (UID: "26c9cb2c-8dc4-441b-a796-feae3a2420ac"). InnerVolumeSpecName "kube-api-access-v5ctj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:25:02 crc kubenswrapper[4755]: I0317 02:25:02.571741 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5ctj\" (UniqueName: \"kubernetes.io/projected/26c9cb2c-8dc4-441b-a796-feae3a2420ac-kube-api-access-v5ctj\") on node \"crc\" DevicePath \"\"" Mar 17 02:25:03 crc kubenswrapper[4755]: I0317 02:25:03.258882 4755 scope.go:117] "RemoveContainer" containerID="14fa8bfbfd55f1dee14ff15b0ac52a5f4985e432daa23fd7d8e96995faacccd7" Mar 17 02:25:03 crc kubenswrapper[4755]: I0317 02:25:03.258939 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n8w6v/crc-debug-hplzd" Mar 17 02:25:04 crc kubenswrapper[4755]: I0317 02:25:04.269488 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26c9cb2c-8dc4-441b-a796-feae3a2420ac" path="/var/lib/kubelet/pods/26c9cb2c-8dc4-441b-a796-feae3a2420ac/volumes" Mar 17 02:25:27 crc kubenswrapper[4755]: I0317 02:25:27.455815 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kmp6l"] Mar 17 02:25:27 crc kubenswrapper[4755]: E0317 02:25:27.457124 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c9cb2c-8dc4-441b-a796-feae3a2420ac" containerName="container-00" Mar 17 02:25:27 crc kubenswrapper[4755]: I0317 02:25:27.457150 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c9cb2c-8dc4-441b-a796-feae3a2420ac" containerName="container-00" Mar 17 02:25:27 crc kubenswrapper[4755]: I0317 02:25:27.457639 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="26c9cb2c-8dc4-441b-a796-feae3a2420ac" containerName="container-00" Mar 17 02:25:27 crc kubenswrapper[4755]: I0317 02:25:27.460641 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kmp6l" Mar 17 02:25:27 crc kubenswrapper[4755]: I0317 02:25:27.494107 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee065345-5acb-43c0-91ad-f007f21ebc13-utilities\") pod \"certified-operators-kmp6l\" (UID: \"ee065345-5acb-43c0-91ad-f007f21ebc13\") " pod="openshift-marketplace/certified-operators-kmp6l" Mar 17 02:25:27 crc kubenswrapper[4755]: I0317 02:25:27.494370 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp26b\" (UniqueName: \"kubernetes.io/projected/ee065345-5acb-43c0-91ad-f007f21ebc13-kube-api-access-dp26b\") pod \"certified-operators-kmp6l\" (UID: \"ee065345-5acb-43c0-91ad-f007f21ebc13\") " pod="openshift-marketplace/certified-operators-kmp6l" Mar 17 02:25:27 crc kubenswrapper[4755]: I0317 02:25:27.494471 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee065345-5acb-43c0-91ad-f007f21ebc13-catalog-content\") pod \"certified-operators-kmp6l\" (UID: \"ee065345-5acb-43c0-91ad-f007f21ebc13\") " pod="openshift-marketplace/certified-operators-kmp6l" Mar 17 02:25:27 crc kubenswrapper[4755]: I0317 02:25:27.505461 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kmp6l"] Mar 17 02:25:27 crc kubenswrapper[4755]: I0317 02:25:27.598289 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee065345-5acb-43c0-91ad-f007f21ebc13-utilities\") pod \"certified-operators-kmp6l\" (UID: \"ee065345-5acb-43c0-91ad-f007f21ebc13\") " pod="openshift-marketplace/certified-operators-kmp6l" Mar 17 02:25:27 crc kubenswrapper[4755]: I0317 02:25:27.598401 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dp26b\" (UniqueName: \"kubernetes.io/projected/ee065345-5acb-43c0-91ad-f007f21ebc13-kube-api-access-dp26b\") pod \"certified-operators-kmp6l\" (UID: \"ee065345-5acb-43c0-91ad-f007f21ebc13\") " pod="openshift-marketplace/certified-operators-kmp6l" Mar 17 02:25:27 crc kubenswrapper[4755]: I0317 02:25:27.598458 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee065345-5acb-43c0-91ad-f007f21ebc13-catalog-content\") pod \"certified-operators-kmp6l\" (UID: \"ee065345-5acb-43c0-91ad-f007f21ebc13\") " pod="openshift-marketplace/certified-operators-kmp6l" Mar 17 02:25:27 crc kubenswrapper[4755]: I0317 02:25:27.598748 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee065345-5acb-43c0-91ad-f007f21ebc13-utilities\") pod \"certified-operators-kmp6l\" (UID: \"ee065345-5acb-43c0-91ad-f007f21ebc13\") " pod="openshift-marketplace/certified-operators-kmp6l" Mar 17 02:25:27 crc kubenswrapper[4755]: I0317 02:25:27.598893 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee065345-5acb-43c0-91ad-f007f21ebc13-catalog-content\") pod \"certified-operators-kmp6l\" (UID: \"ee065345-5acb-43c0-91ad-f007f21ebc13\") " pod="openshift-marketplace/certified-operators-kmp6l" Mar 17 02:25:27 crc kubenswrapper[4755]: I0317 02:25:27.619680 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp26b\" (UniqueName: \"kubernetes.io/projected/ee065345-5acb-43c0-91ad-f007f21ebc13-kube-api-access-dp26b\") pod \"certified-operators-kmp6l\" (UID: \"ee065345-5acb-43c0-91ad-f007f21ebc13\") " pod="openshift-marketplace/certified-operators-kmp6l" Mar 17 02:25:27 crc kubenswrapper[4755]: I0317 02:25:27.799471 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kmp6l" Mar 17 02:25:28 crc kubenswrapper[4755]: I0317 02:25:28.339981 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kmp6l"] Mar 17 02:25:28 crc kubenswrapper[4755]: I0317 02:25:28.588827 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kmp6l" event={"ID":"ee065345-5acb-43c0-91ad-f007f21ebc13","Type":"ContainerStarted","Data":"85f76f5d6b747a0acf0378506189f723e489f11ff2aebb91ad62ad72f51bf86e"} Mar 17 02:25:28 crc kubenswrapper[4755]: I0317 02:25:28.588878 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kmp6l" event={"ID":"ee065345-5acb-43c0-91ad-f007f21ebc13","Type":"ContainerStarted","Data":"d3aa29642ae667a873650b730cb5cb53c180d59e8b4ab374e49f23f4f378ed58"} Mar 17 02:25:28 crc kubenswrapper[4755]: I0317 02:25:28.665385 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:25:28 crc kubenswrapper[4755]: I0317 02:25:28.665677 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:25:28 crc kubenswrapper[4755]: I0317 02:25:28.665785 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 02:25:28 crc kubenswrapper[4755]: I0317 02:25:28.666679 4755 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:25:28 crc kubenswrapper[4755]: I0317 02:25:28.666827 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" gracePeriod=600 Mar 17 02:25:28 crc kubenswrapper[4755]: E0317 02:25:28.788981 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:25:29 crc kubenswrapper[4755]: I0317 02:25:29.605853 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" exitCode=0 Mar 17 02:25:29 crc kubenswrapper[4755]: I0317 02:25:29.605941 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943"} Mar 17 02:25:29 crc kubenswrapper[4755]: I0317 02:25:29.606200 4755 scope.go:117] "RemoveContainer" containerID="46bcb3c470e5bd80eee0a58f81aa80080bfaa02fe9e8720b2e5734b89c809018" Mar 17 02:25:29 crc 
kubenswrapper[4755]: I0317 02:25:29.608723 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:25:29 crc kubenswrapper[4755]: I0317 02:25:29.609648 4755 generic.go:334] "Generic (PLEG): container finished" podID="ee065345-5acb-43c0-91ad-f007f21ebc13" containerID="85f76f5d6b747a0acf0378506189f723e489f11ff2aebb91ad62ad72f51bf86e" exitCode=0 Mar 17 02:25:29 crc kubenswrapper[4755]: I0317 02:25:29.609721 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kmp6l" event={"ID":"ee065345-5acb-43c0-91ad-f007f21ebc13","Type":"ContainerDied","Data":"85f76f5d6b747a0acf0378506189f723e489f11ff2aebb91ad62ad72f51bf86e"} Mar 17 02:25:29 crc kubenswrapper[4755]: E0317 02:25:29.614256 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:25:30 crc kubenswrapper[4755]: I0317 02:25:30.621814 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kmp6l" event={"ID":"ee065345-5acb-43c0-91ad-f007f21ebc13","Type":"ContainerStarted","Data":"1d7caa3140b7975dd76794f2fc6d3153cb65b4f6738b5594a32d0473230f10fb"} Mar 17 02:25:32 crc kubenswrapper[4755]: I0317 02:25:32.643638 4755 generic.go:334] "Generic (PLEG): container finished" podID="ee065345-5acb-43c0-91ad-f007f21ebc13" containerID="1d7caa3140b7975dd76794f2fc6d3153cb65b4f6738b5594a32d0473230f10fb" exitCode=0 Mar 17 02:25:32 crc kubenswrapper[4755]: I0317 02:25:32.643699 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kmp6l" 
event={"ID":"ee065345-5acb-43c0-91ad-f007f21ebc13","Type":"ContainerDied","Data":"1d7caa3140b7975dd76794f2fc6d3153cb65b4f6738b5594a32d0473230f10fb"} Mar 17 02:25:33 crc kubenswrapper[4755]: I0317 02:25:33.660625 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kmp6l" event={"ID":"ee065345-5acb-43c0-91ad-f007f21ebc13","Type":"ContainerStarted","Data":"32d10c77da0552b7d5cfe183eae275e6c04fa8ba895e0207d0db0adb9bf74a51"} Mar 17 02:25:33 crc kubenswrapper[4755]: I0317 02:25:33.679767 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kmp6l" podStartSLOduration=3.261495166 podStartE2EDuration="6.679750401s" podCreationTimestamp="2026-03-17 02:25:27 +0000 UTC" firstStartedPulling="2026-03-17 02:25:29.611669716 +0000 UTC m=+7404.371122029" lastFinishedPulling="2026-03-17 02:25:33.029924981 +0000 UTC m=+7407.789377264" observedRunningTime="2026-03-17 02:25:33.677488749 +0000 UTC m=+7408.436941032" watchObservedRunningTime="2026-03-17 02:25:33.679750401 +0000 UTC m=+7408.439202684" Mar 17 02:25:37 crc kubenswrapper[4755]: I0317 02:25:37.800150 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kmp6l" Mar 17 02:25:37 crc kubenswrapper[4755]: I0317 02:25:37.800772 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kmp6l" Mar 17 02:25:38 crc kubenswrapper[4755]: I0317 02:25:38.847794 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-kmp6l" podUID="ee065345-5acb-43c0-91ad-f007f21ebc13" containerName="registry-server" probeResult="failure" output=< Mar 17 02:25:38 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:25:38 crc kubenswrapper[4755]: > Mar 17 02:25:42 crc kubenswrapper[4755]: I0317 02:25:42.248452 4755 scope.go:117] 
"RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:25:42 crc kubenswrapper[4755]: E0317 02:25:42.249162 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:25:47 crc kubenswrapper[4755]: I0317 02:25:47.883712 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kmp6l" Mar 17 02:25:47 crc kubenswrapper[4755]: I0317 02:25:47.960179 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kmp6l" Mar 17 02:25:48 crc kubenswrapper[4755]: I0317 02:25:48.155792 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kmp6l"] Mar 17 02:25:49 crc kubenswrapper[4755]: I0317 02:25:49.869317 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kmp6l" podUID="ee065345-5acb-43c0-91ad-f007f21ebc13" containerName="registry-server" containerID="cri-o://32d10c77da0552b7d5cfe183eae275e6c04fa8ba895e0207d0db0adb9bf74a51" gracePeriod=2 Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.456290 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kmp6l" Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.537415 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee065345-5acb-43c0-91ad-f007f21ebc13-catalog-content\") pod \"ee065345-5acb-43c0-91ad-f007f21ebc13\" (UID: \"ee065345-5acb-43c0-91ad-f007f21ebc13\") " Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.537740 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp26b\" (UniqueName: \"kubernetes.io/projected/ee065345-5acb-43c0-91ad-f007f21ebc13-kube-api-access-dp26b\") pod \"ee065345-5acb-43c0-91ad-f007f21ebc13\" (UID: \"ee065345-5acb-43c0-91ad-f007f21ebc13\") " Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.537878 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee065345-5acb-43c0-91ad-f007f21ebc13-utilities\") pod \"ee065345-5acb-43c0-91ad-f007f21ebc13\" (UID: \"ee065345-5acb-43c0-91ad-f007f21ebc13\") " Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.538596 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee065345-5acb-43c0-91ad-f007f21ebc13-utilities" (OuterVolumeSpecName: "utilities") pod "ee065345-5acb-43c0-91ad-f007f21ebc13" (UID: "ee065345-5acb-43c0-91ad-f007f21ebc13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.544726 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee065345-5acb-43c0-91ad-f007f21ebc13-kube-api-access-dp26b" (OuterVolumeSpecName: "kube-api-access-dp26b") pod "ee065345-5acb-43c0-91ad-f007f21ebc13" (UID: "ee065345-5acb-43c0-91ad-f007f21ebc13"). InnerVolumeSpecName "kube-api-access-dp26b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.598855 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee065345-5acb-43c0-91ad-f007f21ebc13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee065345-5acb-43c0-91ad-f007f21ebc13" (UID: "ee065345-5acb-43c0-91ad-f007f21ebc13"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.640588 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp26b\" (UniqueName: \"kubernetes.io/projected/ee065345-5acb-43c0-91ad-f007f21ebc13-kube-api-access-dp26b\") on node \"crc\" DevicePath \"\"" Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.640635 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee065345-5acb-43c0-91ad-f007f21ebc13-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.640650 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee065345-5acb-43c0-91ad-f007f21ebc13-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.883713 4755 generic.go:334] "Generic (PLEG): container finished" podID="ee065345-5acb-43c0-91ad-f007f21ebc13" containerID="32d10c77da0552b7d5cfe183eae275e6c04fa8ba895e0207d0db0adb9bf74a51" exitCode=0 Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.884746 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kmp6l" event={"ID":"ee065345-5acb-43c0-91ad-f007f21ebc13","Type":"ContainerDied","Data":"32d10c77da0552b7d5cfe183eae275e6c04fa8ba895e0207d0db0adb9bf74a51"} Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.884855 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-kmp6l" event={"ID":"ee065345-5acb-43c0-91ad-f007f21ebc13","Type":"ContainerDied","Data":"d3aa29642ae667a873650b730cb5cb53c180d59e8b4ab374e49f23f4f378ed58"} Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.884971 4755 scope.go:117] "RemoveContainer" containerID="32d10c77da0552b7d5cfe183eae275e6c04fa8ba895e0207d0db0adb9bf74a51" Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.885208 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kmp6l" Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.931621 4755 scope.go:117] "RemoveContainer" containerID="1d7caa3140b7975dd76794f2fc6d3153cb65b4f6738b5594a32d0473230f10fb" Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.953544 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kmp6l"] Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.957726 4755 scope.go:117] "RemoveContainer" containerID="85f76f5d6b747a0acf0378506189f723e489f11ff2aebb91ad62ad72f51bf86e" Mar 17 02:25:50 crc kubenswrapper[4755]: I0317 02:25:50.973111 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kmp6l"] Mar 17 02:25:51 crc kubenswrapper[4755]: I0317 02:25:51.068362 4755 scope.go:117] "RemoveContainer" containerID="32d10c77da0552b7d5cfe183eae275e6c04fa8ba895e0207d0db0adb9bf74a51" Mar 17 02:25:51 crc kubenswrapper[4755]: E0317 02:25:51.073244 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32d10c77da0552b7d5cfe183eae275e6c04fa8ba895e0207d0db0adb9bf74a51\": container with ID starting with 32d10c77da0552b7d5cfe183eae275e6c04fa8ba895e0207d0db0adb9bf74a51 not found: ID does not exist" containerID="32d10c77da0552b7d5cfe183eae275e6c04fa8ba895e0207d0db0adb9bf74a51" Mar 17 02:25:51 crc kubenswrapper[4755]: I0317 
02:25:51.073281 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d10c77da0552b7d5cfe183eae275e6c04fa8ba895e0207d0db0adb9bf74a51"} err="failed to get container status \"32d10c77da0552b7d5cfe183eae275e6c04fa8ba895e0207d0db0adb9bf74a51\": rpc error: code = NotFound desc = could not find container \"32d10c77da0552b7d5cfe183eae275e6c04fa8ba895e0207d0db0adb9bf74a51\": container with ID starting with 32d10c77da0552b7d5cfe183eae275e6c04fa8ba895e0207d0db0adb9bf74a51 not found: ID does not exist" Mar 17 02:25:51 crc kubenswrapper[4755]: I0317 02:25:51.073306 4755 scope.go:117] "RemoveContainer" containerID="1d7caa3140b7975dd76794f2fc6d3153cb65b4f6738b5594a32d0473230f10fb" Mar 17 02:25:51 crc kubenswrapper[4755]: E0317 02:25:51.078255 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d7caa3140b7975dd76794f2fc6d3153cb65b4f6738b5594a32d0473230f10fb\": container with ID starting with 1d7caa3140b7975dd76794f2fc6d3153cb65b4f6738b5594a32d0473230f10fb not found: ID does not exist" containerID="1d7caa3140b7975dd76794f2fc6d3153cb65b4f6738b5594a32d0473230f10fb" Mar 17 02:25:51 crc kubenswrapper[4755]: I0317 02:25:51.078612 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d7caa3140b7975dd76794f2fc6d3153cb65b4f6738b5594a32d0473230f10fb"} err="failed to get container status \"1d7caa3140b7975dd76794f2fc6d3153cb65b4f6738b5594a32d0473230f10fb\": rpc error: code = NotFound desc = could not find container \"1d7caa3140b7975dd76794f2fc6d3153cb65b4f6738b5594a32d0473230f10fb\": container with ID starting with 1d7caa3140b7975dd76794f2fc6d3153cb65b4f6738b5594a32d0473230f10fb not found: ID does not exist" Mar 17 02:25:51 crc kubenswrapper[4755]: I0317 02:25:51.078643 4755 scope.go:117] "RemoveContainer" containerID="85f76f5d6b747a0acf0378506189f723e489f11ff2aebb91ad62ad72f51bf86e" Mar 17 02:25:51 crc 
kubenswrapper[4755]: E0317 02:25:51.079175 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85f76f5d6b747a0acf0378506189f723e489f11ff2aebb91ad62ad72f51bf86e\": container with ID starting with 85f76f5d6b747a0acf0378506189f723e489f11ff2aebb91ad62ad72f51bf86e not found: ID does not exist" containerID="85f76f5d6b747a0acf0378506189f723e489f11ff2aebb91ad62ad72f51bf86e" Mar 17 02:25:51 crc kubenswrapper[4755]: I0317 02:25:51.079215 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f76f5d6b747a0acf0378506189f723e489f11ff2aebb91ad62ad72f51bf86e"} err="failed to get container status \"85f76f5d6b747a0acf0378506189f723e489f11ff2aebb91ad62ad72f51bf86e\": rpc error: code = NotFound desc = could not find container \"85f76f5d6b747a0acf0378506189f723e489f11ff2aebb91ad62ad72f51bf86e\": container with ID starting with 85f76f5d6b747a0acf0378506189f723e489f11ff2aebb91ad62ad72f51bf86e not found: ID does not exist" Mar 17 02:25:52 crc kubenswrapper[4755]: I0317 02:25:52.260490 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee065345-5acb-43c0-91ad-f007f21ebc13" path="/var/lib/kubelet/pods/ee065345-5acb-43c0-91ad-f007f21ebc13/volumes" Mar 17 02:25:56 crc kubenswrapper[4755]: I0317 02:25:56.259976 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:25:56 crc kubenswrapper[4755]: E0317 02:25:56.268800 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:25:57 crc 
kubenswrapper[4755]: I0317 02:25:57.145788 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6/aodh-api/0.log" Mar 17 02:25:57 crc kubenswrapper[4755]: I0317 02:25:57.347730 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6/aodh-notifier/0.log" Mar 17 02:25:57 crc kubenswrapper[4755]: I0317 02:25:57.363333 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6/aodh-listener/0.log" Mar 17 02:25:57 crc kubenswrapper[4755]: I0317 02:25:57.388934 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6/aodh-evaluator/0.log" Mar 17 02:25:57 crc kubenswrapper[4755]: I0317 02:25:57.545659 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-75cc5bb54-mqs8w_e975409f-81b6-4bcd-aec0-00f942eae3bd/barbican-api/0.log" Mar 17 02:25:57 crc kubenswrapper[4755]: I0317 02:25:57.616812 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-75cc5bb54-mqs8w_e975409f-81b6-4bcd-aec0-00f942eae3bd/barbican-api-log/0.log" Mar 17 02:25:57 crc kubenswrapper[4755]: I0317 02:25:57.700076 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-79df59b454-zg6ts_08712ad9-353f-4d69-aa69-87586a0b9ee3/barbican-keystone-listener/0.log" Mar 17 02:25:57 crc kubenswrapper[4755]: I0317 02:25:57.890302 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b59c459f5-btmbt_32b26d97-7256-4841-819f-2a2ee7ff2e3b/barbican-worker/0.log" Mar 17 02:25:57 crc kubenswrapper[4755]: I0317 02:25:57.899097 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-79df59b454-zg6ts_08712ad9-353f-4d69-aa69-87586a0b9ee3/barbican-keystone-listener-log/0.log" Mar 17 02:25:57 crc 
kubenswrapper[4755]: I0317 02:25:57.992326 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b59c459f5-btmbt_32b26d97-7256-4841-819f-2a2ee7ff2e3b/barbican-worker-log/0.log" Mar 17 02:25:58 crc kubenswrapper[4755]: I0317 02:25:58.242211 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp_56db739c-5c0b-445c-bb95-d16d76daea1b/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:25:58 crc kubenswrapper[4755]: I0317 02:25:58.299894 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2948ff39-a68c-4ef2-a7d7-8eb126261ff9/ceilometer-central-agent/0.log" Mar 17 02:25:58 crc kubenswrapper[4755]: I0317 02:25:58.308081 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2948ff39-a68c-4ef2-a7d7-8eb126261ff9/ceilometer-notification-agent/0.log" Mar 17 02:25:58 crc kubenswrapper[4755]: I0317 02:25:58.455173 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2948ff39-a68c-4ef2-a7d7-8eb126261ff9/sg-core/0.log" Mar 17 02:25:58 crc kubenswrapper[4755]: I0317 02:25:58.472859 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2948ff39-a68c-4ef2-a7d7-8eb126261ff9/proxy-httpd/0.log" Mar 17 02:25:58 crc kubenswrapper[4755]: I0317 02:25:58.531786 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b_c924cf0b-5d1b-4d21-8123-106c71d3b94b/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:25:58 crc kubenswrapper[4755]: I0317 02:25:58.679595 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9_cb099246-365d-4bd7-ad54-f765ffc586cd/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:25:58 crc kubenswrapper[4755]: I0317 02:25:58.910977 4755 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dff3597a-93e6-4bb6-9508-c8f4609a75fc/cinder-api/0.log" Mar 17 02:25:58 crc kubenswrapper[4755]: I0317 02:25:58.927402 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dff3597a-93e6-4bb6-9508-c8f4609a75fc/cinder-api-log/0.log" Mar 17 02:25:59 crc kubenswrapper[4755]: I0317 02:25:59.158463 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_fd360dd3-b439-453e-8543-405c8d1804b5/cinder-backup/0.log" Mar 17 02:25:59 crc kubenswrapper[4755]: I0317 02:25:59.169054 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_fd360dd3-b439-453e-8543-405c8d1804b5/probe/0.log" Mar 17 02:25:59 crc kubenswrapper[4755]: I0317 02:25:59.262999 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5201f678-3b17-4d85-b341-2f789377dbaa/cinder-scheduler/0.log" Mar 17 02:25:59 crc kubenswrapper[4755]: I0317 02:25:59.376615 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5201f678-3b17-4d85-b341-2f789377dbaa/probe/0.log" Mar 17 02:25:59 crc kubenswrapper[4755]: I0317 02:25:59.495527 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_4abc0e8b-235e-48c1-8066-8958aa05a2a3/cinder-volume/0.log" Mar 17 02:25:59 crc kubenswrapper[4755]: I0317 02:25:59.496171 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_4abc0e8b-235e-48c1-8066-8958aa05a2a3/probe/0.log" Mar 17 02:25:59 crc kubenswrapper[4755]: I0317 02:25:59.698201 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z_1b83398a-b089-4a14-9432-5154d7cd107c/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:25:59 crc kubenswrapper[4755]: I0317 02:25:59.707150 4755 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2_351689ec-5f29-4144-ab28-25abac57ccac/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:25:59 crc kubenswrapper[4755]: I0317 02:25:59.892361 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74cfff99f-cnbmp_e8c45f18-80d3-466b-9abe-ebb64d80c285/init/0.log" Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.158372 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561906-n5qcz"] Mar 17 02:26:00 crc kubenswrapper[4755]: E0317 02:26:00.158886 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee065345-5acb-43c0-91ad-f007f21ebc13" containerName="extract-utilities" Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.158914 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee065345-5acb-43c0-91ad-f007f21ebc13" containerName="extract-utilities" Mar 17 02:26:00 crc kubenswrapper[4755]: E0317 02:26:00.158922 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee065345-5acb-43c0-91ad-f007f21ebc13" containerName="registry-server" Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.158930 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee065345-5acb-43c0-91ad-f007f21ebc13" containerName="registry-server" Mar 17 02:26:00 crc kubenswrapper[4755]: E0317 02:26:00.158978 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee065345-5acb-43c0-91ad-f007f21ebc13" containerName="extract-content" Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.158984 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee065345-5acb-43c0-91ad-f007f21ebc13" containerName="extract-content" Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.159193 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee065345-5acb-43c0-91ad-f007f21ebc13" containerName="registry-server" Mar 17 02:26:00 crc 
kubenswrapper[4755]: I0317 02:26:00.160015 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561906-n5qcz" Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.164947 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.165029 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.165378 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.172956 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561906-n5qcz"] Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.209764 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74cfff99f-cnbmp_e8c45f18-80d3-466b-9abe-ebb64d80c285/init/0.log" Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.244717 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d3953ffe-b583-483a-b3e4-8cb6393b09f7/glance-httpd/0.log" Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.260653 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtwdp\" (UniqueName: \"kubernetes.io/projected/69ae8f94-07ee-49f0-939f-a1355a3d9657-kube-api-access-gtwdp\") pod \"auto-csr-approver-29561906-n5qcz\" (UID: \"69ae8f94-07ee-49f0-939f-a1355a3d9657\") " pod="openshift-infra/auto-csr-approver-29561906-n5qcz" Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.264929 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74cfff99f-cnbmp_e8c45f18-80d3-466b-9abe-ebb64d80c285/dnsmasq-dns/0.log" Mar 17 02:26:00 crc 
kubenswrapper[4755]: I0317 02:26:00.362633 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtwdp\" (UniqueName: \"kubernetes.io/projected/69ae8f94-07ee-49f0-939f-a1355a3d9657-kube-api-access-gtwdp\") pod \"auto-csr-approver-29561906-n5qcz\" (UID: \"69ae8f94-07ee-49f0-939f-a1355a3d9657\") " pod="openshift-infra/auto-csr-approver-29561906-n5qcz" Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.382078 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtwdp\" (UniqueName: \"kubernetes.io/projected/69ae8f94-07ee-49f0-939f-a1355a3d9657-kube-api-access-gtwdp\") pod \"auto-csr-approver-29561906-n5qcz\" (UID: \"69ae8f94-07ee-49f0-939f-a1355a3d9657\") " pod="openshift-infra/auto-csr-approver-29561906-n5qcz" Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.418164 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d3953ffe-b583-483a-b3e4-8cb6393b09f7/glance-log/0.log" Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.506975 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3526ee99-7b67-44b5-8cc1-0d8731e68758/glance-httpd/0.log" Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.508952 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561906-n5qcz" Mar 17 02:26:00 crc kubenswrapper[4755]: I0317 02:26:00.548512 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3526ee99-7b67-44b5-8cc1-0d8731e68758/glance-log/0.log" Mar 17 02:26:01 crc kubenswrapper[4755]: I0317 02:26:01.094349 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561906-n5qcz"] Mar 17 02:26:01 crc kubenswrapper[4755]: I0317 02:26:01.101862 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 02:26:01 crc kubenswrapper[4755]: I0317 02:26:01.553700 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-6dccf8ffb7-fvtwz_70e53650-f3d6-4ec4-9b49-bf34ec724c01/heat-engine/0.log" Mar 17 02:26:01 crc kubenswrapper[4755]: I0317 02:26:01.658634 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54d5b659cb-h7mw4_18055ce1-2e32-41f8-8985-75bda9d75b01/horizon/0.log" Mar 17 02:26:01 crc kubenswrapper[4755]: I0317 02:26:01.782779 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-5b57d6bfb7-dfq4n_a03991a5-be95-4757-a3d0-4ce2fff4fdf5/heat-api/0.log" Mar 17 02:26:01 crc kubenswrapper[4755]: I0317 02:26:01.915202 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz_768f1228-6ea3-4601-a0e4-93911d1d4fa1/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:26:02 crc kubenswrapper[4755]: I0317 02:26:02.053642 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561906-n5qcz" event={"ID":"69ae8f94-07ee-49f0-939f-a1355a3d9657","Type":"ContainerStarted","Data":"c57ef49c9af0e964662bbc9a4f5b9a97b6cb4533c07c64639bb3136c4162c456"} Mar 17 02:26:02 crc kubenswrapper[4755]: I0317 02:26:02.127828 4755 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_horizon-54d5b659cb-h7mw4_18055ce1-2e32-41f8-8985-75bda9d75b01/horizon-log/0.log" Mar 17 02:26:02 crc kubenswrapper[4755]: I0317 02:26:02.135072 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-74f557fb5-t8sp4_59c5d35c-0a70-4965-b0b7-704028793d5e/heat-cfnapi/0.log" Mar 17 02:26:02 crc kubenswrapper[4755]: I0317 02:26:02.213203 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-bnr2r_654af424-4add-4b0f-97a6-896204b03483/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:26:02 crc kubenswrapper[4755]: I0317 02:26:02.375807 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29561821-f7mn2_fa97c87e-b133-48bb-af65-092be28ffca7/keystone-cron/0.log" Mar 17 02:26:02 crc kubenswrapper[4755]: I0317 02:26:02.507982 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29561881-fpzd5_0e541f43-cda6-4951-a0cd-f77cb49018fd/keystone-cron/0.log" Mar 17 02:26:02 crc kubenswrapper[4755]: I0317 02:26:02.676844 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_107b2153-2013-45e0-ad48-0f16e97d6d7e/kube-state-metrics/0.log" Mar 17 02:26:02 crc kubenswrapper[4755]: I0317 02:26:02.802633 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rjx85_e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:26:02 crc kubenswrapper[4755]: I0317 02:26:02.821272 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-57d7fc6d98-smgzp_b8c11156-3bb6-45fb-aea6-c00316f50ef4/keystone-api/0.log" Mar 17 02:26:02 crc kubenswrapper[4755]: I0317 02:26:02.858189 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-c5btn_7dc38b61-4933-487d-a05c-8ade6cd59270/logging-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:26:03 crc kubenswrapper[4755]: I0317 02:26:03.062863 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561906-n5qcz" event={"ID":"69ae8f94-07ee-49f0-939f-a1355a3d9657","Type":"ContainerStarted","Data":"03e2dfe075f5cac06a02bba96ad2a3d4386fd427e8e166cc37239b92e4999570"} Mar 17 02:26:03 crc kubenswrapper[4755]: I0317 02:26:03.089049 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561906-n5qcz" podStartSLOduration=2.142109705 podStartE2EDuration="3.0890226s" podCreationTimestamp="2026-03-17 02:26:00 +0000 UTC" firstStartedPulling="2026-03-17 02:26:01.100605369 +0000 UTC m=+7435.860057652" lastFinishedPulling="2026-03-17 02:26:02.047518254 +0000 UTC m=+7436.806970547" observedRunningTime="2026-03-17 02:26:03.074630292 +0000 UTC m=+7437.834082575" watchObservedRunningTime="2026-03-17 02:26:03.0890226 +0000 UTC m=+7437.848474903" Mar 17 02:26:03 crc kubenswrapper[4755]: I0317 02:26:03.102372 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_763b47f1-98b0-4ebc-970c-adfcac1aee29/manila-api/0.log" Mar 17 02:26:03 crc kubenswrapper[4755]: I0317 02:26:03.110672 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_763b47f1-98b0-4ebc-970c-adfcac1aee29/manila-api-log/0.log" Mar 17 02:26:03 crc kubenswrapper[4755]: I0317 02:26:03.251904 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_725e1c02-2eca-44c3-8147-8976b9742412/manila-scheduler/0.log" Mar 17 02:26:03 crc kubenswrapper[4755]: I0317 02:26:03.301963 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_725e1c02-2eca-44c3-8147-8976b9742412/probe/0.log" Mar 17 02:26:03 crc 
kubenswrapper[4755]: I0317 02:26:03.316725 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_788939c2-92b3-482c-8271-08204a569e10/probe/0.log" Mar 17 02:26:03 crc kubenswrapper[4755]: I0317 02:26:03.416867 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_788939c2-92b3-482c-8271-08204a569e10/manila-share/0.log" Mar 17 02:26:03 crc kubenswrapper[4755]: I0317 02:26:03.667661 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_6fb69f1a-4500-4441-a103-843887d04772/mysqld-exporter/0.log" Mar 17 02:26:04 crc kubenswrapper[4755]: I0317 02:26:04.005459 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28_ed749590-0c9f-4ed1-876f-d6e28f1e98d2/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:26:04 crc kubenswrapper[4755]: I0317 02:26:04.028451 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54c4999fb9-bx48f_096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3/neutron-api/0.log" Mar 17 02:26:04 crc kubenswrapper[4755]: I0317 02:26:04.072974 4755 generic.go:334] "Generic (PLEG): container finished" podID="69ae8f94-07ee-49f0-939f-a1355a3d9657" containerID="03e2dfe075f5cac06a02bba96ad2a3d4386fd427e8e166cc37239b92e4999570" exitCode=0 Mar 17 02:26:04 crc kubenswrapper[4755]: I0317 02:26:04.073034 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561906-n5qcz" event={"ID":"69ae8f94-07ee-49f0-939f-a1355a3d9657","Type":"ContainerDied","Data":"03e2dfe075f5cac06a02bba96ad2a3d4386fd427e8e166cc37239b92e4999570"} Mar 17 02:26:04 crc kubenswrapper[4755]: I0317 02:26:04.090338 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54c4999fb9-bx48f_096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3/neutron-httpd/0.log" Mar 17 02:26:04 crc kubenswrapper[4755]: I0317 02:26:04.627874 4755 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_49f91949-f009-4181-94d6-c07e2c7cc7fc/nova-cell0-conductor-conductor/0.log" Mar 17 02:26:04 crc kubenswrapper[4755]: I0317 02:26:04.888260 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8/nova-cell1-conductor-conductor/0.log" Mar 17 02:26:04 crc kubenswrapper[4755]: I0317 02:26:04.979066 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c316d4cb-fdc3-45e6-b679-14a04b2b32c1/nova-api-log/0.log" Mar 17 02:26:05 crc kubenswrapper[4755]: I0317 02:26:05.560616 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_56f6f5b8-c52e-4fa6-be5b-12510ca9348d/nova-cell1-novncproxy-novncproxy/0.log" Mar 17 02:26:05 crc kubenswrapper[4755]: I0317 02:26:05.560706 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n_d0c84d8b-60dc-4e23-a4be-83b81c52f10f/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:26:05 crc kubenswrapper[4755]: I0317 02:26:05.599621 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561906-n5qcz" Mar 17 02:26:05 crc kubenswrapper[4755]: I0317 02:26:05.773157 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtwdp\" (UniqueName: \"kubernetes.io/projected/69ae8f94-07ee-49f0-939f-a1355a3d9657-kube-api-access-gtwdp\") pod \"69ae8f94-07ee-49f0-939f-a1355a3d9657\" (UID: \"69ae8f94-07ee-49f0-939f-a1355a3d9657\") " Mar 17 02:26:05 crc kubenswrapper[4755]: I0317 02:26:05.783416 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ae8f94-07ee-49f0-939f-a1355a3d9657-kube-api-access-gtwdp" (OuterVolumeSpecName: "kube-api-access-gtwdp") pod "69ae8f94-07ee-49f0-939f-a1355a3d9657" (UID: "69ae8f94-07ee-49f0-939f-a1355a3d9657"). InnerVolumeSpecName "kube-api-access-gtwdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:26:05 crc kubenswrapper[4755]: I0317 02:26:05.822604 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c316d4cb-fdc3-45e6-b679-14a04b2b32c1/nova-api-api/0.log" Mar 17 02:26:05 crc kubenswrapper[4755]: I0317 02:26:05.875166 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtwdp\" (UniqueName: \"kubernetes.io/projected/69ae8f94-07ee-49f0-939f-a1355a3d9657-kube-api-access-gtwdp\") on node \"crc\" DevicePath \"\"" Mar 17 02:26:05 crc kubenswrapper[4755]: I0317 02:26:05.880911 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6b2b187e-bb8e-4934-a004-532ea37d2cf2/nova-metadata-log/0.log" Mar 17 02:26:06 crc kubenswrapper[4755]: I0317 02:26:06.093982 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561906-n5qcz" event={"ID":"69ae8f94-07ee-49f0-939f-a1355a3d9657","Type":"ContainerDied","Data":"c57ef49c9af0e964662bbc9a4f5b9a97b6cb4533c07c64639bb3136c4162c456"} Mar 17 02:26:06 crc kubenswrapper[4755]: I0317 02:26:06.094019 
4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c57ef49c9af0e964662bbc9a4f5b9a97b6cb4533c07c64639bb3136c4162c456" Mar 17 02:26:06 crc kubenswrapper[4755]: I0317 02:26:06.094024 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561906-n5qcz" Mar 17 02:26:06 crc kubenswrapper[4755]: I0317 02:26:06.129536 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dfea0511-e194-48c8-8795-58d07ada5d4c/mysql-bootstrap/0.log" Mar 17 02:26:06 crc kubenswrapper[4755]: I0317 02:26:06.160569 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561900-bn848"] Mar 17 02:26:06 crc kubenswrapper[4755]: I0317 02:26:06.170510 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561900-bn848"] Mar 17 02:26:06 crc kubenswrapper[4755]: I0317 02:26:06.193982 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b40ded55-f4e9-48b7-b8a6-16cda16d1c09/nova-scheduler-scheduler/0.log" Mar 17 02:26:06 crc kubenswrapper[4755]: I0317 02:26:06.285883 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b154988e-c167-408a-afc3-f18c6088686c" path="/var/lib/kubelet/pods/b154988e-c167-408a-afc3-f18c6088686c/volumes" Mar 17 02:26:06 crc kubenswrapper[4755]: I0317 02:26:06.317252 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dfea0511-e194-48c8-8795-58d07ada5d4c/mysql-bootstrap/0.log" Mar 17 02:26:06 crc kubenswrapper[4755]: I0317 02:26:06.506888 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dfea0511-e194-48c8-8795-58d07ada5d4c/galera/0.log" Mar 17 02:26:06 crc kubenswrapper[4755]: I0317 02:26:06.563854 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_e48be2ab-6e3e-4a75-b47e-e700bd4126f1/mysql-bootstrap/0.log" Mar 17 02:26:06 crc kubenswrapper[4755]: I0317 02:26:06.750505 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6b2b187e-bb8e-4934-a004-532ea37d2cf2/nova-metadata-metadata/0.log" Mar 17 02:26:06 crc kubenswrapper[4755]: I0317 02:26:06.779796 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e48be2ab-6e3e-4a75-b47e-e700bd4126f1/mysql-bootstrap/0.log" Mar 17 02:26:06 crc kubenswrapper[4755]: I0317 02:26:06.817444 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e48be2ab-6e3e-4a75-b47e-e700bd4126f1/galera/0.log" Mar 17 02:26:06 crc kubenswrapper[4755]: I0317 02:26:06.974335 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_007a5062-42e0-47ac-9523-a4d486614f70/openstackclient/0.log" Mar 17 02:26:07 crc kubenswrapper[4755]: I0317 02:26:07.064162 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-dvvpc_a6eae7bd-5007-4389-b4ab-7f296d0fa9ce/ovn-controller/0.log" Mar 17 02:26:07 crc kubenswrapper[4755]: I0317 02:26:07.219377 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-lccfn_71d7e3dc-df60-416b-add1-b7f55fd74d2d/openstack-network-exporter/0.log" Mar 17 02:26:07 crc kubenswrapper[4755]: I0317 02:26:07.337325 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bdvbb_6baf9f03-ea25-4498-9999-2ae741ba0b3a/ovsdb-server-init/0.log" Mar 17 02:26:07 crc kubenswrapper[4755]: I0317 02:26:07.550216 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bdvbb_6baf9f03-ea25-4498-9999-2ae741ba0b3a/ovsdb-server/0.log" Mar 17 02:26:07 crc kubenswrapper[4755]: I0317 02:26:07.558104 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-bdvbb_6baf9f03-ea25-4498-9999-2ae741ba0b3a/ovs-vswitchd/0.log" Mar 17 02:26:07 crc kubenswrapper[4755]: I0317 02:26:07.591585 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bdvbb_6baf9f03-ea25-4498-9999-2ae741ba0b3a/ovsdb-server-init/0.log" Mar 17 02:26:07 crc kubenswrapper[4755]: I0317 02:26:07.806668 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fx2nt_58680610-638b-4561-90d2-c13f1074a35b/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:26:07 crc kubenswrapper[4755]: I0317 02:26:07.863943 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b9c6a29f-013e-40dc-958a-05f36cb4e626/ovn-northd/0.log" Mar 17 02:26:07 crc kubenswrapper[4755]: I0317 02:26:07.867370 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b9c6a29f-013e-40dc-958a-05f36cb4e626/openstack-network-exporter/0.log" Mar 17 02:26:08 crc kubenswrapper[4755]: I0317 02:26:08.070064 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_54c4fe64-c4f8-4e77-9029-946580816bf7/openstack-network-exporter/0.log" Mar 17 02:26:08 crc kubenswrapper[4755]: I0317 02:26:08.112924 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_54c4fe64-c4f8-4e77-9029-946580816bf7/ovsdbserver-nb/0.log" Mar 17 02:26:08 crc kubenswrapper[4755]: I0317 02:26:08.213044 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d6e8532d-a845-4882-a690-09c072e39311/openstack-network-exporter/0.log" Mar 17 02:26:08 crc kubenswrapper[4755]: I0317 02:26:08.249743 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:26:08 crc kubenswrapper[4755]: E0317 02:26:08.250028 4755 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:26:08 crc kubenswrapper[4755]: I0317 02:26:08.274904 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d6e8532d-a845-4882-a690-09c072e39311/ovsdbserver-sb/0.log" Mar 17 02:26:08 crc kubenswrapper[4755]: I0317 02:26:08.581149 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5f66764f8d-z7959_abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a/placement-api/0.log" Mar 17 02:26:08 crc kubenswrapper[4755]: I0317 02:26:08.602019 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_c1ebdcce-406b-4668-a325-f1f4318b2d69/init-config-reloader/0.log" Mar 17 02:26:08 crc kubenswrapper[4755]: I0317 02:26:08.630373 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5f66764f8d-z7959_abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a/placement-log/0.log" Mar 17 02:26:08 crc kubenswrapper[4755]: I0317 02:26:08.752604 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_c1ebdcce-406b-4668-a325-f1f4318b2d69/config-reloader/0.log" Mar 17 02:26:08 crc kubenswrapper[4755]: I0317 02:26:08.834107 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_c1ebdcce-406b-4668-a325-f1f4318b2d69/init-config-reloader/0.log" Mar 17 02:26:08 crc kubenswrapper[4755]: I0317 02:26:08.839253 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_c1ebdcce-406b-4668-a325-f1f4318b2d69/prometheus/0.log" Mar 17 02:26:08 crc kubenswrapper[4755]: I0317 02:26:08.850413 
4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_c1ebdcce-406b-4668-a325-f1f4318b2d69/thanos-sidecar/0.log" Mar 17 02:26:09 crc kubenswrapper[4755]: I0317 02:26:09.041787 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c98112b6-4653-4c2e-a16e-6ddbd29fe526/setup-container/0.log" Mar 17 02:26:09 crc kubenswrapper[4755]: I0317 02:26:09.292188 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c98112b6-4653-4c2e-a16e-6ddbd29fe526/rabbitmq/0.log" Mar 17 02:26:09 crc kubenswrapper[4755]: I0317 02:26:09.339767 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c98112b6-4653-4c2e-a16e-6ddbd29fe526/setup-container/0.log" Mar 17 02:26:09 crc kubenswrapper[4755]: I0317 02:26:09.390231 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_232eeb12-0802-4513-83e2-66cc0b1b398b/setup-container/0.log" Mar 17 02:26:09 crc kubenswrapper[4755]: I0317 02:26:09.566633 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_232eeb12-0802-4513-83e2-66cc0b1b398b/setup-container/0.log" Mar 17 02:26:09 crc kubenswrapper[4755]: I0317 02:26:09.643406 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_232eeb12-0802-4513-83e2-66cc0b1b398b/rabbitmq/0.log" Mar 17 02:26:09 crc kubenswrapper[4755]: I0317 02:26:09.699498 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w_522fd7b5-ad67-4bb9-815e-239ab63e78c9/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:26:10 crc kubenswrapper[4755]: I0317 02:26:10.076696 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz_6f2a3043-b45c-43ea-a6fa-de300dee0390/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:26:10 crc kubenswrapper[4755]: I0317 02:26:10.200085 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-dkvkg_d3df52cf-6c5b-4e10-b055-d00d52e09156/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:26:10 crc kubenswrapper[4755]: I0317 02:26:10.279235 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-dbdqx_96c8e866-f764-4b94-b980-7b007ba5411c/ssh-known-hosts-edpm-deployment/0.log" Mar 17 02:26:10 crc kubenswrapper[4755]: I0317 02:26:10.425016 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5b9b5bb667-6pk7q_cfa93106-8e0c-4e7d-93cf-33d06c85d883/proxy-server/0.log" Mar 17 02:26:10 crc kubenswrapper[4755]: I0317 02:26:10.619212 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mh59s_9b280073-a793-4c35-a29b-d56ccf6037a7/swift-ring-rebalance/0.log" Mar 17 02:26:10 crc kubenswrapper[4755]: I0317 02:26:10.666792 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5b9b5bb667-6pk7q_cfa93106-8e0c-4e7d-93cf-33d06c85d883/proxy-httpd/0.log" Mar 17 02:26:10 crc kubenswrapper[4755]: I0317 02:26:10.749262 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/account-auditor/0.log" Mar 17 02:26:10 crc kubenswrapper[4755]: I0317 02:26:10.782181 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/account-reaper/0.log" Mar 17 02:26:10 crc kubenswrapper[4755]: I0317 02:26:10.944774 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/account-replicator/0.log" Mar 17 02:26:10 crc kubenswrapper[4755]: I0317 02:26:10.957027 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/account-server/0.log" Mar 17 02:26:10 crc kubenswrapper[4755]: I0317 02:26:10.986549 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/container-auditor/0.log" Mar 17 02:26:11 crc kubenswrapper[4755]: I0317 02:26:11.063405 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/container-replicator/0.log" Mar 17 02:26:11 crc kubenswrapper[4755]: I0317 02:26:11.154261 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/container-server/0.log" Mar 17 02:26:11 crc kubenswrapper[4755]: I0317 02:26:11.167753 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/container-updater/0.log" Mar 17 02:26:11 crc kubenswrapper[4755]: I0317 02:26:11.234684 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/object-auditor/0.log" Mar 17 02:26:11 crc kubenswrapper[4755]: I0317 02:26:11.305723 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/object-expirer/0.log" Mar 17 02:26:11 crc kubenswrapper[4755]: I0317 02:26:11.385016 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/object-replicator/0.log" Mar 17 02:26:11 crc kubenswrapper[4755]: I0317 02:26:11.405971 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/object-server/0.log" Mar 17 02:26:11 crc kubenswrapper[4755]: I0317 02:26:11.492421 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/rsync/0.log" Mar 17 02:26:11 crc kubenswrapper[4755]: I0317 02:26:11.499709 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/object-updater/0.log" Mar 17 02:26:11 crc kubenswrapper[4755]: I0317 02:26:11.590038 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/swift-recon-cron/0.log" Mar 17 02:26:11 crc kubenswrapper[4755]: I0317 02:26:11.775144 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h_dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:26:11 crc kubenswrapper[4755]: I0317 02:26:11.857267 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf_03684d66-3e86-4168-9a3a-62e40ba5ddce/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:26:12 crc kubenswrapper[4755]: I0317 02:26:12.082134 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_b612dcdd-df72-4c24-827f-44d916531556/test-operator-logs-container/0.log" Mar 17 02:26:12 crc kubenswrapper[4755]: I0317 02:26:12.257311 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p_d9b4d7d9-daed-448e-b3a8-4f528207e319/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:26:12 crc kubenswrapper[4755]: I0317 02:26:12.932163 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d/tempest-tests-tempest-tests-runner/0.log" Mar 17 02:26:21 crc kubenswrapper[4755]: I0317 02:26:21.113878 4755 scope.go:117] "RemoveContainer" containerID="392a17e9dc4804241545415bcd4ee2db61b2210aeb9f26b515db978c4668df77" Mar 17 02:26:22 crc kubenswrapper[4755]: I0317 02:26:22.249625 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:26:22 crc kubenswrapper[4755]: E0317 02:26:22.250330 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:26:24 crc kubenswrapper[4755]: I0317 02:26:24.998748 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_cf824bbf-6a94-4505-a9cb-67e9e394f2e1/memcached/0.log" Mar 17 02:26:36 crc kubenswrapper[4755]: I0317 02:26:36.263044 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:26:36 crc kubenswrapper[4755]: E0317 02:26:36.264175 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:26:45 crc kubenswrapper[4755]: I0317 02:26:45.013324 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q_27588406-a74c-454c-84be-38da41fe4737/util/0.log" Mar 17 02:26:45 crc kubenswrapper[4755]: I0317 02:26:45.175747 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q_27588406-a74c-454c-84be-38da41fe4737/util/0.log" Mar 17 02:26:45 crc kubenswrapper[4755]: I0317 02:26:45.181758 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q_27588406-a74c-454c-84be-38da41fe4737/pull/0.log" Mar 17 02:26:45 crc kubenswrapper[4755]: I0317 02:26:45.258534 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q_27588406-a74c-454c-84be-38da41fe4737/pull/0.log" Mar 17 02:26:45 crc kubenswrapper[4755]: I0317 02:26:45.380694 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q_27588406-a74c-454c-84be-38da41fe4737/util/0.log" Mar 17 02:26:45 crc kubenswrapper[4755]: I0317 02:26:45.420119 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q_27588406-a74c-454c-84be-38da41fe4737/pull/0.log" Mar 17 02:26:45 crc kubenswrapper[4755]: I0317 02:26:45.427926 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q_27588406-a74c-454c-84be-38da41fe4737/extract/0.log" Mar 17 02:26:45 crc kubenswrapper[4755]: I0317 02:26:45.626562 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-r84f9_3adfd998-aade-4343-8952-50b0eba8b510/manager/0.log" Mar 17 02:26:45 crc 
kubenswrapper[4755]: I0317 02:26:45.826883 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-fmtx9_00c5d701-8e74-44a0-9880-257001cb0062/manager/0.log" Mar 17 02:26:46 crc kubenswrapper[4755]: I0317 02:26:46.353447 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-x7lgv_e0809421-a91c-42c6-af2f-c8dc2ae7e856/manager/0.log" Mar 17 02:26:46 crc kubenswrapper[4755]: I0317 02:26:46.499777 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-ztp6j_13a9a76d-7b33-40eb-a7ec-5e5ff3c27705/manager/0.log" Mar 17 02:26:46 crc kubenswrapper[4755]: I0317 02:26:46.658168 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-8sd9w_a67ee100-af6d-492d-9a50-40fa8c59256b/manager/0.log" Mar 17 02:26:47 crc kubenswrapper[4755]: I0317 02:26:47.119321 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-zg9n4_9984519e-49f3-4af4-9c3b-d11af473a940/manager/0.log" Mar 17 02:26:47 crc kubenswrapper[4755]: I0317 02:26:47.308127 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-hx685_fea3f6b7-c840-4795-8ca2-9dba15a49df1/manager/0.log" Mar 17 02:26:47 crc kubenswrapper[4755]: I0317 02:26:47.322247 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-c9nzt_b5e695a0-9a52-46f0-8aae-3a4353bb3345/manager/0.log" Mar 17 02:26:47 crc kubenswrapper[4755]: I0317 02:26:47.383130 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-wvrth_4fb668bf-a188-428e-b9cc-0f3ff55070fd/manager/0.log" Mar 17 
02:26:47 crc kubenswrapper[4755]: I0317 02:26:47.595570 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-bcm7p_a9352105-fdd9-4cf9-b073-89a6eda036ab/manager/0.log" Mar 17 02:26:47 crc kubenswrapper[4755]: I0317 02:26:47.604590 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-kkr5x_ec169260-a79f-4a21-b78f-41fba2f8956e/manager/0.log" Mar 17 02:26:47 crc kubenswrapper[4755]: I0317 02:26:47.747055 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-csd45_a95d67b8-819c-481e-9e68-87276454b88a/manager/0.log" Mar 17 02:26:47 crc kubenswrapper[4755]: I0317 02:26:47.871100 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-8s5l4_3e3f09b9-2108-4341-9a51-6efee784ca0e/manager/0.log" Mar 17 02:26:47 crc kubenswrapper[4755]: I0317 02:26:47.974197 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-8gwqk_0f81862c-c403-445b-8030-083e914d31a7/manager/0.log" Mar 17 02:26:48 crc kubenswrapper[4755]: I0317 02:26:48.112570 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-kvfkt_c2f21978-13ea-4441-ba13-2be2beec2f0a/manager/0.log" Mar 17 02:26:48 crc kubenswrapper[4755]: I0317 02:26:48.287593 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-66cdd7cf4d-snfvp_ca5fc922-63bc-4052-844e-96e4a60e7ed4/operator/0.log" Mar 17 02:26:48 crc kubenswrapper[4755]: I0317 02:26:48.562328 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-h4b7d_d22580ef-38c7-4b1a-95a3-c6a7507ba05a/registry-server/0.log" Mar 17 02:26:48 crc kubenswrapper[4755]: I0317 02:26:48.727588 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-bttcs_cea62bda-461f-4bb3-870b-51b767dd2585/manager/0.log" Mar 17 02:26:48 crc kubenswrapper[4755]: I0317 02:26:48.953516 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-6pqsg_35b061e4-ec9c-46e3-828c-d787922370f9/manager/0.log" Mar 17 02:26:49 crc kubenswrapper[4755]: I0317 02:26:49.221193 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-blgfr_2d644d3f-351b-49ae-b1d6-c5ee0482ca29/operator/0.log" Mar 17 02:26:49 crc kubenswrapper[4755]: I0317 02:26:49.298026 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-4jtxc_eb7e1883-c95a-4d25-894d-be127f5d4cf3/manager/0.log" Mar 17 02:26:49 crc kubenswrapper[4755]: I0317 02:26:49.539946 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-zlwf9_5d879770-dc2d-4c14-a5e1-c80879235d96/manager/0.log" Mar 17 02:26:49 crc kubenswrapper[4755]: I0317 02:26:49.714545 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-54tw8_599a37c9-2c1a-4b46-8cfd-1e8c5ea709a4/manager/0.log" Mar 17 02:26:49 crc kubenswrapper[4755]: I0317 02:26:49.836745 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-549b96fcbd-bklr6_f8d91ffa-2ac3-4935-95bf-45f6ac41e030/manager/0.log" Mar 17 02:26:50 crc kubenswrapper[4755]: I0317 02:26:50.006847 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6d474745d9-7q6lg_eeea77af-84df-4778-8fe7-ddde0c1cda76/manager/0.log" Mar 17 02:26:50 crc kubenswrapper[4755]: I0317 02:26:50.249302 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:26:50 crc kubenswrapper[4755]: E0317 02:26:50.249554 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:27:02 crc kubenswrapper[4755]: I0317 02:27:02.248801 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:27:02 crc kubenswrapper[4755]: E0317 02:27:02.249814 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:27:11 crc kubenswrapper[4755]: I0317 02:27:11.759393 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dl89g_08e826b2-3275-49e6-b833-5494037aac5b/control-plane-machine-set-operator/0.log" Mar 17 02:27:11 crc kubenswrapper[4755]: I0317 02:27:11.908228 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8rqm6_9e12abed-0865-4f85-b563-ff72e5a05722/kube-rbac-proxy/0.log" Mar 17 02:27:11 crc kubenswrapper[4755]: I0317 02:27:11.978341 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8rqm6_9e12abed-0865-4f85-b563-ff72e5a05722/machine-api-operator/0.log" Mar 17 02:27:16 crc kubenswrapper[4755]: I0317 02:27:16.257651 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:27:16 crc kubenswrapper[4755]: E0317 02:27:16.258518 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:27:26 crc kubenswrapper[4755]: I0317 02:27:26.312868 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-7fzpr_1d430575-aa06-4c37-8262-d01a1d1766b7/cert-manager-controller/0.log" Mar 17 02:27:26 crc kubenswrapper[4755]: I0317 02:27:26.551541 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-676tx_2d8af759-0406-4291-b488-291e4db0f5ff/cert-manager-cainjector/0.log" Mar 17 02:27:26 crc kubenswrapper[4755]: I0317 02:27:26.612210 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-vtlkz_6725f6d6-96db-4e53-b4b7-b14e32c3160d/cert-manager-webhook/0.log" Mar 17 02:27:27 crc kubenswrapper[4755]: I0317 02:27:27.248174 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:27:27 crc 
kubenswrapper[4755]: E0317 02:27:27.248765 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:27:40 crc kubenswrapper[4755]: I0317 02:27:40.956989 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-6f24p_33b5469d-e555-44f7-8f84-dcea89debdae/nmstate-console-plugin/0.log" Mar 17 02:27:41 crc kubenswrapper[4755]: I0317 02:27:41.175314 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-wvzsg_74f876f9-73dd-42eb-bc3c-8aa4e6dc854c/nmstate-handler/0.log" Mar 17 02:27:41 crc kubenswrapper[4755]: I0317 02:27:41.196035 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-5ml2z_1632175c-4118-4c14-b3ef-59472c846d04/kube-rbac-proxy/0.log" Mar 17 02:27:41 crc kubenswrapper[4755]: I0317 02:27:41.278917 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-5ml2z_1632175c-4118-4c14-b3ef-59472c846d04/nmstate-metrics/0.log" Mar 17 02:27:41 crc kubenswrapper[4755]: I0317 02:27:41.376148 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-qkhgq_8905f140-6bfd-4be5-89dd-3db46bdcc933/nmstate-operator/0.log" Mar 17 02:27:41 crc kubenswrapper[4755]: I0317 02:27:41.464333 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-q892t_ae6a47ef-9bd5-4c94-8ae3-966b11c8506b/nmstate-webhook/0.log" Mar 17 02:27:42 crc kubenswrapper[4755]: I0317 02:27:42.248662 4755 scope.go:117] 
"RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:27:42 crc kubenswrapper[4755]: E0317 02:27:42.248930 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:27:53 crc kubenswrapper[4755]: I0317 02:27:53.249136 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:27:53 crc kubenswrapper[4755]: E0317 02:27:53.250176 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:27:55 crc kubenswrapper[4755]: I0317 02:27:55.248823 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7c87b9bff5-zjj4w_9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a/manager/0.log" Mar 17 02:27:55 crc kubenswrapper[4755]: I0317 02:27:55.263576 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7c87b9bff5-zjj4w_9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a/kube-rbac-proxy/0.log" Mar 17 02:28:00 crc kubenswrapper[4755]: I0317 02:28:00.151481 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561908-vxrxn"] Mar 17 02:28:00 crc kubenswrapper[4755]: E0317 
02:28:00.152580 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ae8f94-07ee-49f0-939f-a1355a3d9657" containerName="oc" Mar 17 02:28:00 crc kubenswrapper[4755]: I0317 02:28:00.152600 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ae8f94-07ee-49f0-939f-a1355a3d9657" containerName="oc" Mar 17 02:28:00 crc kubenswrapper[4755]: I0317 02:28:00.152862 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="69ae8f94-07ee-49f0-939f-a1355a3d9657" containerName="oc" Mar 17 02:28:00 crc kubenswrapper[4755]: I0317 02:28:00.153785 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561908-vxrxn" Mar 17 02:28:00 crc kubenswrapper[4755]: I0317 02:28:00.166974 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561908-vxrxn"] Mar 17 02:28:00 crc kubenswrapper[4755]: I0317 02:28:00.200364 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:28:00 crc kubenswrapper[4755]: I0317 02:28:00.200625 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:28:00 crc kubenswrapper[4755]: I0317 02:28:00.200780 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:28:00 crc kubenswrapper[4755]: I0317 02:28:00.304491 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n2w6\" (UniqueName: \"kubernetes.io/projected/300e3839-ff11-45c8-b162-6c3646fa4173-kube-api-access-4n2w6\") pod \"auto-csr-approver-29561908-vxrxn\" (UID: \"300e3839-ff11-45c8-b162-6c3646fa4173\") " pod="openshift-infra/auto-csr-approver-29561908-vxrxn" Mar 17 02:28:00 crc kubenswrapper[4755]: I0317 02:28:00.407045 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4n2w6\" (UniqueName: \"kubernetes.io/projected/300e3839-ff11-45c8-b162-6c3646fa4173-kube-api-access-4n2w6\") pod \"auto-csr-approver-29561908-vxrxn\" (UID: \"300e3839-ff11-45c8-b162-6c3646fa4173\") " pod="openshift-infra/auto-csr-approver-29561908-vxrxn" Mar 17 02:28:00 crc kubenswrapper[4755]: I0317 02:28:00.433039 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n2w6\" (UniqueName: \"kubernetes.io/projected/300e3839-ff11-45c8-b162-6c3646fa4173-kube-api-access-4n2w6\") pod \"auto-csr-approver-29561908-vxrxn\" (UID: \"300e3839-ff11-45c8-b162-6c3646fa4173\") " pod="openshift-infra/auto-csr-approver-29561908-vxrxn" Mar 17 02:28:00 crc kubenswrapper[4755]: I0317 02:28:00.517291 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561908-vxrxn" Mar 17 02:28:01 crc kubenswrapper[4755]: I0317 02:28:01.024240 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561908-vxrxn"] Mar 17 02:28:01 crc kubenswrapper[4755]: W0317 02:28:01.024877 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod300e3839_ff11_45c8_b162_6c3646fa4173.slice/crio-0ecc80444c13d322c6ab3ffb6c78277212908db232d261c976579593af00e2c5 WatchSource:0}: Error finding container 0ecc80444c13d322c6ab3ffb6c78277212908db232d261c976579593af00e2c5: Status 404 returned error can't find the container with id 0ecc80444c13d322c6ab3ffb6c78277212908db232d261c976579593af00e2c5 Mar 17 02:28:01 crc kubenswrapper[4755]: I0317 02:28:01.399121 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561908-vxrxn" event={"ID":"300e3839-ff11-45c8-b162-6c3646fa4173","Type":"ContainerStarted","Data":"0ecc80444c13d322c6ab3ffb6c78277212908db232d261c976579593af00e2c5"} Mar 17 02:28:02 crc kubenswrapper[4755]: I0317 02:28:02.410317 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561908-vxrxn" event={"ID":"300e3839-ff11-45c8-b162-6c3646fa4173","Type":"ContainerStarted","Data":"61da04e9e673b3c22b25ad8b4a49dd934213af28cf63d0d988a547b19b8951c0"} Mar 17 02:28:02 crc kubenswrapper[4755]: I0317 02:28:02.431238 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561908-vxrxn" podStartSLOduration=1.381777048 podStartE2EDuration="2.431219849s" podCreationTimestamp="2026-03-17 02:28:00 +0000 UTC" firstStartedPulling="2026-03-17 02:28:01.026950636 +0000 UTC m=+7555.786402939" lastFinishedPulling="2026-03-17 02:28:02.076393417 +0000 UTC m=+7556.835845740" observedRunningTime="2026-03-17 02:28:02.422255087 +0000 UTC m=+7557.181707370" watchObservedRunningTime="2026-03-17 02:28:02.431219849 +0000 UTC m=+7557.190672132" Mar 17 02:28:03 crc kubenswrapper[4755]: I0317 02:28:03.423582 4755 generic.go:334] "Generic (PLEG): container finished" podID="300e3839-ff11-45c8-b162-6c3646fa4173" containerID="61da04e9e673b3c22b25ad8b4a49dd934213af28cf63d0d988a547b19b8951c0" exitCode=0 Mar 17 02:28:03 crc kubenswrapper[4755]: I0317 02:28:03.423815 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561908-vxrxn" event={"ID":"300e3839-ff11-45c8-b162-6c3646fa4173","Type":"ContainerDied","Data":"61da04e9e673b3c22b25ad8b4a49dd934213af28cf63d0d988a547b19b8951c0"} Mar 17 02:28:04 crc kubenswrapper[4755]: I0317 02:28:04.926934 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561908-vxrxn" Mar 17 02:28:05 crc kubenswrapper[4755]: I0317 02:28:05.012481 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n2w6\" (UniqueName: \"kubernetes.io/projected/300e3839-ff11-45c8-b162-6c3646fa4173-kube-api-access-4n2w6\") pod \"300e3839-ff11-45c8-b162-6c3646fa4173\" (UID: \"300e3839-ff11-45c8-b162-6c3646fa4173\") " Mar 17 02:28:05 crc kubenswrapper[4755]: I0317 02:28:05.018888 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/300e3839-ff11-45c8-b162-6c3646fa4173-kube-api-access-4n2w6" (OuterVolumeSpecName: "kube-api-access-4n2w6") pod "300e3839-ff11-45c8-b162-6c3646fa4173" (UID: "300e3839-ff11-45c8-b162-6c3646fa4173"). InnerVolumeSpecName "kube-api-access-4n2w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:28:05 crc kubenswrapper[4755]: I0317 02:28:05.116212 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n2w6\" (UniqueName: \"kubernetes.io/projected/300e3839-ff11-45c8-b162-6c3646fa4173-kube-api-access-4n2w6\") on node \"crc\" DevicePath \"\"" Mar 17 02:28:05 crc kubenswrapper[4755]: I0317 02:28:05.453066 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561908-vxrxn" event={"ID":"300e3839-ff11-45c8-b162-6c3646fa4173","Type":"ContainerDied","Data":"0ecc80444c13d322c6ab3ffb6c78277212908db232d261c976579593af00e2c5"} Mar 17 02:28:05 crc kubenswrapper[4755]: I0317 02:28:05.453108 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ecc80444c13d322c6ab3ffb6c78277212908db232d261c976579593af00e2c5" Mar 17 02:28:05 crc kubenswrapper[4755]: I0317 02:28:05.454184 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561908-vxrxn" Mar 17 02:28:05 crc kubenswrapper[4755]: I0317 02:28:05.510365 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561902-5t5jh"] Mar 17 02:28:05 crc kubenswrapper[4755]: I0317 02:28:05.525650 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561902-5t5jh"] Mar 17 02:28:06 crc kubenswrapper[4755]: I0317 02:28:06.274723 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4733fbe-af6f-4e6c-89ea-afef79ac1ad0" path="/var/lib/kubelet/pods/e4733fbe-af6f-4e6c-89ea-afef79ac1ad0/volumes" Mar 17 02:28:08 crc kubenswrapper[4755]: I0317 02:28:08.249110 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:28:08 crc kubenswrapper[4755]: E0317 02:28:08.249726 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:28:10 crc kubenswrapper[4755]: I0317 02:28:10.251904 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-bvjlr_26acf5e2-72ee-4d4c-b25b-9d641f0a42df/prometheus-operator/0.log" Mar 17 02:28:10 crc kubenswrapper[4755]: I0317 02:28:10.468983 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-645f745c74-5xmqs_9d5530d5-e196-42d5-b0b9-c089b13d97a8/prometheus-operator-admission-webhook/0.log" Mar 17 02:28:10 crc kubenswrapper[4755]: I0317 02:28:10.537635 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-645f745c74-l8tqk_a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1/prometheus-operator-admission-webhook/0.log" Mar 17 02:28:10 crc kubenswrapper[4755]: I0317 02:28:10.678610 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-4w5t6_24b6289e-88b6-4958-9ce1-539cecddbd1f/operator/0.log" Mar 17 02:28:10 crc kubenswrapper[4755]: I0317 02:28:10.756542 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-2vdc8_96eaea54-65d7-475c-8d91-45ba95bd547a/observability-ui-dashboards/0.log" Mar 17 02:28:10 crc kubenswrapper[4755]: I0317 02:28:10.848673 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-czwsf_70c01555-4d7f-426f-a9a5-fd21462252dc/perses-operator/0.log" Mar 17 02:28:21 crc kubenswrapper[4755]: I0317 02:28:21.318166 4755 scope.go:117] "RemoveContainer" containerID="da4d76c7154954e1faee31e6ea0b1a9a080ad2557ab5e0f2a201dc64e077d522" Mar 17 02:28:22 crc kubenswrapper[4755]: I0317 02:28:22.248892 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:28:22 crc kubenswrapper[4755]: E0317 02:28:22.249927 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:28:28 crc kubenswrapper[4755]: I0317 02:28:28.444264 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_cluster-logging-operator-66689c4bbf-vb7pt_c40a3b37-e723-4031-be06-728785655b37/cluster-logging-operator/0.log" Mar 17 02:28:28 crc kubenswrapper[4755]: I0317 02:28:28.644812 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-tqf58_50d6f059-2e1c-4ac4-9952-dcbab62b23db/collector/0.log" Mar 17 02:28:28 crc kubenswrapper[4755]: I0317 02:28:28.709534 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_66514db5-2205-445e-b424-b55fb9910be3/loki-compactor/0.log" Mar 17 02:28:28 crc kubenswrapper[4755]: I0317 02:28:28.824398 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-9c6b6d984-crzs4_c413d841-c2b9-4757-bbe4-ebd965553d29/loki-distributor/0.log" Mar 17 02:28:28 crc kubenswrapper[4755]: I0317 02:28:28.940912 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-cdf4b6b4d-f2g7b_ad7de1cc-717a-4e3e-81f7-43c677c2db13/gateway/0.log" Mar 17 02:28:28 crc kubenswrapper[4755]: I0317 02:28:28.960132 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-cdf4b6b4d-f2g7b_ad7de1cc-717a-4e3e-81f7-43c677c2db13/opa/0.log" Mar 17 02:28:29 crc kubenswrapper[4755]: I0317 02:28:29.089894 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-cdf4b6b4d-jfm48_c1fe9206-5f28-4707-b175-12ba0fadb400/gateway/0.log" Mar 17 02:28:29 crc kubenswrapper[4755]: I0317 02:28:29.123400 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-cdf4b6b4d-jfm48_c1fe9206-5f28-4707-b175-12ba0fadb400/opa/0.log" Mar 17 02:28:29 crc kubenswrapper[4755]: I0317 02:28:29.211598 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_93847e12-81c9-4ae4-8090-e7df4bd5f9a7/loki-index-gateway/0.log" Mar 17 02:28:29 crc kubenswrapper[4755]: I0317 02:28:29.397281 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_c516523b-4c3b-4083-a8f5-18c9061c7032/loki-ingester/0.log" Mar 17 02:28:29 crc kubenswrapper[4755]: I0317 02:28:29.423056 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-6dcbdf8bb8-bh66s_85ac7711-fd0b-4598-93dd-6c591a532bac/loki-querier/0.log" Mar 17 02:28:29 crc kubenswrapper[4755]: I0317 02:28:29.575373 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-ff66c4dc9-95f86_296e81ca-7bf1-44f2-b1a8-bfb13a563134/loki-query-frontend/0.log" Mar 17 02:28:35 crc kubenswrapper[4755]: I0317 02:28:35.247888 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:28:35 crc kubenswrapper[4755]: E0317 02:28:35.248682 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:28:37 crc kubenswrapper[4755]: I0317 02:28:37.658398 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 17 02:28:46 crc kubenswrapper[4755]: I0317 02:28:46.434826 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-hfls9_c2bae47d-2436-490a-8998-6d1f1c59ff6d/kube-rbac-proxy/0.log" Mar 17 02:28:46 crc kubenswrapper[4755]: I0317 02:28:46.564728 4755 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-hfls9_c2bae47d-2436-490a-8998-6d1f1c59ff6d/controller/0.log" Mar 17 02:28:46 crc kubenswrapper[4755]: I0317 02:28:46.722544 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-frr-files/0.log" Mar 17 02:28:46 crc kubenswrapper[4755]: I0317 02:28:46.878651 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-reloader/0.log" Mar 17 02:28:46 crc kubenswrapper[4755]: I0317 02:28:46.935016 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-metrics/0.log" Mar 17 02:28:46 crc kubenswrapper[4755]: I0317 02:28:46.945342 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-frr-files/0.log" Mar 17 02:28:46 crc kubenswrapper[4755]: I0317 02:28:46.948111 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-reloader/0.log" Mar 17 02:28:47 crc kubenswrapper[4755]: I0317 02:28:47.116307 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-frr-files/0.log" Mar 17 02:28:47 crc kubenswrapper[4755]: I0317 02:28:47.142131 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-reloader/0.log" Mar 17 02:28:47 crc kubenswrapper[4755]: I0317 02:28:47.157195 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-metrics/0.log" Mar 17 02:28:47 crc kubenswrapper[4755]: I0317 02:28:47.195729 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-metrics/0.log" Mar 17 02:28:47 crc kubenswrapper[4755]: I0317 02:28:47.248990 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:28:47 crc kubenswrapper[4755]: E0317 02:28:47.249393 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:28:47 crc kubenswrapper[4755]: I0317 02:28:47.352695 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-reloader/0.log" Mar 17 02:28:47 crc kubenswrapper[4755]: I0317 02:28:47.384998 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-frr-files/0.log" Mar 17 02:28:47 crc kubenswrapper[4755]: I0317 02:28:47.390744 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-metrics/0.log" Mar 17 02:28:47 crc kubenswrapper[4755]: I0317 02:28:47.401895 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/controller/0.log" Mar 17 02:28:47 crc kubenswrapper[4755]: I0317 02:28:47.547563 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/frr-metrics/0.log" Mar 17 02:28:47 crc kubenswrapper[4755]: I0317 02:28:47.607325 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/kube-rbac-proxy-frr/0.log" Mar 17 02:28:47 crc kubenswrapper[4755]: I0317 02:28:47.652507 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/kube-rbac-proxy/0.log" Mar 17 02:28:47 crc kubenswrapper[4755]: I0317 02:28:47.775809 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/reloader/0.log" Mar 17 02:28:47 crc kubenswrapper[4755]: I0317 02:28:47.891616 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-gv4fl_7a47b75a-e6b5-493f-9ec6-8843b2724a32/frr-k8s-webhook-server/0.log" Mar 17 02:28:48 crc kubenswrapper[4755]: I0317 02:28:48.111284 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7fc56c47d5-cmtzx_4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9/manager/0.log" Mar 17 02:28:48 crc kubenswrapper[4755]: I0317 02:28:48.236565 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-578dbc6b-286wk_356ac706-4ec4-49b8-b270-6c8fa35b7d72/webhook-server/0.log" Mar 17 02:28:48 crc kubenswrapper[4755]: I0317 02:28:48.423639 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ct56p_07bf4fdd-648b-425f-8b00-7ad303c2b77f/kube-rbac-proxy/0.log" Mar 17 02:28:48 crc kubenswrapper[4755]: I0317 02:28:48.949719 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ct56p_07bf4fdd-648b-425f-8b00-7ad303c2b77f/speaker/0.log" Mar 17 02:28:50 crc kubenswrapper[4755]: I0317 02:28:50.001390 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/frr/0.log" Mar 17 02:28:59 crc kubenswrapper[4755]: I0317 02:28:59.248737 4755 scope.go:117] 
"RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:28:59 crc kubenswrapper[4755]: E0317 02:28:59.249347 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:29:04 crc kubenswrapper[4755]: I0317 02:29:04.614108 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22_53a160ea-9625-4d20-82ab-cae78c0c4911/util/0.log" Mar 17 02:29:04 crc kubenswrapper[4755]: I0317 02:29:04.929332 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22_53a160ea-9625-4d20-82ab-cae78c0c4911/pull/0.log" Mar 17 02:29:04 crc kubenswrapper[4755]: I0317 02:29:04.929677 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22_53a160ea-9625-4d20-82ab-cae78c0c4911/util/0.log" Mar 17 02:29:04 crc kubenswrapper[4755]: I0317 02:29:04.936549 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22_53a160ea-9625-4d20-82ab-cae78c0c4911/pull/0.log" Mar 17 02:29:05 crc kubenswrapper[4755]: I0317 02:29:05.104641 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22_53a160ea-9625-4d20-82ab-cae78c0c4911/util/0.log" Mar 17 02:29:05 crc kubenswrapper[4755]: I0317 02:29:05.112614 4755 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22_53a160ea-9625-4d20-82ab-cae78c0c4911/pull/0.log" Mar 17 02:29:05 crc kubenswrapper[4755]: I0317 02:29:05.154473 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22_53a160ea-9625-4d20-82ab-cae78c0c4911/extract/0.log" Mar 17 02:29:05 crc kubenswrapper[4755]: I0317 02:29:05.258504 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk_92740d40-a460-41c9-9f94-eaac2999c3f7/util/0.log" Mar 17 02:29:05 crc kubenswrapper[4755]: I0317 02:29:05.502218 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk_92740d40-a460-41c9-9f94-eaac2999c3f7/util/0.log" Mar 17 02:29:05 crc kubenswrapper[4755]: I0317 02:29:05.503860 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk_92740d40-a460-41c9-9f94-eaac2999c3f7/pull/0.log" Mar 17 02:29:05 crc kubenswrapper[4755]: I0317 02:29:05.537425 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk_92740d40-a460-41c9-9f94-eaac2999c3f7/pull/0.log" Mar 17 02:29:05 crc kubenswrapper[4755]: I0317 02:29:05.660462 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk_92740d40-a460-41c9-9f94-eaac2999c3f7/util/0.log" Mar 17 02:29:05 crc kubenswrapper[4755]: I0317 02:29:05.672683 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk_92740d40-a460-41c9-9f94-eaac2999c3f7/pull/0.log" Mar 17 02:29:05 crc kubenswrapper[4755]: I0317 02:29:05.733430 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk_92740d40-a460-41c9-9f94-eaac2999c3f7/extract/0.log" Mar 17 02:29:05 crc kubenswrapper[4755]: I0317 02:29:05.861578 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt_fb6ad00e-5b17-4cb6-898f-278fc16e8f31/util/0.log" Mar 17 02:29:06 crc kubenswrapper[4755]: I0317 02:29:06.065734 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt_fb6ad00e-5b17-4cb6-898f-278fc16e8f31/pull/0.log" Mar 17 02:29:06 crc kubenswrapper[4755]: I0317 02:29:06.084883 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt_fb6ad00e-5b17-4cb6-898f-278fc16e8f31/pull/0.log" Mar 17 02:29:06 crc kubenswrapper[4755]: I0317 02:29:06.104175 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt_fb6ad00e-5b17-4cb6-898f-278fc16e8f31/util/0.log" Mar 17 02:29:06 crc kubenswrapper[4755]: I0317 02:29:06.275689 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt_fb6ad00e-5b17-4cb6-898f-278fc16e8f31/util/0.log" Mar 17 02:29:06 crc kubenswrapper[4755]: I0317 02:29:06.276367 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt_fb6ad00e-5b17-4cb6-898f-278fc16e8f31/pull/0.log" Mar 17 
02:29:06 crc kubenswrapper[4755]: I0317 02:29:06.295930 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt_fb6ad00e-5b17-4cb6-898f-278fc16e8f31/extract/0.log" Mar 17 02:29:06 crc kubenswrapper[4755]: I0317 02:29:06.475466 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd_44b52216-5e23-4dd1-8a3a-32973449c58c/util/0.log" Mar 17 02:29:06 crc kubenswrapper[4755]: I0317 02:29:06.680981 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd_44b52216-5e23-4dd1-8a3a-32973449c58c/util/0.log" Mar 17 02:29:06 crc kubenswrapper[4755]: I0317 02:29:06.699264 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd_44b52216-5e23-4dd1-8a3a-32973449c58c/pull/0.log" Mar 17 02:29:06 crc kubenswrapper[4755]: I0317 02:29:06.703035 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd_44b52216-5e23-4dd1-8a3a-32973449c58c/pull/0.log" Mar 17 02:29:06 crc kubenswrapper[4755]: I0317 02:29:06.966476 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd_44b52216-5e23-4dd1-8a3a-32973449c58c/extract/0.log" Mar 17 02:29:07 crc kubenswrapper[4755]: I0317 02:29:07.057053 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd_44b52216-5e23-4dd1-8a3a-32973449c58c/util/0.log" Mar 17 02:29:07 crc kubenswrapper[4755]: I0317 02:29:07.098665 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd_44b52216-5e23-4dd1-8a3a-32973449c58c/pull/0.log" Mar 17 02:29:07 crc kubenswrapper[4755]: I0317 02:29:07.211941 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c_f6050c97-e228-485e-9b2e-e04588fff1aa/util/0.log" Mar 17 02:29:07 crc kubenswrapper[4755]: I0317 02:29:07.366606 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c_f6050c97-e228-485e-9b2e-e04588fff1aa/pull/0.log" Mar 17 02:29:07 crc kubenswrapper[4755]: I0317 02:29:07.375401 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c_f6050c97-e228-485e-9b2e-e04588fff1aa/util/0.log" Mar 17 02:29:07 crc kubenswrapper[4755]: I0317 02:29:07.461233 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c_f6050c97-e228-485e-9b2e-e04588fff1aa/pull/0.log" Mar 17 02:29:07 crc kubenswrapper[4755]: I0317 02:29:07.843939 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c_f6050c97-e228-485e-9b2e-e04588fff1aa/pull/0.log" Mar 17 02:29:07 crc kubenswrapper[4755]: I0317 02:29:07.916851 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c_f6050c97-e228-485e-9b2e-e04588fff1aa/util/0.log" Mar 17 02:29:07 crc kubenswrapper[4755]: I0317 02:29:07.921464 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c_f6050c97-e228-485e-9b2e-e04588fff1aa/extract/0.log" Mar 17 
02:29:08 crc kubenswrapper[4755]: I0317 02:29:08.055105 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-54fgj_77550708-f4d6-4bd2-901a-bad2b1813e2b/extract-utilities/0.log" Mar 17 02:29:08 crc kubenswrapper[4755]: I0317 02:29:08.218055 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-54fgj_77550708-f4d6-4bd2-901a-bad2b1813e2b/extract-content/0.log" Mar 17 02:29:08 crc kubenswrapper[4755]: I0317 02:29:08.257754 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-54fgj_77550708-f4d6-4bd2-901a-bad2b1813e2b/extract-content/0.log" Mar 17 02:29:08 crc kubenswrapper[4755]: I0317 02:29:08.280665 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-54fgj_77550708-f4d6-4bd2-901a-bad2b1813e2b/extract-utilities/0.log" Mar 17 02:29:08 crc kubenswrapper[4755]: I0317 02:29:08.514085 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-54fgj_77550708-f4d6-4bd2-901a-bad2b1813e2b/extract-utilities/0.log" Mar 17 02:29:08 crc kubenswrapper[4755]: I0317 02:29:08.589736 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4xgxq_30c7719a-2835-48a7-a0d7-6fc05b2f0e99/extract-utilities/0.log" Mar 17 02:29:08 crc kubenswrapper[4755]: I0317 02:29:08.593146 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-54fgj_77550708-f4d6-4bd2-901a-bad2b1813e2b/extract-content/0.log" Mar 17 02:29:08 crc kubenswrapper[4755]: I0317 02:29:08.819999 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-54fgj_77550708-f4d6-4bd2-901a-bad2b1813e2b/registry-server/0.log" Mar 17 02:29:08 crc kubenswrapper[4755]: I0317 02:29:08.822555 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-4xgxq_30c7719a-2835-48a7-a0d7-6fc05b2f0e99/extract-utilities/0.log" Mar 17 02:29:08 crc kubenswrapper[4755]: I0317 02:29:08.847212 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4xgxq_30c7719a-2835-48a7-a0d7-6fc05b2f0e99/extract-content/0.log" Mar 17 02:29:08 crc kubenswrapper[4755]: I0317 02:29:08.874520 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4xgxq_30c7719a-2835-48a7-a0d7-6fc05b2f0e99/extract-content/0.log" Mar 17 02:29:09 crc kubenswrapper[4755]: I0317 02:29:09.064283 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4xgxq_30c7719a-2835-48a7-a0d7-6fc05b2f0e99/extract-utilities/0.log" Mar 17 02:29:09 crc kubenswrapper[4755]: I0317 02:29:09.073846 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-mvsj8_277ca8b8-67f5-4fdb-ad34-648ad653fa5d/marketplace-operator/0.log" Mar 17 02:29:09 crc kubenswrapper[4755]: I0317 02:29:09.105067 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4xgxq_30c7719a-2835-48a7-a0d7-6fc05b2f0e99/extract-content/0.log" Mar 17 02:29:09 crc kubenswrapper[4755]: I0317 02:29:09.325028 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p727d_ac12f55c-136b-4cf3-aae6-dca7f5353189/extract-utilities/0.log" Mar 17 02:29:09 crc kubenswrapper[4755]: I0317 02:29:09.329430 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4xgxq_30c7719a-2835-48a7-a0d7-6fc05b2f0e99/registry-server/0.log" Mar 17 02:29:09 crc kubenswrapper[4755]: I0317 02:29:09.503605 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-p727d_ac12f55c-136b-4cf3-aae6-dca7f5353189/extract-utilities/0.log" Mar 17 02:29:09 crc kubenswrapper[4755]: I0317 02:29:09.530407 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p727d_ac12f55c-136b-4cf3-aae6-dca7f5353189/extract-content/0.log" Mar 17 02:29:09 crc kubenswrapper[4755]: I0317 02:29:09.710465 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p727d_ac12f55c-136b-4cf3-aae6-dca7f5353189/extract-content/0.log" Mar 17 02:29:09 crc kubenswrapper[4755]: I0317 02:29:09.744334 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p727d_ac12f55c-136b-4cf3-aae6-dca7f5353189/extract-utilities/0.log" Mar 17 02:29:09 crc kubenswrapper[4755]: I0317 02:29:09.805974 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chf29_1add56c9-cfd8-4fa3-b532-e4b952f36683/extract-utilities/0.log" Mar 17 02:29:09 crc kubenswrapper[4755]: I0317 02:29:09.961607 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p727d_ac12f55c-136b-4cf3-aae6-dca7f5353189/registry-server/0.log" Mar 17 02:29:10 crc kubenswrapper[4755]: I0317 02:29:10.036404 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chf29_1add56c9-cfd8-4fa3-b532-e4b952f36683/extract-content/0.log" Mar 17 02:29:10 crc kubenswrapper[4755]: I0317 02:29:10.041919 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chf29_1add56c9-cfd8-4fa3-b532-e4b952f36683/extract-content/0.log" 
Mar 17 02:29:10 crc kubenswrapper[4755]: I0317 02:29:10.067279 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chf29_1add56c9-cfd8-4fa3-b532-e4b952f36683/extract-utilities/0.log" Mar 17 02:29:10 crc kubenswrapper[4755]: I0317 02:29:10.227258 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chf29_1add56c9-cfd8-4fa3-b532-e4b952f36683/extract-utilities/0.log" Mar 17 02:29:10 crc kubenswrapper[4755]: I0317 02:29:10.252237 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chf29_1add56c9-cfd8-4fa3-b532-e4b952f36683/extract-content/0.log" Mar 17 02:29:10 crc kubenswrapper[4755]: I0317 02:29:10.754498 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chf29_1add56c9-cfd8-4fa3-b532-e4b952f36683/registry-server/0.log" Mar 17 02:29:11 crc kubenswrapper[4755]: I0317 02:29:11.248534 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:29:11 crc kubenswrapper[4755]: E0317 02:29:11.249105 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:29:22 crc kubenswrapper[4755]: I0317 02:29:22.249353 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:29:22 crc kubenswrapper[4755]: E0317 02:29:22.250539 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:29:24 crc kubenswrapper[4755]: I0317 02:29:24.701876 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-bvjlr_26acf5e2-72ee-4d4c-b25b-9d641f0a42df/prometheus-operator/0.log" Mar 17 02:29:24 crc kubenswrapper[4755]: I0317 02:29:24.752950 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-645f745c74-l8tqk_a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1/prometheus-operator-admission-webhook/0.log" Mar 17 02:29:24 crc kubenswrapper[4755]: I0317 02:29:24.753581 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-645f745c74-5xmqs_9d5530d5-e196-42d5-b0b9-c089b13d97a8/prometheus-operator-admission-webhook/0.log" Mar 17 02:29:24 crc kubenswrapper[4755]: I0317 02:29:24.901375 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-4w5t6_24b6289e-88b6-4958-9ce1-539cecddbd1f/operator/0.log" Mar 17 02:29:25 crc kubenswrapper[4755]: I0317 02:29:25.096878 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-2vdc8_96eaea54-65d7-475c-8d91-45ba95bd547a/observability-ui-dashboards/0.log" Mar 17 02:29:25 crc kubenswrapper[4755]: I0317 02:29:25.156413 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-czwsf_70c01555-4d7f-426f-a9a5-fd21462252dc/perses-operator/0.log" Mar 17 02:29:36 crc kubenswrapper[4755]: I0317 02:29:36.257289 4755 scope.go:117] "RemoveContainer" 
containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:29:36 crc kubenswrapper[4755]: E0317 02:29:36.258231 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:29:40 crc kubenswrapper[4755]: I0317 02:29:40.877127 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7c87b9bff5-zjj4w_9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a/kube-rbac-proxy/0.log" Mar 17 02:29:40 crc kubenswrapper[4755]: I0317 02:29:40.953186 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7c87b9bff5-zjj4w_9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a/manager/0.log" Mar 17 02:29:50 crc kubenswrapper[4755]: I0317 02:29:50.249137 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:29:50 crc kubenswrapper[4755]: E0317 02:29:50.249860 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:29:52 crc kubenswrapper[4755]: I0317 02:29:52.875978 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jkkqn"] Mar 17 02:29:52 crc kubenswrapper[4755]: E0317 02:29:52.877455 
4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300e3839-ff11-45c8-b162-6c3646fa4173" containerName="oc" Mar 17 02:29:52 crc kubenswrapper[4755]: I0317 02:29:52.877470 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="300e3839-ff11-45c8-b162-6c3646fa4173" containerName="oc" Mar 17 02:29:52 crc kubenswrapper[4755]: I0317 02:29:52.877730 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="300e3839-ff11-45c8-b162-6c3646fa4173" containerName="oc" Mar 17 02:29:52 crc kubenswrapper[4755]: I0317 02:29:52.879473 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jkkqn" Mar 17 02:29:52 crc kubenswrapper[4755]: I0317 02:29:52.925608 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkkqn"] Mar 17 02:29:53 crc kubenswrapper[4755]: I0317 02:29:53.006430 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-catalog-content\") pod \"redhat-marketplace-jkkqn\" (UID: \"1cd9be9b-474a-4646-9996-b4f5ad8b4f69\") " pod="openshift-marketplace/redhat-marketplace-jkkqn" Mar 17 02:29:53 crc kubenswrapper[4755]: I0317 02:29:53.006608 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpfpz\" (UniqueName: \"kubernetes.io/projected/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-kube-api-access-mpfpz\") pod \"redhat-marketplace-jkkqn\" (UID: \"1cd9be9b-474a-4646-9996-b4f5ad8b4f69\") " pod="openshift-marketplace/redhat-marketplace-jkkqn" Mar 17 02:29:53 crc kubenswrapper[4755]: I0317 02:29:53.006634 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-utilities\") pod 
\"redhat-marketplace-jkkqn\" (UID: \"1cd9be9b-474a-4646-9996-b4f5ad8b4f69\") " pod="openshift-marketplace/redhat-marketplace-jkkqn" Mar 17 02:29:53 crc kubenswrapper[4755]: I0317 02:29:53.107997 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-catalog-content\") pod \"redhat-marketplace-jkkqn\" (UID: \"1cd9be9b-474a-4646-9996-b4f5ad8b4f69\") " pod="openshift-marketplace/redhat-marketplace-jkkqn" Mar 17 02:29:53 crc kubenswrapper[4755]: I0317 02:29:53.108142 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpfpz\" (UniqueName: \"kubernetes.io/projected/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-kube-api-access-mpfpz\") pod \"redhat-marketplace-jkkqn\" (UID: \"1cd9be9b-474a-4646-9996-b4f5ad8b4f69\") " pod="openshift-marketplace/redhat-marketplace-jkkqn" Mar 17 02:29:53 crc kubenswrapper[4755]: I0317 02:29:53.108166 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-utilities\") pod \"redhat-marketplace-jkkqn\" (UID: \"1cd9be9b-474a-4646-9996-b4f5ad8b4f69\") " pod="openshift-marketplace/redhat-marketplace-jkkqn" Mar 17 02:29:53 crc kubenswrapper[4755]: I0317 02:29:53.108986 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-catalog-content\") pod \"redhat-marketplace-jkkqn\" (UID: \"1cd9be9b-474a-4646-9996-b4f5ad8b4f69\") " pod="openshift-marketplace/redhat-marketplace-jkkqn" Mar 17 02:29:53 crc kubenswrapper[4755]: I0317 02:29:53.110579 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-utilities\") pod \"redhat-marketplace-jkkqn\" (UID: 
\"1cd9be9b-474a-4646-9996-b4f5ad8b4f69\") " pod="openshift-marketplace/redhat-marketplace-jkkqn" Mar 17 02:29:53 crc kubenswrapper[4755]: I0317 02:29:53.153690 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpfpz\" (UniqueName: \"kubernetes.io/projected/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-kube-api-access-mpfpz\") pod \"redhat-marketplace-jkkqn\" (UID: \"1cd9be9b-474a-4646-9996-b4f5ad8b4f69\") " pod="openshift-marketplace/redhat-marketplace-jkkqn" Mar 17 02:29:53 crc kubenswrapper[4755]: I0317 02:29:53.215338 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jkkqn" Mar 17 02:29:54 crc kubenswrapper[4755]: I0317 02:29:54.363324 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkkqn"] Mar 17 02:29:54 crc kubenswrapper[4755]: I0317 02:29:54.616401 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkkqn" event={"ID":"1cd9be9b-474a-4646-9996-b4f5ad8b4f69","Type":"ContainerStarted","Data":"9947217cc189040592a94ea0e951f2a0475b37b5eed8ca73b07654ef5ed97a36"} Mar 17 02:29:54 crc kubenswrapper[4755]: I0317 02:29:54.616620 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkkqn" event={"ID":"1cd9be9b-474a-4646-9996-b4f5ad8b4f69","Type":"ContainerStarted","Data":"7520be0a3a3e9f8bb0d0db54614b44b4d82a39183cef1712487ac57b6c98abaa"} Mar 17 02:29:55 crc kubenswrapper[4755]: I0317 02:29:55.626788 4755 generic.go:334] "Generic (PLEG): container finished" podID="1cd9be9b-474a-4646-9996-b4f5ad8b4f69" containerID="9947217cc189040592a94ea0e951f2a0475b37b5eed8ca73b07654ef5ed97a36" exitCode=0 Mar 17 02:29:55 crc kubenswrapper[4755]: I0317 02:29:55.626827 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkkqn" 
event={"ID":"1cd9be9b-474a-4646-9996-b4f5ad8b4f69","Type":"ContainerDied","Data":"9947217cc189040592a94ea0e951f2a0475b37b5eed8ca73b07654ef5ed97a36"} Mar 17 02:29:56 crc kubenswrapper[4755]: I0317 02:29:56.637632 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkkqn" event={"ID":"1cd9be9b-474a-4646-9996-b4f5ad8b4f69","Type":"ContainerStarted","Data":"ff77450654c3a0da1561967e95519d287bd4df564e314b0ef960521265600d71"} Mar 17 02:29:57 crc kubenswrapper[4755]: I0317 02:29:57.649107 4755 generic.go:334] "Generic (PLEG): container finished" podID="1cd9be9b-474a-4646-9996-b4f5ad8b4f69" containerID="ff77450654c3a0da1561967e95519d287bd4df564e314b0ef960521265600d71" exitCode=0 Mar 17 02:29:57 crc kubenswrapper[4755]: I0317 02:29:57.649847 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkkqn" event={"ID":"1cd9be9b-474a-4646-9996-b4f5ad8b4f69","Type":"ContainerDied","Data":"ff77450654c3a0da1561967e95519d287bd4df564e314b0ef960521265600d71"} Mar 17 02:29:58 crc kubenswrapper[4755]: I0317 02:29:58.661927 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkkqn" event={"ID":"1cd9be9b-474a-4646-9996-b4f5ad8b4f69","Type":"ContainerStarted","Data":"29e9c707505ae38cff1d2cc63ded055b3217a78366e1d70fb2986c24ee1df8e0"} Mar 17 02:29:58 crc kubenswrapper[4755]: I0317 02:29:58.716301 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jkkqn" podStartSLOduration=4.30692025 podStartE2EDuration="6.716281537s" podCreationTimestamp="2026-03-17 02:29:52 +0000 UTC" firstStartedPulling="2026-03-17 02:29:55.629474485 +0000 UTC m=+7670.388926768" lastFinishedPulling="2026-03-17 02:29:58.038835782 +0000 UTC m=+7672.798288055" observedRunningTime="2026-03-17 02:29:58.708527548 +0000 UTC m=+7673.467979831" watchObservedRunningTime="2026-03-17 02:29:58.716281537 +0000 UTC 
m=+7673.475733820" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.151112 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561910-ngp4q"] Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.152665 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561910-ngp4q" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.155182 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.155250 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.156812 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.163268 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561910-ngp4q"] Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.245482 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72"] Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.248168 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.250844 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.251020 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.259418 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72"] Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.261604 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qp55\" (UniqueName: \"kubernetes.io/projected/98384819-daff-4e77-9f4b-ad906cda49e9-kube-api-access-4qp55\") pod \"auto-csr-approver-29561910-ngp4q\" (UID: \"98384819-daff-4e77-9f4b-ad906cda49e9\") " pod="openshift-infra/auto-csr-approver-29561910-ngp4q" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.363672 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm789\" (UniqueName: \"kubernetes.io/projected/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-kube-api-access-vm789\") pod \"collect-profiles-29561910-cnj72\" (UID: \"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.364009 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qp55\" (UniqueName: \"kubernetes.io/projected/98384819-daff-4e77-9f4b-ad906cda49e9-kube-api-access-4qp55\") pod \"auto-csr-approver-29561910-ngp4q\" (UID: \"98384819-daff-4e77-9f4b-ad906cda49e9\") " 
pod="openshift-infra/auto-csr-approver-29561910-ngp4q" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.364230 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-config-volume\") pod \"collect-profiles-29561910-cnj72\" (UID: \"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.364268 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-secret-volume\") pod \"collect-profiles-29561910-cnj72\" (UID: \"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.385369 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qp55\" (UniqueName: \"kubernetes.io/projected/98384819-daff-4e77-9f4b-ad906cda49e9-kube-api-access-4qp55\") pod \"auto-csr-approver-29561910-ngp4q\" (UID: \"98384819-daff-4e77-9f4b-ad906cda49e9\") " pod="openshift-infra/auto-csr-approver-29561910-ngp4q" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.466289 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm789\" (UniqueName: \"kubernetes.io/projected/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-kube-api-access-vm789\") pod \"collect-profiles-29561910-cnj72\" (UID: \"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.466494 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-config-volume\") pod \"collect-profiles-29561910-cnj72\" (UID: \"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.466516 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-secret-volume\") pod \"collect-profiles-29561910-cnj72\" (UID: \"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.467336 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-config-volume\") pod \"collect-profiles-29561910-cnj72\" (UID: \"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.468911 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561910-ngp4q" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.471193 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-secret-volume\") pod \"collect-profiles-29561910-cnj72\" (UID: \"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.482301 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm789\" (UniqueName: \"kubernetes.io/projected/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-kube-api-access-vm789\") pod \"collect-profiles-29561910-cnj72\" (UID: \"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72" Mar 17 02:30:00 crc kubenswrapper[4755]: I0317 02:30:00.568498 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72" Mar 17 02:30:01 crc kubenswrapper[4755]: I0317 02:30:01.040065 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72"] Mar 17 02:30:01 crc kubenswrapper[4755]: W0317 02:30:01.040296 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac7ea7d_955f_413f_8733_b6d88c1dfbc4.slice/crio-c6f7a04ceb530a2981c321f0628abf6e50b292373195430beb0bbb63872049f2 WatchSource:0}: Error finding container c6f7a04ceb530a2981c321f0628abf6e50b292373195430beb0bbb63872049f2: Status 404 returned error can't find the container with id c6f7a04ceb530a2981c321f0628abf6e50b292373195430beb0bbb63872049f2 Mar 17 02:30:01 crc kubenswrapper[4755]: W0317 02:30:01.043503 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98384819_daff_4e77_9f4b_ad906cda49e9.slice/crio-07c596a2c426035855b46d842641368eb703e740c9bb1186e9a64723fa78182d WatchSource:0}: Error finding container 07c596a2c426035855b46d842641368eb703e740c9bb1186e9a64723fa78182d: Status 404 returned error can't find the container with id 07c596a2c426035855b46d842641368eb703e740c9bb1186e9a64723fa78182d Mar 17 02:30:01 crc kubenswrapper[4755]: I0317 02:30:01.051024 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561910-ngp4q"] Mar 17 02:30:01 crc kubenswrapper[4755]: I0317 02:30:01.692990 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561910-ngp4q" event={"ID":"98384819-daff-4e77-9f4b-ad906cda49e9","Type":"ContainerStarted","Data":"07c596a2c426035855b46d842641368eb703e740c9bb1186e9a64723fa78182d"} Mar 17 02:30:01 crc kubenswrapper[4755]: I0317 02:30:01.694860 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72" event={"ID":"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4","Type":"ContainerStarted","Data":"4cc3c99dd8ba67551d440d6154f1cab73c456172bb4d9ded2025392283ac6967"} Mar 17 02:30:01 crc kubenswrapper[4755]: I0317 02:30:01.694884 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72" event={"ID":"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4","Type":"ContainerStarted","Data":"c6f7a04ceb530a2981c321f0628abf6e50b292373195430beb0bbb63872049f2"} Mar 17 02:30:02 crc kubenswrapper[4755]: I0317 02:30:02.706364 4755 generic.go:334] "Generic (PLEG): container finished" podID="2ac7ea7d-955f-413f-8733-b6d88c1dfbc4" containerID="4cc3c99dd8ba67551d440d6154f1cab73c456172bb4d9ded2025392283ac6967" exitCode=0 Mar 17 02:30:02 crc kubenswrapper[4755]: I0317 02:30:02.706428 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72" event={"ID":"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4","Type":"ContainerDied","Data":"4cc3c99dd8ba67551d440d6154f1cab73c456172bb4d9ded2025392283ac6967"} Mar 17 02:30:03 crc kubenswrapper[4755]: I0317 02:30:03.216721 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jkkqn" Mar 17 02:30:03 crc kubenswrapper[4755]: I0317 02:30:03.216771 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jkkqn" Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.181243 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72" Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.255031 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:30:04 crc kubenswrapper[4755]: E0317 02:30:04.255293 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.269299 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-jkkqn" podUID="1cd9be9b-474a-4646-9996-b4f5ad8b4f69" containerName="registry-server" probeResult="failure" output=< Mar 17 02:30:04 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:30:04 crc kubenswrapper[4755]: > Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.345974 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm789\" (UniqueName: \"kubernetes.io/projected/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-kube-api-access-vm789\") pod \"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4\" (UID: \"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4\") " Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.346058 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-secret-volume\") pod \"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4\" (UID: \"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4\") " Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.346220 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-config-volume\") pod \"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4\" (UID: \"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4\") " Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.348425 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-config-volume" (OuterVolumeSpecName: "config-volume") pod "2ac7ea7d-955f-413f-8733-b6d88c1dfbc4" (UID: "2ac7ea7d-955f-413f-8733-b6d88c1dfbc4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.384006 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2ac7ea7d-955f-413f-8733-b6d88c1dfbc4" (UID: "2ac7ea7d-955f-413f-8733-b6d88c1dfbc4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.384286 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-kube-api-access-vm789" (OuterVolumeSpecName: "kube-api-access-vm789") pod "2ac7ea7d-955f-413f-8733-b6d88c1dfbc4" (UID: "2ac7ea7d-955f-413f-8733-b6d88c1dfbc4"). InnerVolumeSpecName "kube-api-access-vm789". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.448851 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm789\" (UniqueName: \"kubernetes.io/projected/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-kube-api-access-vm789\") on node \"crc\" DevicePath \"\"" Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.448893 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.448910 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ac7ea7d-955f-413f-8733-b6d88c1dfbc4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.727106 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72" event={"ID":"2ac7ea7d-955f-413f-8733-b6d88c1dfbc4","Type":"ContainerDied","Data":"c6f7a04ceb530a2981c321f0628abf6e50b292373195430beb0bbb63872049f2"} Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.727595 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6f7a04ceb530a2981c321f0628abf6e50b292373195430beb0bbb63872049f2" Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.728348 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561910-cnj72" Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.729188 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561910-ngp4q" event={"ID":"98384819-daff-4e77-9f4b-ad906cda49e9","Type":"ContainerStarted","Data":"74804751037ee643935787dddf6bdff22015ec09987b66ec1819d859a1eb860b"} Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.793105 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561910-ngp4q" podStartSLOduration=2.991675444 podStartE2EDuration="4.793084531s" podCreationTimestamp="2026-03-17 02:30:00 +0000 UTC" firstStartedPulling="2026-03-17 02:30:01.060580959 +0000 UTC m=+7675.820033242" lastFinishedPulling="2026-03-17 02:30:02.861990046 +0000 UTC m=+7677.621442329" observedRunningTime="2026-03-17 02:30:04.747788729 +0000 UTC m=+7679.507241002" watchObservedRunningTime="2026-03-17 02:30:04.793084531 +0000 UTC m=+7679.552536814" Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.808097 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59"] Mar 17 02:30:04 crc kubenswrapper[4755]: I0317 02:30:04.817949 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561865-mqc59"] Mar 17 02:30:05 crc kubenswrapper[4755]: I0317 02:30:05.739345 4755 generic.go:334] "Generic (PLEG): container finished" podID="98384819-daff-4e77-9f4b-ad906cda49e9" containerID="74804751037ee643935787dddf6bdff22015ec09987b66ec1819d859a1eb860b" exitCode=0 Mar 17 02:30:05 crc kubenswrapper[4755]: I0317 02:30:05.739383 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561910-ngp4q" 
event={"ID":"98384819-daff-4e77-9f4b-ad906cda49e9","Type":"ContainerDied","Data":"74804751037ee643935787dddf6bdff22015ec09987b66ec1819d859a1eb860b"} Mar 17 02:30:06 crc kubenswrapper[4755]: I0317 02:30:06.270952 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6a87505-6ac8-497a-bae1-f8f2664446e7" path="/var/lib/kubelet/pods/d6a87505-6ac8-497a-bae1-f8f2664446e7/volumes" Mar 17 02:30:07 crc kubenswrapper[4755]: I0317 02:30:07.217568 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561910-ngp4q" Mar 17 02:30:07 crc kubenswrapper[4755]: I0317 02:30:07.331359 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qp55\" (UniqueName: \"kubernetes.io/projected/98384819-daff-4e77-9f4b-ad906cda49e9-kube-api-access-4qp55\") pod \"98384819-daff-4e77-9f4b-ad906cda49e9\" (UID: \"98384819-daff-4e77-9f4b-ad906cda49e9\") " Mar 17 02:30:07 crc kubenswrapper[4755]: I0317 02:30:07.336631 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98384819-daff-4e77-9f4b-ad906cda49e9-kube-api-access-4qp55" (OuterVolumeSpecName: "kube-api-access-4qp55") pod "98384819-daff-4e77-9f4b-ad906cda49e9" (UID: "98384819-daff-4e77-9f4b-ad906cda49e9"). InnerVolumeSpecName "kube-api-access-4qp55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:30:07 crc kubenswrapper[4755]: I0317 02:30:07.434129 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qp55\" (UniqueName: \"kubernetes.io/projected/98384819-daff-4e77-9f4b-ad906cda49e9-kube-api-access-4qp55\") on node \"crc\" DevicePath \"\"" Mar 17 02:30:07 crc kubenswrapper[4755]: I0317 02:30:07.762895 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561910-ngp4q" event={"ID":"98384819-daff-4e77-9f4b-ad906cda49e9","Type":"ContainerDied","Data":"07c596a2c426035855b46d842641368eb703e740c9bb1186e9a64723fa78182d"} Mar 17 02:30:07 crc kubenswrapper[4755]: I0317 02:30:07.762958 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07c596a2c426035855b46d842641368eb703e740c9bb1186e9a64723fa78182d" Mar 17 02:30:07 crc kubenswrapper[4755]: I0317 02:30:07.762970 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561910-ngp4q" Mar 17 02:30:07 crc kubenswrapper[4755]: I0317 02:30:07.828482 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561904-859gk"] Mar 17 02:30:07 crc kubenswrapper[4755]: I0317 02:30:07.850288 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561904-859gk"] Mar 17 02:30:08 crc kubenswrapper[4755]: I0317 02:30:08.266346 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a322944-18dc-41dc-9b4b-115d30dd0fc7" path="/var/lib/kubelet/pods/3a322944-18dc-41dc-9b4b-115d30dd0fc7/volumes" Mar 17 02:30:13 crc kubenswrapper[4755]: I0317 02:30:13.310275 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jkkqn" Mar 17 02:30:13 crc kubenswrapper[4755]: I0317 02:30:13.411309 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-jkkqn" Mar 17 02:30:13 crc kubenswrapper[4755]: I0317 02:30:13.573474 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkkqn"] Mar 17 02:30:14 crc kubenswrapper[4755]: I0317 02:30:14.853924 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jkkqn" podUID="1cd9be9b-474a-4646-9996-b4f5ad8b4f69" containerName="registry-server" containerID="cri-o://29e9c707505ae38cff1d2cc63ded055b3217a78366e1d70fb2986c24ee1df8e0" gracePeriod=2 Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.248635 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:30:15 crc kubenswrapper[4755]: E0317 02:30:15.249586 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.529967 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jkkqn" Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.632540 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-utilities\") pod \"1cd9be9b-474a-4646-9996-b4f5ad8b4f69\" (UID: \"1cd9be9b-474a-4646-9996-b4f5ad8b4f69\") " Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.632900 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-catalog-content\") pod \"1cd9be9b-474a-4646-9996-b4f5ad8b4f69\" (UID: \"1cd9be9b-474a-4646-9996-b4f5ad8b4f69\") " Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.633103 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpfpz\" (UniqueName: \"kubernetes.io/projected/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-kube-api-access-mpfpz\") pod \"1cd9be9b-474a-4646-9996-b4f5ad8b4f69\" (UID: \"1cd9be9b-474a-4646-9996-b4f5ad8b4f69\") " Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.633610 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-utilities" (OuterVolumeSpecName: "utilities") pod "1cd9be9b-474a-4646-9996-b4f5ad8b4f69" (UID: "1cd9be9b-474a-4646-9996-b4f5ad8b4f69"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.635027 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.641296 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-kube-api-access-mpfpz" (OuterVolumeSpecName: "kube-api-access-mpfpz") pod "1cd9be9b-474a-4646-9996-b4f5ad8b4f69" (UID: "1cd9be9b-474a-4646-9996-b4f5ad8b4f69"). InnerVolumeSpecName "kube-api-access-mpfpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.670652 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1cd9be9b-474a-4646-9996-b4f5ad8b4f69" (UID: "1cd9be9b-474a-4646-9996-b4f5ad8b4f69"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.737411 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.737493 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpfpz\" (UniqueName: \"kubernetes.io/projected/1cd9be9b-474a-4646-9996-b4f5ad8b4f69-kube-api-access-mpfpz\") on node \"crc\" DevicePath \"\"" Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.873266 4755 generic.go:334] "Generic (PLEG): container finished" podID="1cd9be9b-474a-4646-9996-b4f5ad8b4f69" containerID="29e9c707505ae38cff1d2cc63ded055b3217a78366e1d70fb2986c24ee1df8e0" exitCode=0 Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.873381 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jkkqn" Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.874393 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkkqn" event={"ID":"1cd9be9b-474a-4646-9996-b4f5ad8b4f69","Type":"ContainerDied","Data":"29e9c707505ae38cff1d2cc63ded055b3217a78366e1d70fb2986c24ee1df8e0"} Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.874546 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkkqn" event={"ID":"1cd9be9b-474a-4646-9996-b4f5ad8b4f69","Type":"ContainerDied","Data":"7520be0a3a3e9f8bb0d0db54614b44b4d82a39183cef1712487ac57b6c98abaa"} Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.874582 4755 scope.go:117] "RemoveContainer" containerID="29e9c707505ae38cff1d2cc63ded055b3217a78366e1d70fb2986c24ee1df8e0" Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.928166 4755 scope.go:117] "RemoveContainer" 
containerID="ff77450654c3a0da1561967e95519d287bd4df564e314b0ef960521265600d71" Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.943636 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkkqn"] Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.957175 4755 scope.go:117] "RemoveContainer" containerID="9947217cc189040592a94ea0e951f2a0475b37b5eed8ca73b07654ef5ed97a36" Mar 17 02:30:15 crc kubenswrapper[4755]: I0317 02:30:15.959977 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkkqn"] Mar 17 02:30:16 crc kubenswrapper[4755]: I0317 02:30:16.036995 4755 scope.go:117] "RemoveContainer" containerID="29e9c707505ae38cff1d2cc63ded055b3217a78366e1d70fb2986c24ee1df8e0" Mar 17 02:30:16 crc kubenswrapper[4755]: E0317 02:30:16.039889 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29e9c707505ae38cff1d2cc63ded055b3217a78366e1d70fb2986c24ee1df8e0\": container with ID starting with 29e9c707505ae38cff1d2cc63ded055b3217a78366e1d70fb2986c24ee1df8e0 not found: ID does not exist" containerID="29e9c707505ae38cff1d2cc63ded055b3217a78366e1d70fb2986c24ee1df8e0" Mar 17 02:30:16 crc kubenswrapper[4755]: I0317 02:30:16.039932 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e9c707505ae38cff1d2cc63ded055b3217a78366e1d70fb2986c24ee1df8e0"} err="failed to get container status \"29e9c707505ae38cff1d2cc63ded055b3217a78366e1d70fb2986c24ee1df8e0\": rpc error: code = NotFound desc = could not find container \"29e9c707505ae38cff1d2cc63ded055b3217a78366e1d70fb2986c24ee1df8e0\": container with ID starting with 29e9c707505ae38cff1d2cc63ded055b3217a78366e1d70fb2986c24ee1df8e0 not found: ID does not exist" Mar 17 02:30:16 crc kubenswrapper[4755]: I0317 02:30:16.039957 4755 scope.go:117] "RemoveContainer" 
containerID="ff77450654c3a0da1561967e95519d287bd4df564e314b0ef960521265600d71" Mar 17 02:30:16 crc kubenswrapper[4755]: E0317 02:30:16.040561 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff77450654c3a0da1561967e95519d287bd4df564e314b0ef960521265600d71\": container with ID starting with ff77450654c3a0da1561967e95519d287bd4df564e314b0ef960521265600d71 not found: ID does not exist" containerID="ff77450654c3a0da1561967e95519d287bd4df564e314b0ef960521265600d71" Mar 17 02:30:16 crc kubenswrapper[4755]: I0317 02:30:16.040640 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff77450654c3a0da1561967e95519d287bd4df564e314b0ef960521265600d71"} err="failed to get container status \"ff77450654c3a0da1561967e95519d287bd4df564e314b0ef960521265600d71\": rpc error: code = NotFound desc = could not find container \"ff77450654c3a0da1561967e95519d287bd4df564e314b0ef960521265600d71\": container with ID starting with ff77450654c3a0da1561967e95519d287bd4df564e314b0ef960521265600d71 not found: ID does not exist" Mar 17 02:30:16 crc kubenswrapper[4755]: I0317 02:30:16.040694 4755 scope.go:117] "RemoveContainer" containerID="9947217cc189040592a94ea0e951f2a0475b37b5eed8ca73b07654ef5ed97a36" Mar 17 02:30:16 crc kubenswrapper[4755]: E0317 02:30:16.041354 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9947217cc189040592a94ea0e951f2a0475b37b5eed8ca73b07654ef5ed97a36\": container with ID starting with 9947217cc189040592a94ea0e951f2a0475b37b5eed8ca73b07654ef5ed97a36 not found: ID does not exist" containerID="9947217cc189040592a94ea0e951f2a0475b37b5eed8ca73b07654ef5ed97a36" Mar 17 02:30:16 crc kubenswrapper[4755]: I0317 02:30:16.041383 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9947217cc189040592a94ea0e951f2a0475b37b5eed8ca73b07654ef5ed97a36"} err="failed to get container status \"9947217cc189040592a94ea0e951f2a0475b37b5eed8ca73b07654ef5ed97a36\": rpc error: code = NotFound desc = could not find container \"9947217cc189040592a94ea0e951f2a0475b37b5eed8ca73b07654ef5ed97a36\": container with ID starting with 9947217cc189040592a94ea0e951f2a0475b37b5eed8ca73b07654ef5ed97a36 not found: ID does not exist" Mar 17 02:30:16 crc kubenswrapper[4755]: I0317 02:30:16.266670 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cd9be9b-474a-4646-9996-b4f5ad8b4f69" path="/var/lib/kubelet/pods/1cd9be9b-474a-4646-9996-b4f5ad8b4f69/volumes" Mar 17 02:30:21 crc kubenswrapper[4755]: I0317 02:30:21.497680 4755 scope.go:117] "RemoveContainer" containerID="4a6d4d43229f1d3661f835d3d5494da2d6f45954775efa3466491f7663e478e2" Mar 17 02:30:21 crc kubenswrapper[4755]: I0317 02:30:21.584256 4755 scope.go:117] "RemoveContainer" containerID="675be24cd45c9b39d84cf1f2208e6d4aa7f81eead3f2550b9d32a08f6247f736" Mar 17 02:30:21 crc kubenswrapper[4755]: I0317 02:30:21.611253 4755 scope.go:117] "RemoveContainer" containerID="f86872095970fd4975c81cd671468253c1c7665556cfe904f349ed7ee92c6325" Mar 17 02:30:26 crc kubenswrapper[4755]: I0317 02:30:26.270757 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:30:26 crc kubenswrapper[4755]: E0317 02:30:26.272231 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:30:37 crc kubenswrapper[4755]: I0317 02:30:37.251892 4755 
scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:30:38 crc kubenswrapper[4755]: I0317 02:30:38.157613 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"d7d9c4401707c4eaff6887df0c70850a0f00f0fc35a9833a320255ba98464548"} Mar 17 02:31:21 crc kubenswrapper[4755]: I0317 02:31:21.809717 4755 scope.go:117] "RemoveContainer" containerID="ba6407a05d31d2cbd1a93d73e4cedfae291b24bbb44cfe315d33fba42d8ade6b" Mar 17 02:31:26 crc kubenswrapper[4755]: I0317 02:31:26.986185 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vxttz"] Mar 17 02:31:26 crc kubenswrapper[4755]: E0317 02:31:26.987842 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd9be9b-474a-4646-9996-b4f5ad8b4f69" containerName="registry-server" Mar 17 02:31:26 crc kubenswrapper[4755]: I0317 02:31:26.987863 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd9be9b-474a-4646-9996-b4f5ad8b4f69" containerName="registry-server" Mar 17 02:31:26 crc kubenswrapper[4755]: E0317 02:31:26.987910 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac7ea7d-955f-413f-8733-b6d88c1dfbc4" containerName="collect-profiles" Mar 17 02:31:26 crc kubenswrapper[4755]: I0317 02:31:26.987921 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac7ea7d-955f-413f-8733-b6d88c1dfbc4" containerName="collect-profiles" Mar 17 02:31:26 crc kubenswrapper[4755]: E0317 02:31:26.987933 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd9be9b-474a-4646-9996-b4f5ad8b4f69" containerName="extract-content" Mar 17 02:31:26 crc kubenswrapper[4755]: I0317 02:31:26.987944 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd9be9b-474a-4646-9996-b4f5ad8b4f69" containerName="extract-content" Mar 17 
02:31:26 crc kubenswrapper[4755]: E0317 02:31:26.987971 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98384819-daff-4e77-9f4b-ad906cda49e9" containerName="oc" Mar 17 02:31:26 crc kubenswrapper[4755]: I0317 02:31:26.987981 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="98384819-daff-4e77-9f4b-ad906cda49e9" containerName="oc" Mar 17 02:31:26 crc kubenswrapper[4755]: E0317 02:31:26.988008 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd9be9b-474a-4646-9996-b4f5ad8b4f69" containerName="extract-utilities" Mar 17 02:31:26 crc kubenswrapper[4755]: I0317 02:31:26.988019 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd9be9b-474a-4646-9996-b4f5ad8b4f69" containerName="extract-utilities" Mar 17 02:31:26 crc kubenswrapper[4755]: I0317 02:31:26.988327 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ac7ea7d-955f-413f-8733-b6d88c1dfbc4" containerName="collect-profiles" Mar 17 02:31:26 crc kubenswrapper[4755]: I0317 02:31:26.988372 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="98384819-daff-4e77-9f4b-ad906cda49e9" containerName="oc" Mar 17 02:31:26 crc kubenswrapper[4755]: I0317 02:31:26.988386 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cd9be9b-474a-4646-9996-b4f5ad8b4f69" containerName="registry-server" Mar 17 02:31:26 crc kubenswrapper[4755]: I0317 02:31:26.991219 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vxttz" Mar 17 02:31:27 crc kubenswrapper[4755]: I0317 02:31:27.005293 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vxttz"] Mar 17 02:31:27 crc kubenswrapper[4755]: I0317 02:31:27.124269 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-catalog-content\") pod \"community-operators-vxttz\" (UID: \"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc\") " pod="openshift-marketplace/community-operators-vxttz" Mar 17 02:31:27 crc kubenswrapper[4755]: I0317 02:31:27.124404 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-utilities\") pod \"community-operators-vxttz\" (UID: \"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc\") " pod="openshift-marketplace/community-operators-vxttz" Mar 17 02:31:27 crc kubenswrapper[4755]: I0317 02:31:27.124426 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg85k\" (UniqueName: \"kubernetes.io/projected/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-kube-api-access-dg85k\") pod \"community-operators-vxttz\" (UID: \"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc\") " pod="openshift-marketplace/community-operators-vxttz" Mar 17 02:31:27 crc kubenswrapper[4755]: I0317 02:31:27.227607 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-utilities\") pod \"community-operators-vxttz\" (UID: \"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc\") " pod="openshift-marketplace/community-operators-vxttz" Mar 17 02:31:27 crc kubenswrapper[4755]: I0317 02:31:27.227661 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dg85k\" (UniqueName: \"kubernetes.io/projected/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-kube-api-access-dg85k\") pod \"community-operators-vxttz\" (UID: \"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc\") " pod="openshift-marketplace/community-operators-vxttz" Mar 17 02:31:27 crc kubenswrapper[4755]: I0317 02:31:27.227838 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-catalog-content\") pod \"community-operators-vxttz\" (UID: \"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc\") " pod="openshift-marketplace/community-operators-vxttz" Mar 17 02:31:27 crc kubenswrapper[4755]: I0317 02:31:27.240702 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-utilities\") pod \"community-operators-vxttz\" (UID: \"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc\") " pod="openshift-marketplace/community-operators-vxttz" Mar 17 02:31:27 crc kubenswrapper[4755]: I0317 02:31:27.240755 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-catalog-content\") pod \"community-operators-vxttz\" (UID: \"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc\") " pod="openshift-marketplace/community-operators-vxttz" Mar 17 02:31:27 crc kubenswrapper[4755]: I0317 02:31:27.261963 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg85k\" (UniqueName: \"kubernetes.io/projected/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-kube-api-access-dg85k\") pod \"community-operators-vxttz\" (UID: \"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc\") " pod="openshift-marketplace/community-operators-vxttz" Mar 17 02:31:27 crc kubenswrapper[4755]: I0317 02:31:27.326918 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vxttz" Mar 17 02:31:28 crc kubenswrapper[4755]: I0317 02:31:28.009057 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vxttz"] Mar 17 02:31:28 crc kubenswrapper[4755]: I0317 02:31:28.896100 4755 generic.go:334] "Generic (PLEG): container finished" podID="dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc" containerID="72c6d0b331b841cc527eca7c2021713ce25f14792113a51825d46d5f333ec811" exitCode=0 Mar 17 02:31:28 crc kubenswrapper[4755]: I0317 02:31:28.896411 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxttz" event={"ID":"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc","Type":"ContainerDied","Data":"72c6d0b331b841cc527eca7c2021713ce25f14792113a51825d46d5f333ec811"} Mar 17 02:31:28 crc kubenswrapper[4755]: I0317 02:31:28.896478 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxttz" event={"ID":"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc","Type":"ContainerStarted","Data":"d9b9133d45bc9cc6bafd7eaebbe65efd916af4c2accf8d126afc393a5e46aee8"} Mar 17 02:31:28 crc kubenswrapper[4755]: I0317 02:31:28.900915 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 02:31:29 crc kubenswrapper[4755]: I0317 02:31:29.913989 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxttz" event={"ID":"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc","Type":"ContainerStarted","Data":"7b1241eb36323f879225cf8932c7564216b22435623931d5d9efb5d87a4611b6"} Mar 17 02:31:31 crc kubenswrapper[4755]: I0317 02:31:31.936827 4755 generic.go:334] "Generic (PLEG): container finished" podID="dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc" containerID="7b1241eb36323f879225cf8932c7564216b22435623931d5d9efb5d87a4611b6" exitCode=0 Mar 17 02:31:31 crc kubenswrapper[4755]: I0317 02:31:31.936926 4755 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-vxttz" event={"ID":"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc","Type":"ContainerDied","Data":"7b1241eb36323f879225cf8932c7564216b22435623931d5d9efb5d87a4611b6"} Mar 17 02:31:32 crc kubenswrapper[4755]: I0317 02:31:32.948900 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxttz" event={"ID":"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc","Type":"ContainerStarted","Data":"ef2c803dc2558fffd4a712283489683c011c288c6c831f263a75df0c282b16fd"} Mar 17 02:31:32 crc kubenswrapper[4755]: I0317 02:31:32.977278 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vxttz" podStartSLOduration=3.520892062 podStartE2EDuration="6.977257423s" podCreationTimestamp="2026-03-17 02:31:26 +0000 UTC" firstStartedPulling="2026-03-17 02:31:28.898983885 +0000 UTC m=+7763.658436208" lastFinishedPulling="2026-03-17 02:31:32.355349286 +0000 UTC m=+7767.114801569" observedRunningTime="2026-03-17 02:31:32.972606948 +0000 UTC m=+7767.732059251" watchObservedRunningTime="2026-03-17 02:31:32.977257423 +0000 UTC m=+7767.736709726" Mar 17 02:31:37 crc kubenswrapper[4755]: I0317 02:31:37.328089 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vxttz" Mar 17 02:31:37 crc kubenswrapper[4755]: I0317 02:31:37.329736 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vxttz" Mar 17 02:31:37 crc kubenswrapper[4755]: I0317 02:31:37.400791 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vxttz" Mar 17 02:31:38 crc kubenswrapper[4755]: I0317 02:31:38.072178 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vxttz" Mar 17 02:31:38 crc kubenswrapper[4755]: I0317 
02:31:38.133906 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vxttz"] Mar 17 02:31:40 crc kubenswrapper[4755]: I0317 02:31:40.036221 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vxttz" podUID="dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc" containerName="registry-server" containerID="cri-o://ef2c803dc2558fffd4a712283489683c011c288c6c831f263a75df0c282b16fd" gracePeriod=2 Mar 17 02:31:40 crc kubenswrapper[4755]: I0317 02:31:40.612284 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vxttz" Mar 17 02:31:40 crc kubenswrapper[4755]: I0317 02:31:40.767768 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg85k\" (UniqueName: \"kubernetes.io/projected/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-kube-api-access-dg85k\") pod \"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc\" (UID: \"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc\") " Mar 17 02:31:40 crc kubenswrapper[4755]: I0317 02:31:40.768076 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-utilities\") pod \"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc\" (UID: \"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc\") " Mar 17 02:31:40 crc kubenswrapper[4755]: I0317 02:31:40.768209 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-catalog-content\") pod \"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc\" (UID: \"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc\") " Mar 17 02:31:40 crc kubenswrapper[4755]: I0317 02:31:40.768858 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-utilities" (OuterVolumeSpecName: 
"utilities") pod "dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc" (UID: "dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:31:40 crc kubenswrapper[4755]: I0317 02:31:40.776653 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-kube-api-access-dg85k" (OuterVolumeSpecName: "kube-api-access-dg85k") pod "dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc" (UID: "dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc"). InnerVolumeSpecName "kube-api-access-dg85k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:31:40 crc kubenswrapper[4755]: I0317 02:31:40.817925 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc" (UID: "dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:31:40 crc kubenswrapper[4755]: I0317 02:31:40.871223 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg85k\" (UniqueName: \"kubernetes.io/projected/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-kube-api-access-dg85k\") on node \"crc\" DevicePath \"\"" Mar 17 02:31:40 crc kubenswrapper[4755]: I0317 02:31:40.871264 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:31:40 crc kubenswrapper[4755]: I0317 02:31:40.871296 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:31:41 crc kubenswrapper[4755]: I0317 02:31:41.052286 4755 generic.go:334] "Generic (PLEG): container finished" podID="dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc" containerID="ef2c803dc2558fffd4a712283489683c011c288c6c831f263a75df0c282b16fd" exitCode=0 Mar 17 02:31:41 crc kubenswrapper[4755]: I0317 02:31:41.052339 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxttz" event={"ID":"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc","Type":"ContainerDied","Data":"ef2c803dc2558fffd4a712283489683c011c288c6c831f263a75df0c282b16fd"} Mar 17 02:31:41 crc kubenswrapper[4755]: I0317 02:31:41.052636 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxttz" event={"ID":"dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc","Type":"ContainerDied","Data":"d9b9133d45bc9cc6bafd7eaebbe65efd916af4c2accf8d126afc393a5e46aee8"} Mar 17 02:31:41 crc kubenswrapper[4755]: I0317 02:31:41.052658 4755 scope.go:117] "RemoveContainer" containerID="ef2c803dc2558fffd4a712283489683c011c288c6c831f263a75df0c282b16fd" Mar 17 02:31:41 crc kubenswrapper[4755]: I0317 
02:31:41.052382 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vxttz" Mar 17 02:31:41 crc kubenswrapper[4755]: I0317 02:31:41.105339 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vxttz"] Mar 17 02:31:41 crc kubenswrapper[4755]: I0317 02:31:41.111721 4755 scope.go:117] "RemoveContainer" containerID="7b1241eb36323f879225cf8932c7564216b22435623931d5d9efb5d87a4611b6" Mar 17 02:31:41 crc kubenswrapper[4755]: I0317 02:31:41.119465 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vxttz"] Mar 17 02:31:41 crc kubenswrapper[4755]: I0317 02:31:41.152886 4755 scope.go:117] "RemoveContainer" containerID="72c6d0b331b841cc527eca7c2021713ce25f14792113a51825d46d5f333ec811" Mar 17 02:31:41 crc kubenswrapper[4755]: I0317 02:31:41.212614 4755 scope.go:117] "RemoveContainer" containerID="ef2c803dc2558fffd4a712283489683c011c288c6c831f263a75df0c282b16fd" Mar 17 02:31:41 crc kubenswrapper[4755]: E0317 02:31:41.213183 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef2c803dc2558fffd4a712283489683c011c288c6c831f263a75df0c282b16fd\": container with ID starting with ef2c803dc2558fffd4a712283489683c011c288c6c831f263a75df0c282b16fd not found: ID does not exist" containerID="ef2c803dc2558fffd4a712283489683c011c288c6c831f263a75df0c282b16fd" Mar 17 02:31:41 crc kubenswrapper[4755]: I0317 02:31:41.213213 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef2c803dc2558fffd4a712283489683c011c288c6c831f263a75df0c282b16fd"} err="failed to get container status \"ef2c803dc2558fffd4a712283489683c011c288c6c831f263a75df0c282b16fd\": rpc error: code = NotFound desc = could not find container \"ef2c803dc2558fffd4a712283489683c011c288c6c831f263a75df0c282b16fd\": container with ID starting with 
ef2c803dc2558fffd4a712283489683c011c288c6c831f263a75df0c282b16fd not found: ID does not exist" Mar 17 02:31:41 crc kubenswrapper[4755]: I0317 02:31:41.213237 4755 scope.go:117] "RemoveContainer" containerID="7b1241eb36323f879225cf8932c7564216b22435623931d5d9efb5d87a4611b6" Mar 17 02:31:41 crc kubenswrapper[4755]: E0317 02:31:41.213918 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b1241eb36323f879225cf8932c7564216b22435623931d5d9efb5d87a4611b6\": container with ID starting with 7b1241eb36323f879225cf8932c7564216b22435623931d5d9efb5d87a4611b6 not found: ID does not exist" containerID="7b1241eb36323f879225cf8932c7564216b22435623931d5d9efb5d87a4611b6" Mar 17 02:31:41 crc kubenswrapper[4755]: I0317 02:31:41.213941 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b1241eb36323f879225cf8932c7564216b22435623931d5d9efb5d87a4611b6"} err="failed to get container status \"7b1241eb36323f879225cf8932c7564216b22435623931d5d9efb5d87a4611b6\": rpc error: code = NotFound desc = could not find container \"7b1241eb36323f879225cf8932c7564216b22435623931d5d9efb5d87a4611b6\": container with ID starting with 7b1241eb36323f879225cf8932c7564216b22435623931d5d9efb5d87a4611b6 not found: ID does not exist" Mar 17 02:31:41 crc kubenswrapper[4755]: I0317 02:31:41.213957 4755 scope.go:117] "RemoveContainer" containerID="72c6d0b331b841cc527eca7c2021713ce25f14792113a51825d46d5f333ec811" Mar 17 02:31:41 crc kubenswrapper[4755]: E0317 02:31:41.214406 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72c6d0b331b841cc527eca7c2021713ce25f14792113a51825d46d5f333ec811\": container with ID starting with 72c6d0b331b841cc527eca7c2021713ce25f14792113a51825d46d5f333ec811 not found: ID does not exist" containerID="72c6d0b331b841cc527eca7c2021713ce25f14792113a51825d46d5f333ec811" Mar 17 02:31:41 crc 
kubenswrapper[4755]: I0317 02:31:41.214425 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72c6d0b331b841cc527eca7c2021713ce25f14792113a51825d46d5f333ec811"} err="failed to get container status \"72c6d0b331b841cc527eca7c2021713ce25f14792113a51825d46d5f333ec811\": rpc error: code = NotFound desc = could not find container \"72c6d0b331b841cc527eca7c2021713ce25f14792113a51825d46d5f333ec811\": container with ID starting with 72c6d0b331b841cc527eca7c2021713ce25f14792113a51825d46d5f333ec811 not found: ID does not exist" Mar 17 02:31:42 crc kubenswrapper[4755]: I0317 02:31:42.261828 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc" path="/var/lib/kubelet/pods/dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc/volumes" Mar 17 02:31:45 crc kubenswrapper[4755]: I0317 02:31:45.121406 4755 generic.go:334] "Generic (PLEG): container finished" podID="e13d9991-5993-4f00-9918-5ec7ad366a9f" containerID="eac08d5f39ffe41a82694570c2964058d5e5e6a6d3864bc60b6fd305ab1bc1ec" exitCode=0 Mar 17 02:31:45 crc kubenswrapper[4755]: I0317 02:31:45.121546 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n8w6v/must-gather-m5qx9" event={"ID":"e13d9991-5993-4f00-9918-5ec7ad366a9f","Type":"ContainerDied","Data":"eac08d5f39ffe41a82694570c2964058d5e5e6a6d3864bc60b6fd305ab1bc1ec"} Mar 17 02:31:45 crc kubenswrapper[4755]: I0317 02:31:45.122255 4755 scope.go:117] "RemoveContainer" containerID="eac08d5f39ffe41a82694570c2964058d5e5e6a6d3864bc60b6fd305ab1bc1ec" Mar 17 02:31:45 crc kubenswrapper[4755]: I0317 02:31:45.429832 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n8w6v_must-gather-m5qx9_e13d9991-5993-4f00-9918-5ec7ad366a9f/gather/0.log" Mar 17 02:31:48 crc kubenswrapper[4755]: I0317 02:31:48.552464 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hpbt5"] Mar 17 02:31:48 crc 
kubenswrapper[4755]: E0317 02:31:48.553501 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc" containerName="extract-content" Mar 17 02:31:48 crc kubenswrapper[4755]: I0317 02:31:48.553516 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc" containerName="extract-content" Mar 17 02:31:48 crc kubenswrapper[4755]: E0317 02:31:48.553557 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc" containerName="extract-utilities" Mar 17 02:31:48 crc kubenswrapper[4755]: I0317 02:31:48.553565 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc" containerName="extract-utilities" Mar 17 02:31:48 crc kubenswrapper[4755]: E0317 02:31:48.553584 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc" containerName="registry-server" Mar 17 02:31:48 crc kubenswrapper[4755]: I0317 02:31:48.553592 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc" containerName="registry-server" Mar 17 02:31:48 crc kubenswrapper[4755]: I0317 02:31:48.553968 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab0fe7c-64dc-40d2-935e-17f5c3a5c7bc" containerName="registry-server" Mar 17 02:31:48 crc kubenswrapper[4755]: I0317 02:31:48.556112 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hpbt5" Mar 17 02:31:48 crc kubenswrapper[4755]: I0317 02:31:48.580348 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hpbt5"] Mar 17 02:31:48 crc kubenswrapper[4755]: I0317 02:31:48.687239 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frs79\" (UniqueName: \"kubernetes.io/projected/49a7d298-cb80-4170-a34b-783dab04a19d-kube-api-access-frs79\") pod \"redhat-operators-hpbt5\" (UID: \"49a7d298-cb80-4170-a34b-783dab04a19d\") " pod="openshift-marketplace/redhat-operators-hpbt5" Mar 17 02:31:48 crc kubenswrapper[4755]: I0317 02:31:48.687338 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a7d298-cb80-4170-a34b-783dab04a19d-utilities\") pod \"redhat-operators-hpbt5\" (UID: \"49a7d298-cb80-4170-a34b-783dab04a19d\") " pod="openshift-marketplace/redhat-operators-hpbt5" Mar 17 02:31:48 crc kubenswrapper[4755]: I0317 02:31:48.688031 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a7d298-cb80-4170-a34b-783dab04a19d-catalog-content\") pod \"redhat-operators-hpbt5\" (UID: \"49a7d298-cb80-4170-a34b-783dab04a19d\") " pod="openshift-marketplace/redhat-operators-hpbt5" Mar 17 02:31:48 crc kubenswrapper[4755]: I0317 02:31:48.790419 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frs79\" (UniqueName: \"kubernetes.io/projected/49a7d298-cb80-4170-a34b-783dab04a19d-kube-api-access-frs79\") pod \"redhat-operators-hpbt5\" (UID: \"49a7d298-cb80-4170-a34b-783dab04a19d\") " pod="openshift-marketplace/redhat-operators-hpbt5" Mar 17 02:31:48 crc kubenswrapper[4755]: I0317 02:31:48.790559 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a7d298-cb80-4170-a34b-783dab04a19d-utilities\") pod \"redhat-operators-hpbt5\" (UID: \"49a7d298-cb80-4170-a34b-783dab04a19d\") " pod="openshift-marketplace/redhat-operators-hpbt5" Mar 17 02:31:48 crc kubenswrapper[4755]: I0317 02:31:48.790696 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a7d298-cb80-4170-a34b-783dab04a19d-catalog-content\") pod \"redhat-operators-hpbt5\" (UID: \"49a7d298-cb80-4170-a34b-783dab04a19d\") " pod="openshift-marketplace/redhat-operators-hpbt5" Mar 17 02:31:48 crc kubenswrapper[4755]: I0317 02:31:48.791353 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a7d298-cb80-4170-a34b-783dab04a19d-utilities\") pod \"redhat-operators-hpbt5\" (UID: \"49a7d298-cb80-4170-a34b-783dab04a19d\") " pod="openshift-marketplace/redhat-operators-hpbt5" Mar 17 02:31:48 crc kubenswrapper[4755]: I0317 02:31:48.791380 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a7d298-cb80-4170-a34b-783dab04a19d-catalog-content\") pod \"redhat-operators-hpbt5\" (UID: \"49a7d298-cb80-4170-a34b-783dab04a19d\") " pod="openshift-marketplace/redhat-operators-hpbt5" Mar 17 02:31:48 crc kubenswrapper[4755]: I0317 02:31:48.827892 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frs79\" (UniqueName: \"kubernetes.io/projected/49a7d298-cb80-4170-a34b-783dab04a19d-kube-api-access-frs79\") pod \"redhat-operators-hpbt5\" (UID: \"49a7d298-cb80-4170-a34b-783dab04a19d\") " pod="openshift-marketplace/redhat-operators-hpbt5" Mar 17 02:31:48 crc kubenswrapper[4755]: I0317 02:31:48.876099 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hpbt5" Mar 17 02:31:49 crc kubenswrapper[4755]: I0317 02:31:49.437517 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hpbt5"] Mar 17 02:31:50 crc kubenswrapper[4755]: I0317 02:31:50.187350 4755 generic.go:334] "Generic (PLEG): container finished" podID="49a7d298-cb80-4170-a34b-783dab04a19d" containerID="e218f58ac7908542dc89a4cc9d442116960076827c44347063103fb8109a92d8" exitCode=0 Mar 17 02:31:50 crc kubenswrapper[4755]: I0317 02:31:50.187465 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpbt5" event={"ID":"49a7d298-cb80-4170-a34b-783dab04a19d","Type":"ContainerDied","Data":"e218f58ac7908542dc89a4cc9d442116960076827c44347063103fb8109a92d8"} Mar 17 02:31:50 crc kubenswrapper[4755]: I0317 02:31:50.187686 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpbt5" event={"ID":"49a7d298-cb80-4170-a34b-783dab04a19d","Type":"ContainerStarted","Data":"fbf217c993ec18f4d811ed64221a3d8d22aa61050a6310633694f8aa2d753c8c"} Mar 17 02:31:51 crc kubenswrapper[4755]: I0317 02:31:51.204683 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpbt5" event={"ID":"49a7d298-cb80-4170-a34b-783dab04a19d","Type":"ContainerStarted","Data":"0203220f7e9862051291a8d9f78fddaba30d681129672d8c909779115ac896ca"} Mar 17 02:31:55 crc kubenswrapper[4755]: I0317 02:31:55.773602 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n8w6v/must-gather-m5qx9"] Mar 17 02:31:55 crc kubenswrapper[4755]: I0317 02:31:55.775237 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-n8w6v/must-gather-m5qx9" podUID="e13d9991-5993-4f00-9918-5ec7ad366a9f" containerName="copy" containerID="cri-o://ab28e43539fc149171fd63bde026497e6cdc722455cedcce9a5fd5ad1e821fe5" gracePeriod=2 Mar 17 
02:31:55 crc kubenswrapper[4755]: I0317 02:31:55.791213 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n8w6v/must-gather-m5qx9"] Mar 17 02:31:56 crc kubenswrapper[4755]: I0317 02:31:56.271925 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n8w6v_must-gather-m5qx9_e13d9991-5993-4f00-9918-5ec7ad366a9f/copy/0.log" Mar 17 02:31:56 crc kubenswrapper[4755]: I0317 02:31:56.273330 4755 generic.go:334] "Generic (PLEG): container finished" podID="e13d9991-5993-4f00-9918-5ec7ad366a9f" containerID="ab28e43539fc149171fd63bde026497e6cdc722455cedcce9a5fd5ad1e821fe5" exitCode=143 Mar 17 02:31:56 crc kubenswrapper[4755]: I0317 02:31:56.277258 4755 generic.go:334] "Generic (PLEG): container finished" podID="49a7d298-cb80-4170-a34b-783dab04a19d" containerID="0203220f7e9862051291a8d9f78fddaba30d681129672d8c909779115ac896ca" exitCode=0 Mar 17 02:31:56 crc kubenswrapper[4755]: I0317 02:31:56.277409 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpbt5" event={"ID":"49a7d298-cb80-4170-a34b-783dab04a19d","Type":"ContainerDied","Data":"0203220f7e9862051291a8d9f78fddaba30d681129672d8c909779115ac896ca"} Mar 17 02:31:56 crc kubenswrapper[4755]: I0317 02:31:56.396568 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n8w6v_must-gather-m5qx9_e13d9991-5993-4f00-9918-5ec7ad366a9f/copy/0.log" Mar 17 02:31:56 crc kubenswrapper[4755]: I0317 02:31:56.396965 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n8w6v/must-gather-m5qx9" Mar 17 02:31:56 crc kubenswrapper[4755]: I0317 02:31:56.482701 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e13d9991-5993-4f00-9918-5ec7ad366a9f-must-gather-output\") pod \"e13d9991-5993-4f00-9918-5ec7ad366a9f\" (UID: \"e13d9991-5993-4f00-9918-5ec7ad366a9f\") " Mar 17 02:31:56 crc kubenswrapper[4755]: I0317 02:31:56.482812 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q7tr\" (UniqueName: \"kubernetes.io/projected/e13d9991-5993-4f00-9918-5ec7ad366a9f-kube-api-access-2q7tr\") pod \"e13d9991-5993-4f00-9918-5ec7ad366a9f\" (UID: \"e13d9991-5993-4f00-9918-5ec7ad366a9f\") " Mar 17 02:31:56 crc kubenswrapper[4755]: I0317 02:31:56.504170 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e13d9991-5993-4f00-9918-5ec7ad366a9f-kube-api-access-2q7tr" (OuterVolumeSpecName: "kube-api-access-2q7tr") pod "e13d9991-5993-4f00-9918-5ec7ad366a9f" (UID: "e13d9991-5993-4f00-9918-5ec7ad366a9f"). InnerVolumeSpecName "kube-api-access-2q7tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:31:56 crc kubenswrapper[4755]: I0317 02:31:56.586796 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q7tr\" (UniqueName: \"kubernetes.io/projected/e13d9991-5993-4f00-9918-5ec7ad366a9f-kube-api-access-2q7tr\") on node \"crc\" DevicePath \"\"" Mar 17 02:31:56 crc kubenswrapper[4755]: I0317 02:31:56.671140 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e13d9991-5993-4f00-9918-5ec7ad366a9f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e13d9991-5993-4f00-9918-5ec7ad366a9f" (UID: "e13d9991-5993-4f00-9918-5ec7ad366a9f"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:31:56 crc kubenswrapper[4755]: I0317 02:31:56.688608 4755 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e13d9991-5993-4f00-9918-5ec7ad366a9f-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 17 02:31:57 crc kubenswrapper[4755]: I0317 02:31:57.287073 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n8w6v_must-gather-m5qx9_e13d9991-5993-4f00-9918-5ec7ad366a9f/copy/0.log" Mar 17 02:31:57 crc kubenswrapper[4755]: I0317 02:31:57.287768 4755 scope.go:117] "RemoveContainer" containerID="ab28e43539fc149171fd63bde026497e6cdc722455cedcce9a5fd5ad1e821fe5" Mar 17 02:31:57 crc kubenswrapper[4755]: I0317 02:31:57.287883 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n8w6v/must-gather-m5qx9" Mar 17 02:31:57 crc kubenswrapper[4755]: I0317 02:31:57.290809 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpbt5" event={"ID":"49a7d298-cb80-4170-a34b-783dab04a19d","Type":"ContainerStarted","Data":"ed239f787db6163a339434a1cac9c3a84ad553285020dbc7c90fc0294b415a31"} Mar 17 02:31:57 crc kubenswrapper[4755]: I0317 02:31:57.327016 4755 scope.go:117] "RemoveContainer" containerID="eac08d5f39ffe41a82694570c2964058d5e5e6a6d3864bc60b6fd305ab1bc1ec" Mar 17 02:31:57 crc kubenswrapper[4755]: I0317 02:31:57.363174 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hpbt5" podStartSLOduration=2.808128974 podStartE2EDuration="9.363153598s" podCreationTimestamp="2026-03-17 02:31:48 +0000 UTC" firstStartedPulling="2026-03-17 02:31:50.190558514 +0000 UTC m=+7784.950010817" lastFinishedPulling="2026-03-17 02:31:56.745583148 +0000 UTC m=+7791.505035441" observedRunningTime="2026-03-17 02:31:57.361101502 +0000 UTC m=+7792.120553805" 
watchObservedRunningTime="2026-03-17 02:31:57.363153598 +0000 UTC m=+7792.122605881" Mar 17 02:31:58 crc kubenswrapper[4755]: I0317 02:31:58.260329 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e13d9991-5993-4f00-9918-5ec7ad366a9f" path="/var/lib/kubelet/pods/e13d9991-5993-4f00-9918-5ec7ad366a9f/volumes" Mar 17 02:31:58 crc kubenswrapper[4755]: I0317 02:31:58.877675 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hpbt5" Mar 17 02:31:58 crc kubenswrapper[4755]: I0317 02:31:58.877746 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hpbt5" Mar 17 02:31:59 crc kubenswrapper[4755]: I0317 02:31:59.938221 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hpbt5" podUID="49a7d298-cb80-4170-a34b-783dab04a19d" containerName="registry-server" probeResult="failure" output=< Mar 17 02:31:59 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:31:59 crc kubenswrapper[4755]: > Mar 17 02:32:00 crc kubenswrapper[4755]: I0317 02:32:00.178345 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561912-48pm5"] Mar 17 02:32:00 crc kubenswrapper[4755]: E0317 02:32:00.178789 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e13d9991-5993-4f00-9918-5ec7ad366a9f" containerName="copy" Mar 17 02:32:00 crc kubenswrapper[4755]: I0317 02:32:00.178805 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e13d9991-5993-4f00-9918-5ec7ad366a9f" containerName="copy" Mar 17 02:32:00 crc kubenswrapper[4755]: E0317 02:32:00.178842 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e13d9991-5993-4f00-9918-5ec7ad366a9f" containerName="gather" Mar 17 02:32:00 crc kubenswrapper[4755]: I0317 02:32:00.178848 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e13d9991-5993-4f00-9918-5ec7ad366a9f" containerName="gather" Mar 17 02:32:00 crc kubenswrapper[4755]: I0317 02:32:00.179040 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e13d9991-5993-4f00-9918-5ec7ad366a9f" containerName="gather" Mar 17 02:32:00 crc kubenswrapper[4755]: I0317 02:32:00.179065 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e13d9991-5993-4f00-9918-5ec7ad366a9f" containerName="copy" Mar 17 02:32:00 crc kubenswrapper[4755]: I0317 02:32:00.179805 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561912-48pm5" Mar 17 02:32:00 crc kubenswrapper[4755]: I0317 02:32:00.183290 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:32:00 crc kubenswrapper[4755]: I0317 02:32:00.183483 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:32:00 crc kubenswrapper[4755]: I0317 02:32:00.186628 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:32:00 crc kubenswrapper[4755]: I0317 02:32:00.195456 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561912-48pm5"] Mar 17 02:32:00 crc kubenswrapper[4755]: I0317 02:32:00.302790 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g4rv\" (UniqueName: \"kubernetes.io/projected/b04f4bda-ca2e-46f8-bdc5-249520e25aca-kube-api-access-5g4rv\") pod \"auto-csr-approver-29561912-48pm5\" (UID: \"b04f4bda-ca2e-46f8-bdc5-249520e25aca\") " pod="openshift-infra/auto-csr-approver-29561912-48pm5" Mar 17 02:32:00 crc kubenswrapper[4755]: I0317 02:32:00.405018 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g4rv\" (UniqueName: 
\"kubernetes.io/projected/b04f4bda-ca2e-46f8-bdc5-249520e25aca-kube-api-access-5g4rv\") pod \"auto-csr-approver-29561912-48pm5\" (UID: \"b04f4bda-ca2e-46f8-bdc5-249520e25aca\") " pod="openshift-infra/auto-csr-approver-29561912-48pm5" Mar 17 02:32:00 crc kubenswrapper[4755]: I0317 02:32:00.433242 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g4rv\" (UniqueName: \"kubernetes.io/projected/b04f4bda-ca2e-46f8-bdc5-249520e25aca-kube-api-access-5g4rv\") pod \"auto-csr-approver-29561912-48pm5\" (UID: \"b04f4bda-ca2e-46f8-bdc5-249520e25aca\") " pod="openshift-infra/auto-csr-approver-29561912-48pm5" Mar 17 02:32:00 crc kubenswrapper[4755]: I0317 02:32:00.504802 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561912-48pm5" Mar 17 02:32:01 crc kubenswrapper[4755]: I0317 02:32:01.016245 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561912-48pm5"] Mar 17 02:32:01 crc kubenswrapper[4755]: I0317 02:32:01.337516 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561912-48pm5" event={"ID":"b04f4bda-ca2e-46f8-bdc5-249520e25aca","Type":"ContainerStarted","Data":"a71bb8dd0fac77e1b87b39b08b07ef1578e307f011eab65e6a5cbc0d51fdb340"} Mar 17 02:32:03 crc kubenswrapper[4755]: I0317 02:32:03.363309 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561912-48pm5" event={"ID":"b04f4bda-ca2e-46f8-bdc5-249520e25aca","Type":"ContainerStarted","Data":"5a0ea8ce9306207b75d73f362a6cdd29242a2414219249cf20f776f201abadb5"} Mar 17 02:32:03 crc kubenswrapper[4755]: I0317 02:32:03.391898 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561912-48pm5" podStartSLOduration=2.208167362 podStartE2EDuration="3.391872514s" podCreationTimestamp="2026-03-17 02:32:00 +0000 UTC" 
firstStartedPulling="2026-03-17 02:32:01.01336465 +0000 UTC m=+7795.772816923" lastFinishedPulling="2026-03-17 02:32:02.197069792 +0000 UTC m=+7796.956522075" observedRunningTime="2026-03-17 02:32:03.385539503 +0000 UTC m=+7798.144991826" watchObservedRunningTime="2026-03-17 02:32:03.391872514 +0000 UTC m=+7798.151324847" Mar 17 02:32:04 crc kubenswrapper[4755]: I0317 02:32:04.382346 4755 generic.go:334] "Generic (PLEG): container finished" podID="b04f4bda-ca2e-46f8-bdc5-249520e25aca" containerID="5a0ea8ce9306207b75d73f362a6cdd29242a2414219249cf20f776f201abadb5" exitCode=0 Mar 17 02:32:04 crc kubenswrapper[4755]: I0317 02:32:04.382514 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561912-48pm5" event={"ID":"b04f4bda-ca2e-46f8-bdc5-249520e25aca","Type":"ContainerDied","Data":"5a0ea8ce9306207b75d73f362a6cdd29242a2414219249cf20f776f201abadb5"} Mar 17 02:32:05 crc kubenswrapper[4755]: I0317 02:32:05.857351 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561912-48pm5" Mar 17 02:32:05 crc kubenswrapper[4755]: I0317 02:32:05.940648 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g4rv\" (UniqueName: \"kubernetes.io/projected/b04f4bda-ca2e-46f8-bdc5-249520e25aca-kube-api-access-5g4rv\") pod \"b04f4bda-ca2e-46f8-bdc5-249520e25aca\" (UID: \"b04f4bda-ca2e-46f8-bdc5-249520e25aca\") " Mar 17 02:32:05 crc kubenswrapper[4755]: I0317 02:32:05.949213 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b04f4bda-ca2e-46f8-bdc5-249520e25aca-kube-api-access-5g4rv" (OuterVolumeSpecName: "kube-api-access-5g4rv") pod "b04f4bda-ca2e-46f8-bdc5-249520e25aca" (UID: "b04f4bda-ca2e-46f8-bdc5-249520e25aca"). InnerVolumeSpecName "kube-api-access-5g4rv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:32:06 crc kubenswrapper[4755]: I0317 02:32:06.043454 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g4rv\" (UniqueName: \"kubernetes.io/projected/b04f4bda-ca2e-46f8-bdc5-249520e25aca-kube-api-access-5g4rv\") on node \"crc\" DevicePath \"\"" Mar 17 02:32:06 crc kubenswrapper[4755]: I0317 02:32:06.406023 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561912-48pm5" event={"ID":"b04f4bda-ca2e-46f8-bdc5-249520e25aca","Type":"ContainerDied","Data":"a71bb8dd0fac77e1b87b39b08b07ef1578e307f011eab65e6a5cbc0d51fdb340"} Mar 17 02:32:06 crc kubenswrapper[4755]: I0317 02:32:06.406063 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a71bb8dd0fac77e1b87b39b08b07ef1578e307f011eab65e6a5cbc0d51fdb340" Mar 17 02:32:06 crc kubenswrapper[4755]: I0317 02:32:06.406112 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561912-48pm5" Mar 17 02:32:06 crc kubenswrapper[4755]: I0317 02:32:06.445643 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561906-n5qcz"] Mar 17 02:32:06 crc kubenswrapper[4755]: I0317 02:32:06.454636 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561906-n5qcz"] Mar 17 02:32:08 crc kubenswrapper[4755]: I0317 02:32:08.265893 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69ae8f94-07ee-49f0-939f-a1355a3d9657" path="/var/lib/kubelet/pods/69ae8f94-07ee-49f0-939f-a1355a3d9657/volumes" Mar 17 02:32:09 crc kubenswrapper[4755]: I0317 02:32:09.970844 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hpbt5" podUID="49a7d298-cb80-4170-a34b-783dab04a19d" containerName="registry-server" probeResult="failure" output=< Mar 17 02:32:09 crc kubenswrapper[4755]: 
timeout: failed to connect service ":50051" within 1s Mar 17 02:32:09 crc kubenswrapper[4755]: > Mar 17 02:32:19 crc kubenswrapper[4755]: I0317 02:32:19.961716 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hpbt5" podUID="49a7d298-cb80-4170-a34b-783dab04a19d" containerName="registry-server" probeResult="failure" output=< Mar 17 02:32:19 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:32:19 crc kubenswrapper[4755]: > Mar 17 02:32:21 crc kubenswrapper[4755]: I0317 02:32:21.928178 4755 scope.go:117] "RemoveContainer" containerID="03e2dfe075f5cac06a02bba96ad2a3d4386fd427e8e166cc37239b92e4999570" Mar 17 02:32:29 crc kubenswrapper[4755]: I0317 02:32:29.950488 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hpbt5" podUID="49a7d298-cb80-4170-a34b-783dab04a19d" containerName="registry-server" probeResult="failure" output=< Mar 17 02:32:29 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:32:29 crc kubenswrapper[4755]: > Mar 17 02:32:38 crc kubenswrapper[4755]: I0317 02:32:38.967519 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hpbt5" Mar 17 02:32:39 crc kubenswrapper[4755]: I0317 02:32:39.033928 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hpbt5" Mar 17 02:32:39 crc kubenswrapper[4755]: I0317 02:32:39.215913 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hpbt5"] Mar 17 02:32:40 crc kubenswrapper[4755]: I0317 02:32:40.852537 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hpbt5" podUID="49a7d298-cb80-4170-a34b-783dab04a19d" containerName="registry-server" 
containerID="cri-o://ed239f787db6163a339434a1cac9c3a84ad553285020dbc7c90fc0294b415a31" gracePeriod=2 Mar 17 02:32:41 crc kubenswrapper[4755]: I0317 02:32:41.583314 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hpbt5" Mar 17 02:32:41 crc kubenswrapper[4755]: I0317 02:32:41.734954 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frs79\" (UniqueName: \"kubernetes.io/projected/49a7d298-cb80-4170-a34b-783dab04a19d-kube-api-access-frs79\") pod \"49a7d298-cb80-4170-a34b-783dab04a19d\" (UID: \"49a7d298-cb80-4170-a34b-783dab04a19d\") " Mar 17 02:32:41 crc kubenswrapper[4755]: I0317 02:32:41.735125 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a7d298-cb80-4170-a34b-783dab04a19d-catalog-content\") pod \"49a7d298-cb80-4170-a34b-783dab04a19d\" (UID: \"49a7d298-cb80-4170-a34b-783dab04a19d\") " Mar 17 02:32:41 crc kubenswrapper[4755]: I0317 02:32:41.735293 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a7d298-cb80-4170-a34b-783dab04a19d-utilities\") pod \"49a7d298-cb80-4170-a34b-783dab04a19d\" (UID: \"49a7d298-cb80-4170-a34b-783dab04a19d\") " Mar 17 02:32:41 crc kubenswrapper[4755]: I0317 02:32:41.737500 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a7d298-cb80-4170-a34b-783dab04a19d-utilities" (OuterVolumeSpecName: "utilities") pod "49a7d298-cb80-4170-a34b-783dab04a19d" (UID: "49a7d298-cb80-4170-a34b-783dab04a19d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:32:41 crc kubenswrapper[4755]: I0317 02:32:41.742924 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a7d298-cb80-4170-a34b-783dab04a19d-kube-api-access-frs79" (OuterVolumeSpecName: "kube-api-access-frs79") pod "49a7d298-cb80-4170-a34b-783dab04a19d" (UID: "49a7d298-cb80-4170-a34b-783dab04a19d"). InnerVolumeSpecName "kube-api-access-frs79". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:32:41 crc kubenswrapper[4755]: I0317 02:32:41.838607 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frs79\" (UniqueName: \"kubernetes.io/projected/49a7d298-cb80-4170-a34b-783dab04a19d-kube-api-access-frs79\") on node \"crc\" DevicePath \"\"" Mar 17 02:32:41 crc kubenswrapper[4755]: I0317 02:32:41.838886 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a7d298-cb80-4170-a34b-783dab04a19d-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:32:41 crc kubenswrapper[4755]: I0317 02:32:41.867591 4755 generic.go:334] "Generic (PLEG): container finished" podID="49a7d298-cb80-4170-a34b-783dab04a19d" containerID="ed239f787db6163a339434a1cac9c3a84ad553285020dbc7c90fc0294b415a31" exitCode=0 Mar 17 02:32:41 crc kubenswrapper[4755]: I0317 02:32:41.867649 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpbt5" event={"ID":"49a7d298-cb80-4170-a34b-783dab04a19d","Type":"ContainerDied","Data":"ed239f787db6163a339434a1cac9c3a84ad553285020dbc7c90fc0294b415a31"} Mar 17 02:32:41 crc kubenswrapper[4755]: I0317 02:32:41.867692 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpbt5" event={"ID":"49a7d298-cb80-4170-a34b-783dab04a19d","Type":"ContainerDied","Data":"fbf217c993ec18f4d811ed64221a3d8d22aa61050a6310633694f8aa2d753c8c"} Mar 17 02:32:41 crc kubenswrapper[4755]: I0317 
02:32:41.867701 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hpbt5" Mar 17 02:32:41 crc kubenswrapper[4755]: I0317 02:32:41.867713 4755 scope.go:117] "RemoveContainer" containerID="ed239f787db6163a339434a1cac9c3a84ad553285020dbc7c90fc0294b415a31" Mar 17 02:32:41 crc kubenswrapper[4755]: I0317 02:32:41.886191 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a7d298-cb80-4170-a34b-783dab04a19d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49a7d298-cb80-4170-a34b-783dab04a19d" (UID: "49a7d298-cb80-4170-a34b-783dab04a19d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:32:41 crc kubenswrapper[4755]: I0317 02:32:41.896708 4755 scope.go:117] "RemoveContainer" containerID="0203220f7e9862051291a8d9f78fddaba30d681129672d8c909779115ac896ca" Mar 17 02:32:41 crc kubenswrapper[4755]: I0317 02:32:41.926608 4755 scope.go:117] "RemoveContainer" containerID="e218f58ac7908542dc89a4cc9d442116960076827c44347063103fb8109a92d8" Mar 17 02:32:41 crc kubenswrapper[4755]: I0317 02:32:41.941591 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a7d298-cb80-4170-a34b-783dab04a19d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:32:42 crc kubenswrapper[4755]: I0317 02:32:42.010750 4755 scope.go:117] "RemoveContainer" containerID="ed239f787db6163a339434a1cac9c3a84ad553285020dbc7c90fc0294b415a31" Mar 17 02:32:42 crc kubenswrapper[4755]: E0317 02:32:42.011257 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed239f787db6163a339434a1cac9c3a84ad553285020dbc7c90fc0294b415a31\": container with ID starting with ed239f787db6163a339434a1cac9c3a84ad553285020dbc7c90fc0294b415a31 not found: ID does not exist" 
containerID="ed239f787db6163a339434a1cac9c3a84ad553285020dbc7c90fc0294b415a31" Mar 17 02:32:42 crc kubenswrapper[4755]: I0317 02:32:42.011319 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed239f787db6163a339434a1cac9c3a84ad553285020dbc7c90fc0294b415a31"} err="failed to get container status \"ed239f787db6163a339434a1cac9c3a84ad553285020dbc7c90fc0294b415a31\": rpc error: code = NotFound desc = could not find container \"ed239f787db6163a339434a1cac9c3a84ad553285020dbc7c90fc0294b415a31\": container with ID starting with ed239f787db6163a339434a1cac9c3a84ad553285020dbc7c90fc0294b415a31 not found: ID does not exist" Mar 17 02:32:42 crc kubenswrapper[4755]: I0317 02:32:42.011358 4755 scope.go:117] "RemoveContainer" containerID="0203220f7e9862051291a8d9f78fddaba30d681129672d8c909779115ac896ca" Mar 17 02:32:42 crc kubenswrapper[4755]: E0317 02:32:42.011908 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0203220f7e9862051291a8d9f78fddaba30d681129672d8c909779115ac896ca\": container with ID starting with 0203220f7e9862051291a8d9f78fddaba30d681129672d8c909779115ac896ca not found: ID does not exist" containerID="0203220f7e9862051291a8d9f78fddaba30d681129672d8c909779115ac896ca" Mar 17 02:32:42 crc kubenswrapper[4755]: I0317 02:32:42.011944 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0203220f7e9862051291a8d9f78fddaba30d681129672d8c909779115ac896ca"} err="failed to get container status \"0203220f7e9862051291a8d9f78fddaba30d681129672d8c909779115ac896ca\": rpc error: code = NotFound desc = could not find container \"0203220f7e9862051291a8d9f78fddaba30d681129672d8c909779115ac896ca\": container with ID starting with 0203220f7e9862051291a8d9f78fddaba30d681129672d8c909779115ac896ca not found: ID does not exist" Mar 17 02:32:42 crc kubenswrapper[4755]: I0317 02:32:42.011970 4755 scope.go:117] 
"RemoveContainer" containerID="e218f58ac7908542dc89a4cc9d442116960076827c44347063103fb8109a92d8" Mar 17 02:32:42 crc kubenswrapper[4755]: E0317 02:32:42.012345 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e218f58ac7908542dc89a4cc9d442116960076827c44347063103fb8109a92d8\": container with ID starting with e218f58ac7908542dc89a4cc9d442116960076827c44347063103fb8109a92d8 not found: ID does not exist" containerID="e218f58ac7908542dc89a4cc9d442116960076827c44347063103fb8109a92d8" Mar 17 02:32:42 crc kubenswrapper[4755]: I0317 02:32:42.012469 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e218f58ac7908542dc89a4cc9d442116960076827c44347063103fb8109a92d8"} err="failed to get container status \"e218f58ac7908542dc89a4cc9d442116960076827c44347063103fb8109a92d8\": rpc error: code = NotFound desc = could not find container \"e218f58ac7908542dc89a4cc9d442116960076827c44347063103fb8109a92d8\": container with ID starting with e218f58ac7908542dc89a4cc9d442116960076827c44347063103fb8109a92d8 not found: ID does not exist" Mar 17 02:32:42 crc kubenswrapper[4755]: I0317 02:32:42.211693 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hpbt5"] Mar 17 02:32:42 crc kubenswrapper[4755]: I0317 02:32:42.225500 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hpbt5"] Mar 17 02:32:42 crc kubenswrapper[4755]: I0317 02:32:42.265144 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a7d298-cb80-4170-a34b-783dab04a19d" path="/var/lib/kubelet/pods/49a7d298-cb80-4170-a34b-783dab04a19d/volumes" Mar 17 02:32:58 crc kubenswrapper[4755]: I0317 02:32:58.664987 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:32:58 crc kubenswrapper[4755]: I0317 02:32:58.666804 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:33:28 crc kubenswrapper[4755]: I0317 02:33:28.665585 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:33:28 crc kubenswrapper[4755]: I0317 02:33:28.666251 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:33:58 crc kubenswrapper[4755]: I0317 02:33:58.664744 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:33:58 crc kubenswrapper[4755]: I0317 02:33:58.665327 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 
02:33:58 crc kubenswrapper[4755]: I0317 02:33:58.665385 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 02:33:58 crc kubenswrapper[4755]: I0317 02:33:58.666924 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d7d9c4401707c4eaff6887df0c70850a0f00f0fc35a9833a320255ba98464548"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:33:58 crc kubenswrapper[4755]: I0317 02:33:58.667015 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://d7d9c4401707c4eaff6887df0c70850a0f00f0fc35a9833a320255ba98464548" gracePeriod=600 Mar 17 02:33:58 crc kubenswrapper[4755]: I0317 02:33:58.924174 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="d7d9c4401707c4eaff6887df0c70850a0f00f0fc35a9833a320255ba98464548" exitCode=0 Mar 17 02:33:58 crc kubenswrapper[4755]: I0317 02:33:58.924327 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"d7d9c4401707c4eaff6887df0c70850a0f00f0fc35a9833a320255ba98464548"} Mar 17 02:33:58 crc kubenswrapper[4755]: I0317 02:33:58.924490 4755 scope.go:117] "RemoveContainer" containerID="c35d193d2a1ce211b0e9b6e5d47b732248a6b3d4e6cafeb27aa0b000bd3f8943" Mar 17 02:33:59 crc kubenswrapper[4755]: I0317 02:33:59.937949 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" 
event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d"} Mar 17 02:34:00 crc kubenswrapper[4755]: I0317 02:34:00.150367 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561914-pj7hp"] Mar 17 02:34:00 crc kubenswrapper[4755]: E0317 02:34:00.151029 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a7d298-cb80-4170-a34b-783dab04a19d" containerName="registry-server" Mar 17 02:34:00 crc kubenswrapper[4755]: I0317 02:34:00.151044 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a7d298-cb80-4170-a34b-783dab04a19d" containerName="registry-server" Mar 17 02:34:00 crc kubenswrapper[4755]: E0317 02:34:00.151075 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b04f4bda-ca2e-46f8-bdc5-249520e25aca" containerName="oc" Mar 17 02:34:00 crc kubenswrapper[4755]: I0317 02:34:00.151084 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b04f4bda-ca2e-46f8-bdc5-249520e25aca" containerName="oc" Mar 17 02:34:00 crc kubenswrapper[4755]: E0317 02:34:00.151111 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a7d298-cb80-4170-a34b-783dab04a19d" containerName="extract-content" Mar 17 02:34:00 crc kubenswrapper[4755]: I0317 02:34:00.151120 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a7d298-cb80-4170-a34b-783dab04a19d" containerName="extract-content" Mar 17 02:34:00 crc kubenswrapper[4755]: E0317 02:34:00.151158 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a7d298-cb80-4170-a34b-783dab04a19d" containerName="extract-utilities" Mar 17 02:34:00 crc kubenswrapper[4755]: I0317 02:34:00.151167 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a7d298-cb80-4170-a34b-783dab04a19d" containerName="extract-utilities" Mar 17 02:34:00 crc kubenswrapper[4755]: I0317 02:34:00.151404 4755 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="b04f4bda-ca2e-46f8-bdc5-249520e25aca" containerName="oc" Mar 17 02:34:00 crc kubenswrapper[4755]: I0317 02:34:00.151471 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a7d298-cb80-4170-a34b-783dab04a19d" containerName="registry-server" Mar 17 02:34:00 crc kubenswrapper[4755]: I0317 02:34:00.152410 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561914-pj7hp" Mar 17 02:34:00 crc kubenswrapper[4755]: I0317 02:34:00.154639 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:34:00 crc kubenswrapper[4755]: I0317 02:34:00.154813 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:34:00 crc kubenswrapper[4755]: I0317 02:34:00.155243 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:34:00 crc kubenswrapper[4755]: I0317 02:34:00.162421 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561914-pj7hp"] Mar 17 02:34:00 crc kubenswrapper[4755]: I0317 02:34:00.211252 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k777w\" (UniqueName: \"kubernetes.io/projected/d86c2427-5a6a-418d-891c-3fb21f54b8c5-kube-api-access-k777w\") pod \"auto-csr-approver-29561914-pj7hp\" (UID: \"d86c2427-5a6a-418d-891c-3fb21f54b8c5\") " pod="openshift-infra/auto-csr-approver-29561914-pj7hp" Mar 17 02:34:00 crc kubenswrapper[4755]: I0317 02:34:00.312308 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k777w\" (UniqueName: \"kubernetes.io/projected/d86c2427-5a6a-418d-891c-3fb21f54b8c5-kube-api-access-k777w\") pod \"auto-csr-approver-29561914-pj7hp\" (UID: \"d86c2427-5a6a-418d-891c-3fb21f54b8c5\") " 
pod="openshift-infra/auto-csr-approver-29561914-pj7hp" Mar 17 02:34:00 crc kubenswrapper[4755]: I0317 02:34:00.330798 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k777w\" (UniqueName: \"kubernetes.io/projected/d86c2427-5a6a-418d-891c-3fb21f54b8c5-kube-api-access-k777w\") pod \"auto-csr-approver-29561914-pj7hp\" (UID: \"d86c2427-5a6a-418d-891c-3fb21f54b8c5\") " pod="openshift-infra/auto-csr-approver-29561914-pj7hp" Mar 17 02:34:00 crc kubenswrapper[4755]: I0317 02:34:00.519291 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561914-pj7hp" Mar 17 02:34:01 crc kubenswrapper[4755]: I0317 02:34:01.040259 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561914-pj7hp"] Mar 17 02:34:01 crc kubenswrapper[4755]: I0317 02:34:01.964194 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561914-pj7hp" event={"ID":"d86c2427-5a6a-418d-891c-3fb21f54b8c5","Type":"ContainerStarted","Data":"d28917d0d756d7d972eec27446d3612979bb6c18fb8511dfeed767934fea7881"} Mar 17 02:34:02 crc kubenswrapper[4755]: I0317 02:34:02.980234 4755 generic.go:334] "Generic (PLEG): container finished" podID="d86c2427-5a6a-418d-891c-3fb21f54b8c5" containerID="914629b879981692b45abfa7ea4a2400c4277408ada9f2172d34a62eb8d2bdf8" exitCode=0 Mar 17 02:34:02 crc kubenswrapper[4755]: I0317 02:34:02.980341 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561914-pj7hp" event={"ID":"d86c2427-5a6a-418d-891c-3fb21f54b8c5","Type":"ContainerDied","Data":"914629b879981692b45abfa7ea4a2400c4277408ada9f2172d34a62eb8d2bdf8"} Mar 17 02:34:04 crc kubenswrapper[4755]: I0317 02:34:04.391967 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561914-pj7hp" Mar 17 02:34:04 crc kubenswrapper[4755]: I0317 02:34:04.502679 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k777w\" (UniqueName: \"kubernetes.io/projected/d86c2427-5a6a-418d-891c-3fb21f54b8c5-kube-api-access-k777w\") pod \"d86c2427-5a6a-418d-891c-3fb21f54b8c5\" (UID: \"d86c2427-5a6a-418d-891c-3fb21f54b8c5\") " Mar 17 02:34:04 crc kubenswrapper[4755]: I0317 02:34:04.514254 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d86c2427-5a6a-418d-891c-3fb21f54b8c5-kube-api-access-k777w" (OuterVolumeSpecName: "kube-api-access-k777w") pod "d86c2427-5a6a-418d-891c-3fb21f54b8c5" (UID: "d86c2427-5a6a-418d-891c-3fb21f54b8c5"). InnerVolumeSpecName "kube-api-access-k777w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:34:04 crc kubenswrapper[4755]: I0317 02:34:04.606075 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k777w\" (UniqueName: \"kubernetes.io/projected/d86c2427-5a6a-418d-891c-3fb21f54b8c5-kube-api-access-k777w\") on node \"crc\" DevicePath \"\"" Mar 17 02:34:05 crc kubenswrapper[4755]: I0317 02:34:05.003305 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561914-pj7hp" event={"ID":"d86c2427-5a6a-418d-891c-3fb21f54b8c5","Type":"ContainerDied","Data":"d28917d0d756d7d972eec27446d3612979bb6c18fb8511dfeed767934fea7881"} Mar 17 02:34:05 crc kubenswrapper[4755]: I0317 02:34:05.003738 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d28917d0d756d7d972eec27446d3612979bb6c18fb8511dfeed767934fea7881" Mar 17 02:34:05 crc kubenswrapper[4755]: I0317 02:34:05.003415 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561914-pj7hp" Mar 17 02:34:05 crc kubenswrapper[4755]: I0317 02:34:05.489597 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561908-vxrxn"] Mar 17 02:34:05 crc kubenswrapper[4755]: I0317 02:34:05.502479 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561908-vxrxn"] Mar 17 02:34:06 crc kubenswrapper[4755]: I0317 02:34:06.266772 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="300e3839-ff11-45c8-b162-6c3646fa4173" path="/var/lib/kubelet/pods/300e3839-ff11-45c8-b162-6c3646fa4173/volumes" Mar 17 02:34:22 crc kubenswrapper[4755]: I0317 02:34:22.162961 4755 scope.go:117] "RemoveContainer" containerID="61da04e9e673b3c22b25ad8b4a49dd934213af28cf63d0d988a547b19b8951c0" Mar 17 02:35:08 crc kubenswrapper[4755]: I0317 02:35:08.521093 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4ksgn/must-gather-mgjmq"] Mar 17 02:35:08 crc kubenswrapper[4755]: E0317 02:35:08.521971 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d86c2427-5a6a-418d-891c-3fb21f54b8c5" containerName="oc" Mar 17 02:35:08 crc kubenswrapper[4755]: I0317 02:35:08.521985 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d86c2427-5a6a-418d-891c-3fb21f54b8c5" containerName="oc" Mar 17 02:35:08 crc kubenswrapper[4755]: I0317 02:35:08.522211 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d86c2427-5a6a-418d-891c-3fb21f54b8c5" containerName="oc" Mar 17 02:35:08 crc kubenswrapper[4755]: I0317 02:35:08.534668 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4ksgn/must-gather-mgjmq" Mar 17 02:35:08 crc kubenswrapper[4755]: I0317 02:35:08.541033 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4ksgn"/"kube-root-ca.crt" Mar 17 02:35:08 crc kubenswrapper[4755]: I0317 02:35:08.541295 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4ksgn"/"openshift-service-ca.crt" Mar 17 02:35:08 crc kubenswrapper[4755]: I0317 02:35:08.555366 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4ksgn/must-gather-mgjmq"] Mar 17 02:35:08 crc kubenswrapper[4755]: I0317 02:35:08.566247 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwm8q\" (UniqueName: \"kubernetes.io/projected/17133d6e-4462-478a-a0d9-7be42c11c490-kube-api-access-cwm8q\") pod \"must-gather-mgjmq\" (UID: \"17133d6e-4462-478a-a0d9-7be42c11c490\") " pod="openshift-must-gather-4ksgn/must-gather-mgjmq" Mar 17 02:35:08 crc kubenswrapper[4755]: I0317 02:35:08.566636 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/17133d6e-4462-478a-a0d9-7be42c11c490-must-gather-output\") pod \"must-gather-mgjmq\" (UID: \"17133d6e-4462-478a-a0d9-7be42c11c490\") " pod="openshift-must-gather-4ksgn/must-gather-mgjmq" Mar 17 02:35:08 crc kubenswrapper[4755]: I0317 02:35:08.669658 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/17133d6e-4462-478a-a0d9-7be42c11c490-must-gather-output\") pod \"must-gather-mgjmq\" (UID: \"17133d6e-4462-478a-a0d9-7be42c11c490\") " pod="openshift-must-gather-4ksgn/must-gather-mgjmq" Mar 17 02:35:08 crc kubenswrapper[4755]: I0317 02:35:08.669942 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cwm8q\" (UniqueName: \"kubernetes.io/projected/17133d6e-4462-478a-a0d9-7be42c11c490-kube-api-access-cwm8q\") pod \"must-gather-mgjmq\" (UID: \"17133d6e-4462-478a-a0d9-7be42c11c490\") " pod="openshift-must-gather-4ksgn/must-gather-mgjmq" Mar 17 02:35:08 crc kubenswrapper[4755]: I0317 02:35:08.670552 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/17133d6e-4462-478a-a0d9-7be42c11c490-must-gather-output\") pod \"must-gather-mgjmq\" (UID: \"17133d6e-4462-478a-a0d9-7be42c11c490\") " pod="openshift-must-gather-4ksgn/must-gather-mgjmq" Mar 17 02:35:08 crc kubenswrapper[4755]: I0317 02:35:08.694207 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwm8q\" (UniqueName: \"kubernetes.io/projected/17133d6e-4462-478a-a0d9-7be42c11c490-kube-api-access-cwm8q\") pod \"must-gather-mgjmq\" (UID: \"17133d6e-4462-478a-a0d9-7be42c11c490\") " pod="openshift-must-gather-4ksgn/must-gather-mgjmq" Mar 17 02:35:08 crc kubenswrapper[4755]: I0317 02:35:08.883732 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4ksgn/must-gather-mgjmq" Mar 17 02:35:09 crc kubenswrapper[4755]: I0317 02:35:09.408407 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4ksgn/must-gather-mgjmq"] Mar 17 02:35:09 crc kubenswrapper[4755]: I0317 02:35:09.801109 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4ksgn/must-gather-mgjmq" event={"ID":"17133d6e-4462-478a-a0d9-7be42c11c490","Type":"ContainerStarted","Data":"91985e41a6e8fd549b14ff60e20920836a71b3aea61278b681708512ba30f29f"} Mar 17 02:35:10 crc kubenswrapper[4755]: I0317 02:35:10.812259 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4ksgn/must-gather-mgjmq" event={"ID":"17133d6e-4462-478a-a0d9-7be42c11c490","Type":"ContainerStarted","Data":"749fe593a88d46c123f10658aa07ad773bb716beffaedffd889c093fd9d81d79"} Mar 17 02:35:10 crc kubenswrapper[4755]: I0317 02:35:10.812781 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4ksgn/must-gather-mgjmq" event={"ID":"17133d6e-4462-478a-a0d9-7be42c11c490","Type":"ContainerStarted","Data":"35e6db7febbabb1242443666d05f762e456247461125f4fc50ca7024a5f1ce06"} Mar 17 02:35:10 crc kubenswrapper[4755]: I0317 02:35:10.840041 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4ksgn/must-gather-mgjmq" podStartSLOduration=2.840022476 podStartE2EDuration="2.840022476s" podCreationTimestamp="2026-03-17 02:35:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 02:35:10.838430593 +0000 UTC m=+7985.597882916" watchObservedRunningTime="2026-03-17 02:35:10.840022476 +0000 UTC m=+7985.599474749" Mar 17 02:35:15 crc kubenswrapper[4755]: I0317 02:35:15.355830 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4ksgn/crc-debug-7fv8m"] Mar 17 02:35:15 crc kubenswrapper[4755]: 
I0317 02:35:15.357647 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4ksgn/crc-debug-7fv8m" Mar 17 02:35:15 crc kubenswrapper[4755]: I0317 02:35:15.362604 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4ksgn"/"default-dockercfg-4kdj8" Mar 17 02:35:15 crc kubenswrapper[4755]: I0317 02:35:15.416636 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18652bd2-a806-47fc-b3bb-47bb0109dbec-host\") pod \"crc-debug-7fv8m\" (UID: \"18652bd2-a806-47fc-b3bb-47bb0109dbec\") " pod="openshift-must-gather-4ksgn/crc-debug-7fv8m" Mar 17 02:35:15 crc kubenswrapper[4755]: I0317 02:35:15.416855 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w44t8\" (UniqueName: \"kubernetes.io/projected/18652bd2-a806-47fc-b3bb-47bb0109dbec-kube-api-access-w44t8\") pod \"crc-debug-7fv8m\" (UID: \"18652bd2-a806-47fc-b3bb-47bb0109dbec\") " pod="openshift-must-gather-4ksgn/crc-debug-7fv8m" Mar 17 02:35:15 crc kubenswrapper[4755]: I0317 02:35:15.518520 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w44t8\" (UniqueName: \"kubernetes.io/projected/18652bd2-a806-47fc-b3bb-47bb0109dbec-kube-api-access-w44t8\") pod \"crc-debug-7fv8m\" (UID: \"18652bd2-a806-47fc-b3bb-47bb0109dbec\") " pod="openshift-must-gather-4ksgn/crc-debug-7fv8m" Mar 17 02:35:15 crc kubenswrapper[4755]: I0317 02:35:15.518617 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18652bd2-a806-47fc-b3bb-47bb0109dbec-host\") pod \"crc-debug-7fv8m\" (UID: \"18652bd2-a806-47fc-b3bb-47bb0109dbec\") " pod="openshift-must-gather-4ksgn/crc-debug-7fv8m" Mar 17 02:35:15 crc kubenswrapper[4755]: I0317 02:35:15.520427 4755 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18652bd2-a806-47fc-b3bb-47bb0109dbec-host\") pod \"crc-debug-7fv8m\" (UID: \"18652bd2-a806-47fc-b3bb-47bb0109dbec\") " pod="openshift-must-gather-4ksgn/crc-debug-7fv8m" Mar 17 02:35:15 crc kubenswrapper[4755]: I0317 02:35:15.538629 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w44t8\" (UniqueName: \"kubernetes.io/projected/18652bd2-a806-47fc-b3bb-47bb0109dbec-kube-api-access-w44t8\") pod \"crc-debug-7fv8m\" (UID: \"18652bd2-a806-47fc-b3bb-47bb0109dbec\") " pod="openshift-must-gather-4ksgn/crc-debug-7fv8m" Mar 17 02:35:15 crc kubenswrapper[4755]: I0317 02:35:15.680389 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4ksgn/crc-debug-7fv8m" Mar 17 02:35:15 crc kubenswrapper[4755]: W0317 02:35:15.730823 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18652bd2_a806_47fc_b3bb_47bb0109dbec.slice/crio-fdd08f0491b24f70b196d32f6ae41ca9ba4d38d6916edb6c5edac169cdf161eb WatchSource:0}: Error finding container fdd08f0491b24f70b196d32f6ae41ca9ba4d38d6916edb6c5edac169cdf161eb: Status 404 returned error can't find the container with id fdd08f0491b24f70b196d32f6ae41ca9ba4d38d6916edb6c5edac169cdf161eb Mar 17 02:35:15 crc kubenswrapper[4755]: I0317 02:35:15.899508 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4ksgn/crc-debug-7fv8m" event={"ID":"18652bd2-a806-47fc-b3bb-47bb0109dbec","Type":"ContainerStarted","Data":"fdd08f0491b24f70b196d32f6ae41ca9ba4d38d6916edb6c5edac169cdf161eb"} Mar 17 02:35:16 crc kubenswrapper[4755]: I0317 02:35:16.910015 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4ksgn/crc-debug-7fv8m" event={"ID":"18652bd2-a806-47fc-b3bb-47bb0109dbec","Type":"ContainerStarted","Data":"f139699882bf226642d4db5c5a850f896590c47c576d8345edb976bfada8ec7b"} Mar 
17 02:35:16 crc kubenswrapper[4755]: I0317 02:35:16.973023 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4ksgn/crc-debug-7fv8m" podStartSLOduration=1.973002185 podStartE2EDuration="1.973002185s" podCreationTimestamp="2026-03-17 02:35:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 02:35:16.931822454 +0000 UTC m=+7991.691274737" watchObservedRunningTime="2026-03-17 02:35:16.973002185 +0000 UTC m=+7991.732454468" Mar 17 02:35:40 crc kubenswrapper[4755]: I0317 02:35:40.470106 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hjqtd"] Mar 17 02:35:40 crc kubenswrapper[4755]: I0317 02:35:40.485207 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hjqtd" Mar 17 02:35:40 crc kubenswrapper[4755]: I0317 02:35:40.531268 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hjqtd"] Mar 17 02:35:40 crc kubenswrapper[4755]: I0317 02:35:40.566741 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be4830d7-9731-4b72-9840-c6b9f03bce49-utilities\") pod \"certified-operators-hjqtd\" (UID: \"be4830d7-9731-4b72-9840-c6b9f03bce49\") " pod="openshift-marketplace/certified-operators-hjqtd" Mar 17 02:35:40 crc kubenswrapper[4755]: I0317 02:35:40.566804 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgvqk\" (UniqueName: \"kubernetes.io/projected/be4830d7-9731-4b72-9840-c6b9f03bce49-kube-api-access-vgvqk\") pod \"certified-operators-hjqtd\" (UID: \"be4830d7-9731-4b72-9840-c6b9f03bce49\") " pod="openshift-marketplace/certified-operators-hjqtd" Mar 17 02:35:40 crc kubenswrapper[4755]: I0317 
02:35:40.566995 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be4830d7-9731-4b72-9840-c6b9f03bce49-catalog-content\") pod \"certified-operators-hjqtd\" (UID: \"be4830d7-9731-4b72-9840-c6b9f03bce49\") " pod="openshift-marketplace/certified-operators-hjqtd" Mar 17 02:35:40 crc kubenswrapper[4755]: I0317 02:35:40.668759 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be4830d7-9731-4b72-9840-c6b9f03bce49-utilities\") pod \"certified-operators-hjqtd\" (UID: \"be4830d7-9731-4b72-9840-c6b9f03bce49\") " pod="openshift-marketplace/certified-operators-hjqtd" Mar 17 02:35:40 crc kubenswrapper[4755]: I0317 02:35:40.669088 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgvqk\" (UniqueName: \"kubernetes.io/projected/be4830d7-9731-4b72-9840-c6b9f03bce49-kube-api-access-vgvqk\") pod \"certified-operators-hjqtd\" (UID: \"be4830d7-9731-4b72-9840-c6b9f03bce49\") " pod="openshift-marketplace/certified-operators-hjqtd" Mar 17 02:35:40 crc kubenswrapper[4755]: I0317 02:35:40.669220 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be4830d7-9731-4b72-9840-c6b9f03bce49-catalog-content\") pod \"certified-operators-hjqtd\" (UID: \"be4830d7-9731-4b72-9840-c6b9f03bce49\") " pod="openshift-marketplace/certified-operators-hjqtd" Mar 17 02:35:40 crc kubenswrapper[4755]: I0317 02:35:40.669233 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be4830d7-9731-4b72-9840-c6b9f03bce49-utilities\") pod \"certified-operators-hjqtd\" (UID: \"be4830d7-9731-4b72-9840-c6b9f03bce49\") " pod="openshift-marketplace/certified-operators-hjqtd" Mar 17 02:35:40 crc kubenswrapper[4755]: I0317 02:35:40.669808 
4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be4830d7-9731-4b72-9840-c6b9f03bce49-catalog-content\") pod \"certified-operators-hjqtd\" (UID: \"be4830d7-9731-4b72-9840-c6b9f03bce49\") " pod="openshift-marketplace/certified-operators-hjqtd" Mar 17 02:35:40 crc kubenswrapper[4755]: I0317 02:35:40.698223 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgvqk\" (UniqueName: \"kubernetes.io/projected/be4830d7-9731-4b72-9840-c6b9f03bce49-kube-api-access-vgvqk\") pod \"certified-operators-hjqtd\" (UID: \"be4830d7-9731-4b72-9840-c6b9f03bce49\") " pod="openshift-marketplace/certified-operators-hjqtd" Mar 17 02:35:40 crc kubenswrapper[4755]: I0317 02:35:40.868366 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hjqtd" Mar 17 02:35:41 crc kubenswrapper[4755]: I0317 02:35:41.295685 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hjqtd"] Mar 17 02:35:42 crc kubenswrapper[4755]: I0317 02:35:42.219166 4755 generic.go:334] "Generic (PLEG): container finished" podID="be4830d7-9731-4b72-9840-c6b9f03bce49" containerID="da7f406357e8022633b47b5a74898e00469dc26b91022d08fa272b770e0ab55e" exitCode=0 Mar 17 02:35:42 crc kubenswrapper[4755]: I0317 02:35:42.219335 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hjqtd" event={"ID":"be4830d7-9731-4b72-9840-c6b9f03bce49","Type":"ContainerDied","Data":"da7f406357e8022633b47b5a74898e00469dc26b91022d08fa272b770e0ab55e"} Mar 17 02:35:42 crc kubenswrapper[4755]: I0317 02:35:42.219567 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hjqtd" event={"ID":"be4830d7-9731-4b72-9840-c6b9f03bce49","Type":"ContainerStarted","Data":"3c4566be1ea14d403e7b944000d84862cdb1287fd7ce9fa4089ed02727583265"} Mar 
17 02:35:43 crc kubenswrapper[4755]: I0317 02:35:43.232148 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hjqtd" event={"ID":"be4830d7-9731-4b72-9840-c6b9f03bce49","Type":"ContainerStarted","Data":"cb521691eb366f940005fe15216174043ba6643568e433aca41079c03aa1a3df"} Mar 17 02:35:45 crc kubenswrapper[4755]: I0317 02:35:45.270517 4755 generic.go:334] "Generic (PLEG): container finished" podID="be4830d7-9731-4b72-9840-c6b9f03bce49" containerID="cb521691eb366f940005fe15216174043ba6643568e433aca41079c03aa1a3df" exitCode=0 Mar 17 02:35:45 crc kubenswrapper[4755]: I0317 02:35:45.271351 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hjqtd" event={"ID":"be4830d7-9731-4b72-9840-c6b9f03bce49","Type":"ContainerDied","Data":"cb521691eb366f940005fe15216174043ba6643568e433aca41079c03aa1a3df"} Mar 17 02:35:46 crc kubenswrapper[4755]: I0317 02:35:46.305099 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hjqtd" event={"ID":"be4830d7-9731-4b72-9840-c6b9f03bce49","Type":"ContainerStarted","Data":"5c7b78f738f3d1b7ddd958b46c863d3e651c9509128f46ed75428f08162947de"} Mar 17 02:35:46 crc kubenswrapper[4755]: I0317 02:35:46.334666 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hjqtd" podStartSLOduration=2.574028892 podStartE2EDuration="6.334646362s" podCreationTimestamp="2026-03-17 02:35:40 +0000 UTC" firstStartedPulling="2026-03-17 02:35:42.226165128 +0000 UTC m=+8016.985617411" lastFinishedPulling="2026-03-17 02:35:45.986782598 +0000 UTC m=+8020.746234881" observedRunningTime="2026-03-17 02:35:46.329994377 +0000 UTC m=+8021.089446660" watchObservedRunningTime="2026-03-17 02:35:46.334646362 +0000 UTC m=+8021.094098645" Mar 17 02:35:50 crc kubenswrapper[4755]: I0317 02:35:50.868988 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-hjqtd" Mar 17 02:35:50 crc kubenswrapper[4755]: I0317 02:35:50.869458 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hjqtd" Mar 17 02:35:51 crc kubenswrapper[4755]: I0317 02:35:51.931621 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-hjqtd" podUID="be4830d7-9731-4b72-9840-c6b9f03bce49" containerName="registry-server" probeResult="failure" output=< Mar 17 02:35:51 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:35:51 crc kubenswrapper[4755]: > Mar 17 02:35:58 crc kubenswrapper[4755]: I0317 02:35:58.419680 4755 generic.go:334] "Generic (PLEG): container finished" podID="18652bd2-a806-47fc-b3bb-47bb0109dbec" containerID="f139699882bf226642d4db5c5a850f896590c47c576d8345edb976bfada8ec7b" exitCode=0 Mar 17 02:35:58 crc kubenswrapper[4755]: I0317 02:35:58.419769 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4ksgn/crc-debug-7fv8m" event={"ID":"18652bd2-a806-47fc-b3bb-47bb0109dbec","Type":"ContainerDied","Data":"f139699882bf226642d4db5c5a850f896590c47c576d8345edb976bfada8ec7b"} Mar 17 02:35:59 crc kubenswrapper[4755]: I0317 02:35:59.543504 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4ksgn/crc-debug-7fv8m" Mar 17 02:35:59 crc kubenswrapper[4755]: I0317 02:35:59.592072 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4ksgn/crc-debug-7fv8m"] Mar 17 02:35:59 crc kubenswrapper[4755]: I0317 02:35:59.600731 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w44t8\" (UniqueName: \"kubernetes.io/projected/18652bd2-a806-47fc-b3bb-47bb0109dbec-kube-api-access-w44t8\") pod \"18652bd2-a806-47fc-b3bb-47bb0109dbec\" (UID: \"18652bd2-a806-47fc-b3bb-47bb0109dbec\") " Mar 17 02:35:59 crc kubenswrapper[4755]: I0317 02:35:59.600862 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18652bd2-a806-47fc-b3bb-47bb0109dbec-host\") pod \"18652bd2-a806-47fc-b3bb-47bb0109dbec\" (UID: \"18652bd2-a806-47fc-b3bb-47bb0109dbec\") " Mar 17 02:35:59 crc kubenswrapper[4755]: I0317 02:35:59.601650 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18652bd2-a806-47fc-b3bb-47bb0109dbec-host" (OuterVolumeSpecName: "host") pod "18652bd2-a806-47fc-b3bb-47bb0109dbec" (UID: "18652bd2-a806-47fc-b3bb-47bb0109dbec"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 02:35:59 crc kubenswrapper[4755]: I0317 02:35:59.601967 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4ksgn/crc-debug-7fv8m"] Mar 17 02:35:59 crc kubenswrapper[4755]: I0317 02:35:59.621177 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18652bd2-a806-47fc-b3bb-47bb0109dbec-kube-api-access-w44t8" (OuterVolumeSpecName: "kube-api-access-w44t8") pod "18652bd2-a806-47fc-b3bb-47bb0109dbec" (UID: "18652bd2-a806-47fc-b3bb-47bb0109dbec"). InnerVolumeSpecName "kube-api-access-w44t8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:35:59 crc kubenswrapper[4755]: I0317 02:35:59.703287 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w44t8\" (UniqueName: \"kubernetes.io/projected/18652bd2-a806-47fc-b3bb-47bb0109dbec-kube-api-access-w44t8\") on node \"crc\" DevicePath \"\"" Mar 17 02:35:59 crc kubenswrapper[4755]: I0317 02:35:59.703328 4755 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18652bd2-a806-47fc-b3bb-47bb0109dbec-host\") on node \"crc\" DevicePath \"\"" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.158860 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561916-7qb2v"] Mar 17 02:36:00 crc kubenswrapper[4755]: E0317 02:36:00.159548 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18652bd2-a806-47fc-b3bb-47bb0109dbec" containerName="container-00" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.159565 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="18652bd2-a806-47fc-b3bb-47bb0109dbec" containerName="container-00" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.159867 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="18652bd2-a806-47fc-b3bb-47bb0109dbec" containerName="container-00" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.160574 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561916-7qb2v" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.163295 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.163557 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.163830 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.177595 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561916-7qb2v"] Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.213877 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52g4l\" (UniqueName: \"kubernetes.io/projected/f5109517-910d-4f76-899d-2749000ed633-kube-api-access-52g4l\") pod \"auto-csr-approver-29561916-7qb2v\" (UID: \"f5109517-910d-4f76-899d-2749000ed633\") " pod="openshift-infra/auto-csr-approver-29561916-7qb2v" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.268466 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18652bd2-a806-47fc-b3bb-47bb0109dbec" path="/var/lib/kubelet/pods/18652bd2-a806-47fc-b3bb-47bb0109dbec/volumes" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.315918 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52g4l\" (UniqueName: \"kubernetes.io/projected/f5109517-910d-4f76-899d-2749000ed633-kube-api-access-52g4l\") pod \"auto-csr-approver-29561916-7qb2v\" (UID: \"f5109517-910d-4f76-899d-2749000ed633\") " pod="openshift-infra/auto-csr-approver-29561916-7qb2v" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.333835 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-52g4l\" (UniqueName: \"kubernetes.io/projected/f5109517-910d-4f76-899d-2749000ed633-kube-api-access-52g4l\") pod \"auto-csr-approver-29561916-7qb2v\" (UID: \"f5109517-910d-4f76-899d-2749000ed633\") " pod="openshift-infra/auto-csr-approver-29561916-7qb2v" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.440977 4755 scope.go:117] "RemoveContainer" containerID="f139699882bf226642d4db5c5a850f896590c47c576d8345edb976bfada8ec7b" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.441039 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4ksgn/crc-debug-7fv8m" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.482640 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561916-7qb2v" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.861192 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4ksgn/crc-debug-cvzmj"] Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.864136 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4ksgn/crc-debug-cvzmj" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.865911 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4ksgn"/"default-dockercfg-4kdj8" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.927931 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57161278-b2d8-4abc-899e-1dd4776568bf-host\") pod \"crc-debug-cvzmj\" (UID: \"57161278-b2d8-4abc-899e-1dd4776568bf\") " pod="openshift-must-gather-4ksgn/crc-debug-cvzmj" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.928016 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22b2b\" (UniqueName: \"kubernetes.io/projected/57161278-b2d8-4abc-899e-1dd4776568bf-kube-api-access-22b2b\") pod \"crc-debug-cvzmj\" (UID: \"57161278-b2d8-4abc-899e-1dd4776568bf\") " pod="openshift-must-gather-4ksgn/crc-debug-cvzmj" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.935945 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hjqtd" Mar 17 02:36:00 crc kubenswrapper[4755]: I0317 02:36:00.990251 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hjqtd" Mar 17 02:36:01 crc kubenswrapper[4755]: I0317 02:36:01.030056 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57161278-b2d8-4abc-899e-1dd4776568bf-host\") pod \"crc-debug-cvzmj\" (UID: \"57161278-b2d8-4abc-899e-1dd4776568bf\") " pod="openshift-must-gather-4ksgn/crc-debug-cvzmj" Mar 17 02:36:01 crc kubenswrapper[4755]: I0317 02:36:01.030150 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22b2b\" (UniqueName: 
\"kubernetes.io/projected/57161278-b2d8-4abc-899e-1dd4776568bf-kube-api-access-22b2b\") pod \"crc-debug-cvzmj\" (UID: \"57161278-b2d8-4abc-899e-1dd4776568bf\") " pod="openshift-must-gather-4ksgn/crc-debug-cvzmj" Mar 17 02:36:01 crc kubenswrapper[4755]: I0317 02:36:01.030864 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57161278-b2d8-4abc-899e-1dd4776568bf-host\") pod \"crc-debug-cvzmj\" (UID: \"57161278-b2d8-4abc-899e-1dd4776568bf\") " pod="openshift-must-gather-4ksgn/crc-debug-cvzmj" Mar 17 02:36:01 crc kubenswrapper[4755]: I0317 02:36:01.052600 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22b2b\" (UniqueName: \"kubernetes.io/projected/57161278-b2d8-4abc-899e-1dd4776568bf-kube-api-access-22b2b\") pod \"crc-debug-cvzmj\" (UID: \"57161278-b2d8-4abc-899e-1dd4776568bf\") " pod="openshift-must-gather-4ksgn/crc-debug-cvzmj" Mar 17 02:36:01 crc kubenswrapper[4755]: I0317 02:36:01.083317 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561916-7qb2v"] Mar 17 02:36:01 crc kubenswrapper[4755]: I0317 02:36:01.172456 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hjqtd"] Mar 17 02:36:01 crc kubenswrapper[4755]: I0317 02:36:01.185066 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4ksgn/crc-debug-cvzmj" Mar 17 02:36:01 crc kubenswrapper[4755]: W0317 02:36:01.213136 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57161278_b2d8_4abc_899e_1dd4776568bf.slice/crio-e212e51a1742321d0287e5436725d98e9d2f80d177943c27b73b3d46f3da9c8a WatchSource:0}: Error finding container e212e51a1742321d0287e5436725d98e9d2f80d177943c27b73b3d46f3da9c8a: Status 404 returned error can't find the container with id e212e51a1742321d0287e5436725d98e9d2f80d177943c27b73b3d46f3da9c8a Mar 17 02:36:01 crc kubenswrapper[4755]: I0317 02:36:01.450553 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4ksgn/crc-debug-cvzmj" event={"ID":"57161278-b2d8-4abc-899e-1dd4776568bf","Type":"ContainerStarted","Data":"bb0141b3a05ee0ffc105364df59886aa101324e9cc2d1cc9b26ce6b50315b1f5"} Mar 17 02:36:01 crc kubenswrapper[4755]: I0317 02:36:01.450611 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4ksgn/crc-debug-cvzmj" event={"ID":"57161278-b2d8-4abc-899e-1dd4776568bf","Type":"ContainerStarted","Data":"e212e51a1742321d0287e5436725d98e9d2f80d177943c27b73b3d46f3da9c8a"} Mar 17 02:36:01 crc kubenswrapper[4755]: I0317 02:36:01.452969 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561916-7qb2v" event={"ID":"f5109517-910d-4f76-899d-2749000ed633","Type":"ContainerStarted","Data":"cea943c5c0b1f55e21e2fdd67f297f04345fc9a9b00260d054628dcd8998e4b2"} Mar 17 02:36:01 crc kubenswrapper[4755]: I0317 02:36:01.471145 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4ksgn/crc-debug-cvzmj" podStartSLOduration=1.471125847 podStartE2EDuration="1.471125847s" podCreationTimestamp="2026-03-17 02:36:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-17 02:36:01.46529314 +0000 UTC m=+8036.224745423" watchObservedRunningTime="2026-03-17 02:36:01.471125847 +0000 UTC m=+8036.230578130" Mar 17 02:36:02 crc kubenswrapper[4755]: I0317 02:36:02.473511 4755 generic.go:334] "Generic (PLEG): container finished" podID="57161278-b2d8-4abc-899e-1dd4776568bf" containerID="bb0141b3a05ee0ffc105364df59886aa101324e9cc2d1cc9b26ce6b50315b1f5" exitCode=0 Mar 17 02:36:02 crc kubenswrapper[4755]: I0317 02:36:02.474120 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hjqtd" podUID="be4830d7-9731-4b72-9840-c6b9f03bce49" containerName="registry-server" containerID="cri-o://5c7b78f738f3d1b7ddd958b46c863d3e651c9509128f46ed75428f08162947de" gracePeriod=2 Mar 17 02:36:02 crc kubenswrapper[4755]: I0317 02:36:02.474380 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4ksgn/crc-debug-cvzmj" event={"ID":"57161278-b2d8-4abc-899e-1dd4776568bf","Type":"ContainerDied","Data":"bb0141b3a05ee0ffc105364df59886aa101324e9cc2d1cc9b26ce6b50315b1f5"} Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.072134 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hjqtd" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.196977 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be4830d7-9731-4b72-9840-c6b9f03bce49-utilities\") pod \"be4830d7-9731-4b72-9840-c6b9f03bce49\" (UID: \"be4830d7-9731-4b72-9840-c6b9f03bce49\") " Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.197250 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be4830d7-9731-4b72-9840-c6b9f03bce49-catalog-content\") pod \"be4830d7-9731-4b72-9840-c6b9f03bce49\" (UID: \"be4830d7-9731-4b72-9840-c6b9f03bce49\") " Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.197361 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgvqk\" (UniqueName: \"kubernetes.io/projected/be4830d7-9731-4b72-9840-c6b9f03bce49-kube-api-access-vgvqk\") pod \"be4830d7-9731-4b72-9840-c6b9f03bce49\" (UID: \"be4830d7-9731-4b72-9840-c6b9f03bce49\") " Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.202342 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be4830d7-9731-4b72-9840-c6b9f03bce49-utilities" (OuterVolumeSpecName: "utilities") pod "be4830d7-9731-4b72-9840-c6b9f03bce49" (UID: "be4830d7-9731-4b72-9840-c6b9f03bce49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.202681 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be4830d7-9731-4b72-9840-c6b9f03bce49-kube-api-access-vgvqk" (OuterVolumeSpecName: "kube-api-access-vgvqk") pod "be4830d7-9731-4b72-9840-c6b9f03bce49" (UID: "be4830d7-9731-4b72-9840-c6b9f03bce49"). InnerVolumeSpecName "kube-api-access-vgvqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.254031 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be4830d7-9731-4b72-9840-c6b9f03bce49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be4830d7-9731-4b72-9840-c6b9f03bce49" (UID: "be4830d7-9731-4b72-9840-c6b9f03bce49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.299757 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be4830d7-9731-4b72-9840-c6b9f03bce49-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.299783 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be4830d7-9731-4b72-9840-c6b9f03bce49-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.299804 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgvqk\" (UniqueName: \"kubernetes.io/projected/be4830d7-9731-4b72-9840-c6b9f03bce49-kube-api-access-vgvqk\") on node \"crc\" DevicePath \"\"" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.486083 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561916-7qb2v" event={"ID":"f5109517-910d-4f76-899d-2749000ed633","Type":"ContainerStarted","Data":"39e6aaab3cfd985876d65e000bb307cf4d2cfc95e2fb0fd822cc699a2832d1d0"} Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.503955 4755 generic.go:334] "Generic (PLEG): container finished" podID="be4830d7-9731-4b72-9840-c6b9f03bce49" containerID="5c7b78f738f3d1b7ddd958b46c863d3e651c9509128f46ed75428f08162947de" exitCode=0 Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.504226 4755 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hjqtd" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.504202 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hjqtd" event={"ID":"be4830d7-9731-4b72-9840-c6b9f03bce49","Type":"ContainerDied","Data":"5c7b78f738f3d1b7ddd958b46c863d3e651c9509128f46ed75428f08162947de"} Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.504388 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hjqtd" event={"ID":"be4830d7-9731-4b72-9840-c6b9f03bce49","Type":"ContainerDied","Data":"3c4566be1ea14d403e7b944000d84862cdb1287fd7ce9fa4089ed02727583265"} Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.504425 4755 scope.go:117] "RemoveContainer" containerID="5c7b78f738f3d1b7ddd958b46c863d3e651c9509128f46ed75428f08162947de" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.516691 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561916-7qb2v" podStartSLOduration=2.5809288759999998 podStartE2EDuration="3.516671569s" podCreationTimestamp="2026-03-17 02:36:00 +0000 UTC" firstStartedPulling="2026-03-17 02:36:01.079591284 +0000 UTC m=+8035.839043567" lastFinishedPulling="2026-03-17 02:36:02.015333977 +0000 UTC m=+8036.774786260" observedRunningTime="2026-03-17 02:36:03.501193872 +0000 UTC m=+8038.260646165" watchObservedRunningTime="2026-03-17 02:36:03.516671569 +0000 UTC m=+8038.276123852" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.575951 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4ksgn/crc-debug-cvzmj" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.577899 4755 scope.go:117] "RemoveContainer" containerID="cb521691eb366f940005fe15216174043ba6643568e433aca41079c03aa1a3df" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.603920 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57161278-b2d8-4abc-899e-1dd4776568bf-host\") pod \"57161278-b2d8-4abc-899e-1dd4776568bf\" (UID: \"57161278-b2d8-4abc-899e-1dd4776568bf\") " Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.604141 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22b2b\" (UniqueName: \"kubernetes.io/projected/57161278-b2d8-4abc-899e-1dd4776568bf-kube-api-access-22b2b\") pod \"57161278-b2d8-4abc-899e-1dd4776568bf\" (UID: \"57161278-b2d8-4abc-899e-1dd4776568bf\") " Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.604530 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57161278-b2d8-4abc-899e-1dd4776568bf-host" (OuterVolumeSpecName: "host") pod "57161278-b2d8-4abc-899e-1dd4776568bf" (UID: "57161278-b2d8-4abc-899e-1dd4776568bf"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.605128 4755 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57161278-b2d8-4abc-899e-1dd4776568bf-host\") on node \"crc\" DevicePath \"\"" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.606868 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hjqtd"] Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.608846 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57161278-b2d8-4abc-899e-1dd4776568bf-kube-api-access-22b2b" (OuterVolumeSpecName: "kube-api-access-22b2b") pod "57161278-b2d8-4abc-899e-1dd4776568bf" (UID: "57161278-b2d8-4abc-899e-1dd4776568bf"). InnerVolumeSpecName "kube-api-access-22b2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.633731 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hjqtd"] Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.670021 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4ksgn/crc-debug-cvzmj"] Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.681038 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4ksgn/crc-debug-cvzmj"] Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.701501 4755 scope.go:117] "RemoveContainer" containerID="da7f406357e8022633b47b5a74898e00469dc26b91022d08fa272b770e0ab55e" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.706710 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22b2b\" (UniqueName: \"kubernetes.io/projected/57161278-b2d8-4abc-899e-1dd4776568bf-kube-api-access-22b2b\") on node \"crc\" DevicePath \"\"" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.752220 4755 scope.go:117] 
"RemoveContainer" containerID="5c7b78f738f3d1b7ddd958b46c863d3e651c9509128f46ed75428f08162947de" Mar 17 02:36:03 crc kubenswrapper[4755]: E0317 02:36:03.752806 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c7b78f738f3d1b7ddd958b46c863d3e651c9509128f46ed75428f08162947de\": container with ID starting with 5c7b78f738f3d1b7ddd958b46c863d3e651c9509128f46ed75428f08162947de not found: ID does not exist" containerID="5c7b78f738f3d1b7ddd958b46c863d3e651c9509128f46ed75428f08162947de" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.752851 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c7b78f738f3d1b7ddd958b46c863d3e651c9509128f46ed75428f08162947de"} err="failed to get container status \"5c7b78f738f3d1b7ddd958b46c863d3e651c9509128f46ed75428f08162947de\": rpc error: code = NotFound desc = could not find container \"5c7b78f738f3d1b7ddd958b46c863d3e651c9509128f46ed75428f08162947de\": container with ID starting with 5c7b78f738f3d1b7ddd958b46c863d3e651c9509128f46ed75428f08162947de not found: ID does not exist" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.752876 4755 scope.go:117] "RemoveContainer" containerID="cb521691eb366f940005fe15216174043ba6643568e433aca41079c03aa1a3df" Mar 17 02:36:03 crc kubenswrapper[4755]: E0317 02:36:03.753302 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb521691eb366f940005fe15216174043ba6643568e433aca41079c03aa1a3df\": container with ID starting with cb521691eb366f940005fe15216174043ba6643568e433aca41079c03aa1a3df not found: ID does not exist" containerID="cb521691eb366f940005fe15216174043ba6643568e433aca41079c03aa1a3df" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.753450 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cb521691eb366f940005fe15216174043ba6643568e433aca41079c03aa1a3df"} err="failed to get container status \"cb521691eb366f940005fe15216174043ba6643568e433aca41079c03aa1a3df\": rpc error: code = NotFound desc = could not find container \"cb521691eb366f940005fe15216174043ba6643568e433aca41079c03aa1a3df\": container with ID starting with cb521691eb366f940005fe15216174043ba6643568e433aca41079c03aa1a3df not found: ID does not exist" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.753573 4755 scope.go:117] "RemoveContainer" containerID="da7f406357e8022633b47b5a74898e00469dc26b91022d08fa272b770e0ab55e" Mar 17 02:36:03 crc kubenswrapper[4755]: E0317 02:36:03.753924 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da7f406357e8022633b47b5a74898e00469dc26b91022d08fa272b770e0ab55e\": container with ID starting with da7f406357e8022633b47b5a74898e00469dc26b91022d08fa272b770e0ab55e not found: ID does not exist" containerID="da7f406357e8022633b47b5a74898e00469dc26b91022d08fa272b770e0ab55e" Mar 17 02:36:03 crc kubenswrapper[4755]: I0317 02:36:03.753947 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da7f406357e8022633b47b5a74898e00469dc26b91022d08fa272b770e0ab55e"} err="failed to get container status \"da7f406357e8022633b47b5a74898e00469dc26b91022d08fa272b770e0ab55e\": rpc error: code = NotFound desc = could not find container \"da7f406357e8022633b47b5a74898e00469dc26b91022d08fa272b770e0ab55e\": container with ID starting with da7f406357e8022633b47b5a74898e00469dc26b91022d08fa272b770e0ab55e not found: ID does not exist" Mar 17 02:36:04 crc kubenswrapper[4755]: I0317 02:36:04.260722 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57161278-b2d8-4abc-899e-1dd4776568bf" path="/var/lib/kubelet/pods/57161278-b2d8-4abc-899e-1dd4776568bf/volumes" Mar 17 02:36:04 crc kubenswrapper[4755]: I0317 
02:36:04.261940 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be4830d7-9731-4b72-9840-c6b9f03bce49" path="/var/lib/kubelet/pods/be4830d7-9731-4b72-9840-c6b9f03bce49/volumes" Mar 17 02:36:04 crc kubenswrapper[4755]: I0317 02:36:04.523425 4755 scope.go:117] "RemoveContainer" containerID="bb0141b3a05ee0ffc105364df59886aa101324e9cc2d1cc9b26ce6b50315b1f5" Mar 17 02:36:04 crc kubenswrapper[4755]: I0317 02:36:04.523483 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4ksgn/crc-debug-cvzmj" Mar 17 02:36:04 crc kubenswrapper[4755]: I0317 02:36:04.525056 4755 generic.go:334] "Generic (PLEG): container finished" podID="f5109517-910d-4f76-899d-2749000ed633" containerID="39e6aaab3cfd985876d65e000bb307cf4d2cfc95e2fb0fd822cc699a2832d1d0" exitCode=0 Mar 17 02:36:04 crc kubenswrapper[4755]: I0317 02:36:04.525085 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561916-7qb2v" event={"ID":"f5109517-910d-4f76-899d-2749000ed633","Type":"ContainerDied","Data":"39e6aaab3cfd985876d65e000bb307cf4d2cfc95e2fb0fd822cc699a2832d1d0"} Mar 17 02:36:04 crc kubenswrapper[4755]: I0317 02:36:04.865868 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4ksgn/crc-debug-fx94f"] Mar 17 02:36:04 crc kubenswrapper[4755]: E0317 02:36:04.866299 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4830d7-9731-4b72-9840-c6b9f03bce49" containerName="extract-utilities" Mar 17 02:36:04 crc kubenswrapper[4755]: I0317 02:36:04.866316 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4830d7-9731-4b72-9840-c6b9f03bce49" containerName="extract-utilities" Mar 17 02:36:04 crc kubenswrapper[4755]: E0317 02:36:04.866341 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4830d7-9731-4b72-9840-c6b9f03bce49" containerName="extract-content" Mar 17 02:36:04 crc kubenswrapper[4755]: I0317 02:36:04.866347 4755 
state_mem.go:107] "Deleted CPUSet assignment" podUID="be4830d7-9731-4b72-9840-c6b9f03bce49" containerName="extract-content" Mar 17 02:36:04 crc kubenswrapper[4755]: E0317 02:36:04.866363 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4830d7-9731-4b72-9840-c6b9f03bce49" containerName="registry-server" Mar 17 02:36:04 crc kubenswrapper[4755]: I0317 02:36:04.866371 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4830d7-9731-4b72-9840-c6b9f03bce49" containerName="registry-server" Mar 17 02:36:04 crc kubenswrapper[4755]: E0317 02:36:04.866398 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57161278-b2d8-4abc-899e-1dd4776568bf" containerName="container-00" Mar 17 02:36:04 crc kubenswrapper[4755]: I0317 02:36:04.866404 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="57161278-b2d8-4abc-899e-1dd4776568bf" containerName="container-00" Mar 17 02:36:04 crc kubenswrapper[4755]: I0317 02:36:04.866624 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="57161278-b2d8-4abc-899e-1dd4776568bf" containerName="container-00" Mar 17 02:36:04 crc kubenswrapper[4755]: I0317 02:36:04.866646 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="be4830d7-9731-4b72-9840-c6b9f03bce49" containerName="registry-server" Mar 17 02:36:04 crc kubenswrapper[4755]: I0317 02:36:04.867350 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4ksgn/crc-debug-fx94f" Mar 17 02:36:04 crc kubenswrapper[4755]: I0317 02:36:04.870322 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4ksgn"/"default-dockercfg-4kdj8" Mar 17 02:36:04 crc kubenswrapper[4755]: I0317 02:36:04.936063 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9b3b1f0-b2ec-4893-8dd7-b93751c4d314-host\") pod \"crc-debug-fx94f\" (UID: \"a9b3b1f0-b2ec-4893-8dd7-b93751c4d314\") " pod="openshift-must-gather-4ksgn/crc-debug-fx94f" Mar 17 02:36:04 crc kubenswrapper[4755]: I0317 02:36:04.936253 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22fgn\" (UniqueName: \"kubernetes.io/projected/a9b3b1f0-b2ec-4893-8dd7-b93751c4d314-kube-api-access-22fgn\") pod \"crc-debug-fx94f\" (UID: \"a9b3b1f0-b2ec-4893-8dd7-b93751c4d314\") " pod="openshift-must-gather-4ksgn/crc-debug-fx94f" Mar 17 02:36:05 crc kubenswrapper[4755]: I0317 02:36:05.038447 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22fgn\" (UniqueName: \"kubernetes.io/projected/a9b3b1f0-b2ec-4893-8dd7-b93751c4d314-kube-api-access-22fgn\") pod \"crc-debug-fx94f\" (UID: \"a9b3b1f0-b2ec-4893-8dd7-b93751c4d314\") " pod="openshift-must-gather-4ksgn/crc-debug-fx94f" Mar 17 02:36:05 crc kubenswrapper[4755]: I0317 02:36:05.038576 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9b3b1f0-b2ec-4893-8dd7-b93751c4d314-host\") pod \"crc-debug-fx94f\" (UID: \"a9b3b1f0-b2ec-4893-8dd7-b93751c4d314\") " pod="openshift-must-gather-4ksgn/crc-debug-fx94f" Mar 17 02:36:05 crc kubenswrapper[4755]: I0317 02:36:05.038732 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/a9b3b1f0-b2ec-4893-8dd7-b93751c4d314-host\") pod \"crc-debug-fx94f\" (UID: \"a9b3b1f0-b2ec-4893-8dd7-b93751c4d314\") " pod="openshift-must-gather-4ksgn/crc-debug-fx94f" Mar 17 02:36:05 crc kubenswrapper[4755]: I0317 02:36:05.061433 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22fgn\" (UniqueName: \"kubernetes.io/projected/a9b3b1f0-b2ec-4893-8dd7-b93751c4d314-kube-api-access-22fgn\") pod \"crc-debug-fx94f\" (UID: \"a9b3b1f0-b2ec-4893-8dd7-b93751c4d314\") " pod="openshift-must-gather-4ksgn/crc-debug-fx94f" Mar 17 02:36:05 crc kubenswrapper[4755]: I0317 02:36:05.186429 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4ksgn/crc-debug-fx94f" Mar 17 02:36:05 crc kubenswrapper[4755]: W0317 02:36:05.212989 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9b3b1f0_b2ec_4893_8dd7_b93751c4d314.slice/crio-bdf4f7a99ba7b4f288f8b547f7229bf3cb87611fe9401c4990ef9361ffbd6d1d WatchSource:0}: Error finding container bdf4f7a99ba7b4f288f8b547f7229bf3cb87611fe9401c4990ef9361ffbd6d1d: Status 404 returned error can't find the container with id bdf4f7a99ba7b4f288f8b547f7229bf3cb87611fe9401c4990ef9361ffbd6d1d Mar 17 02:36:05 crc kubenswrapper[4755]: I0317 02:36:05.537684 4755 generic.go:334] "Generic (PLEG): container finished" podID="a9b3b1f0-b2ec-4893-8dd7-b93751c4d314" containerID="1462fe22e48972c1b36590c8bc09172d6289a20deb5bc28d1ecb47fe963b7280" exitCode=0 Mar 17 02:36:05 crc kubenswrapper[4755]: I0317 02:36:05.537758 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4ksgn/crc-debug-fx94f" event={"ID":"a9b3b1f0-b2ec-4893-8dd7-b93751c4d314","Type":"ContainerDied","Data":"1462fe22e48972c1b36590c8bc09172d6289a20deb5bc28d1ecb47fe963b7280"} Mar 17 02:36:05 crc kubenswrapper[4755]: I0317 02:36:05.538008 4755 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-must-gather-4ksgn/crc-debug-fx94f" event={"ID":"a9b3b1f0-b2ec-4893-8dd7-b93751c4d314","Type":"ContainerStarted","Data":"bdf4f7a99ba7b4f288f8b547f7229bf3cb87611fe9401c4990ef9361ffbd6d1d"} Mar 17 02:36:05 crc kubenswrapper[4755]: I0317 02:36:05.588109 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4ksgn/crc-debug-fx94f"] Mar 17 02:36:05 crc kubenswrapper[4755]: I0317 02:36:05.602174 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4ksgn/crc-debug-fx94f"] Mar 17 02:36:05 crc kubenswrapper[4755]: I0317 02:36:05.912742 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561916-7qb2v" Mar 17 02:36:05 crc kubenswrapper[4755]: I0317 02:36:05.959384 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52g4l\" (UniqueName: \"kubernetes.io/projected/f5109517-910d-4f76-899d-2749000ed633-kube-api-access-52g4l\") pod \"f5109517-910d-4f76-899d-2749000ed633\" (UID: \"f5109517-910d-4f76-899d-2749000ed633\") " Mar 17 02:36:05 crc kubenswrapper[4755]: I0317 02:36:05.967770 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5109517-910d-4f76-899d-2749000ed633-kube-api-access-52g4l" (OuterVolumeSpecName: "kube-api-access-52g4l") pod "f5109517-910d-4f76-899d-2749000ed633" (UID: "f5109517-910d-4f76-899d-2749000ed633"). InnerVolumeSpecName "kube-api-access-52g4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:36:06 crc kubenswrapper[4755]: I0317 02:36:06.065640 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52g4l\" (UniqueName: \"kubernetes.io/projected/f5109517-910d-4f76-899d-2749000ed633-kube-api-access-52g4l\") on node \"crc\" DevicePath \"\"" Mar 17 02:36:06 crc kubenswrapper[4755]: I0317 02:36:06.554649 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561916-7qb2v" Mar 17 02:36:06 crc kubenswrapper[4755]: I0317 02:36:06.555898 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561916-7qb2v" event={"ID":"f5109517-910d-4f76-899d-2749000ed633","Type":"ContainerDied","Data":"cea943c5c0b1f55e21e2fdd67f297f04345fc9a9b00260d054628dcd8998e4b2"} Mar 17 02:36:06 crc kubenswrapper[4755]: I0317 02:36:06.555989 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cea943c5c0b1f55e21e2fdd67f297f04345fc9a9b00260d054628dcd8998e4b2" Mar 17 02:36:06 crc kubenswrapper[4755]: I0317 02:36:06.580831 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561910-ngp4q"] Mar 17 02:36:06 crc kubenswrapper[4755]: I0317 02:36:06.603903 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561910-ngp4q"] Mar 17 02:36:06 crc kubenswrapper[4755]: I0317 02:36:06.687306 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4ksgn/crc-debug-fx94f" Mar 17 02:36:06 crc kubenswrapper[4755]: I0317 02:36:06.779410 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22fgn\" (UniqueName: \"kubernetes.io/projected/a9b3b1f0-b2ec-4893-8dd7-b93751c4d314-kube-api-access-22fgn\") pod \"a9b3b1f0-b2ec-4893-8dd7-b93751c4d314\" (UID: \"a9b3b1f0-b2ec-4893-8dd7-b93751c4d314\") " Mar 17 02:36:06 crc kubenswrapper[4755]: I0317 02:36:06.779504 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9b3b1f0-b2ec-4893-8dd7-b93751c4d314-host\") pod \"a9b3b1f0-b2ec-4893-8dd7-b93751c4d314\" (UID: \"a9b3b1f0-b2ec-4893-8dd7-b93751c4d314\") " Mar 17 02:36:06 crc kubenswrapper[4755]: I0317 02:36:06.780070 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9b3b1f0-b2ec-4893-8dd7-b93751c4d314-host" (OuterVolumeSpecName: "host") pod "a9b3b1f0-b2ec-4893-8dd7-b93751c4d314" (UID: "a9b3b1f0-b2ec-4893-8dd7-b93751c4d314"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 02:36:06 crc kubenswrapper[4755]: I0317 02:36:06.791039 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9b3b1f0-b2ec-4893-8dd7-b93751c4d314-kube-api-access-22fgn" (OuterVolumeSpecName: "kube-api-access-22fgn") pod "a9b3b1f0-b2ec-4893-8dd7-b93751c4d314" (UID: "a9b3b1f0-b2ec-4893-8dd7-b93751c4d314"). InnerVolumeSpecName "kube-api-access-22fgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:36:06 crc kubenswrapper[4755]: I0317 02:36:06.882134 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22fgn\" (UniqueName: \"kubernetes.io/projected/a9b3b1f0-b2ec-4893-8dd7-b93751c4d314-kube-api-access-22fgn\") on node \"crc\" DevicePath \"\"" Mar 17 02:36:06 crc kubenswrapper[4755]: I0317 02:36:06.882169 4755 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9b3b1f0-b2ec-4893-8dd7-b93751c4d314-host\") on node \"crc\" DevicePath \"\"" Mar 17 02:36:07 crc kubenswrapper[4755]: I0317 02:36:07.568333 4755 scope.go:117] "RemoveContainer" containerID="1462fe22e48972c1b36590c8bc09172d6289a20deb5bc28d1ecb47fe963b7280" Mar 17 02:36:07 crc kubenswrapper[4755]: I0317 02:36:07.568456 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4ksgn/crc-debug-fx94f" Mar 17 02:36:08 crc kubenswrapper[4755]: I0317 02:36:08.258794 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98384819-daff-4e77-9f4b-ad906cda49e9" path="/var/lib/kubelet/pods/98384819-daff-4e77-9f4b-ad906cda49e9/volumes" Mar 17 02:36:08 crc kubenswrapper[4755]: I0317 02:36:08.259869 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9b3b1f0-b2ec-4893-8dd7-b93751c4d314" path="/var/lib/kubelet/pods/a9b3b1f0-b2ec-4893-8dd7-b93751c4d314/volumes" Mar 17 02:36:22 crc kubenswrapper[4755]: I0317 02:36:22.306233 4755 scope.go:117] "RemoveContainer" containerID="74804751037ee643935787dddf6bdff22015ec09987b66ec1819d859a1eb860b" Mar 17 02:36:28 crc kubenswrapper[4755]: I0317 02:36:28.665196 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 
02:36:28 crc kubenswrapper[4755]: I0317 02:36:28.665907 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:36:58 crc kubenswrapper[4755]: I0317 02:36:58.665194 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:36:58 crc kubenswrapper[4755]: I0317 02:36:58.666181 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:37:13 crc kubenswrapper[4755]: I0317 02:37:13.226612 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6/aodh-api/0.log" Mar 17 02:37:13 crc kubenswrapper[4755]: I0317 02:37:13.384226 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6/aodh-evaluator/0.log" Mar 17 02:37:13 crc kubenswrapper[4755]: I0317 02:37:13.459857 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6/aodh-listener/0.log" Mar 17 02:37:13 crc kubenswrapper[4755]: I0317 02:37:13.480903 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_60a167f8-f30c-4c3b-882e-4ebb8ab7e5e6/aodh-notifier/0.log" Mar 17 02:37:13 crc kubenswrapper[4755]: I0317 02:37:13.567021 
4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-75cc5bb54-mqs8w_e975409f-81b6-4bcd-aec0-00f942eae3bd/barbican-api/0.log" Mar 17 02:37:13 crc kubenswrapper[4755]: I0317 02:37:13.700171 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-75cc5bb54-mqs8w_e975409f-81b6-4bcd-aec0-00f942eae3bd/barbican-api-log/0.log" Mar 17 02:37:13 crc kubenswrapper[4755]: I0317 02:37:13.767834 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-79df59b454-zg6ts_08712ad9-353f-4d69-aa69-87586a0b9ee3/barbican-keystone-listener/0.log" Mar 17 02:37:13 crc kubenswrapper[4755]: I0317 02:37:13.880670 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-79df59b454-zg6ts_08712ad9-353f-4d69-aa69-87586a0b9ee3/barbican-keystone-listener-log/0.log" Mar 17 02:37:13 crc kubenswrapper[4755]: I0317 02:37:13.906310 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b59c459f5-btmbt_32b26d97-7256-4841-819f-2a2ee7ff2e3b/barbican-worker/0.log" Mar 17 02:37:14 crc kubenswrapper[4755]: I0317 02:37:14.006258 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b59c459f5-btmbt_32b26d97-7256-4841-819f-2a2ee7ff2e3b/barbican-worker-log/0.log" Mar 17 02:37:14 crc kubenswrapper[4755]: I0317 02:37:14.116989 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-d2swp_56db739c-5c0b-445c-bb95-d16d76daea1b/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:37:14 crc kubenswrapper[4755]: I0317 02:37:14.278766 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2948ff39-a68c-4ef2-a7d7-8eb126261ff9/ceilometer-central-agent/0.log" Mar 17 02:37:14 crc kubenswrapper[4755]: I0317 02:37:14.325926 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_2948ff39-a68c-4ef2-a7d7-8eb126261ff9/ceilometer-notification-agent/0.log" Mar 17 02:37:14 crc kubenswrapper[4755]: I0317 02:37:14.386274 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2948ff39-a68c-4ef2-a7d7-8eb126261ff9/proxy-httpd/0.log" Mar 17 02:37:14 crc kubenswrapper[4755]: I0317 02:37:14.439358 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2948ff39-a68c-4ef2-a7d7-8eb126261ff9/sg-core/0.log" Mar 17 02:37:14 crc kubenswrapper[4755]: I0317 02:37:14.530808 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-zjh8b_c924cf0b-5d1b-4d21-8123-106c71d3b94b/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:37:14 crc kubenswrapper[4755]: I0317 02:37:14.644738 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jv7p9_cb099246-365d-4bd7-ad54-f765ffc586cd/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:37:14 crc kubenswrapper[4755]: I0317 02:37:14.823559 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dff3597a-93e6-4bb6-9508-c8f4609a75fc/cinder-api/0.log" Mar 17 02:37:14 crc kubenswrapper[4755]: I0317 02:37:14.838343 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dff3597a-93e6-4bb6-9508-c8f4609a75fc/cinder-api-log/0.log" Mar 17 02:37:15 crc kubenswrapper[4755]: I0317 02:37:15.101996 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_fd360dd3-b439-453e-8543-405c8d1804b5/probe/0.log" Mar 17 02:37:15 crc kubenswrapper[4755]: I0317 02:37:15.241947 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5201f678-3b17-4d85-b341-2f789377dbaa/cinder-scheduler/0.log" Mar 17 02:37:15 crc kubenswrapper[4755]: I0317 02:37:15.243309 4755 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_fd360dd3-b439-453e-8543-405c8d1804b5/cinder-backup/0.log" Mar 17 02:37:15 crc kubenswrapper[4755]: I0317 02:37:15.362695 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5201f678-3b17-4d85-b341-2f789377dbaa/probe/0.log" Mar 17 02:37:15 crc kubenswrapper[4755]: I0317 02:37:15.714984 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_4abc0e8b-235e-48c1-8066-8958aa05a2a3/probe/0.log" Mar 17 02:37:15 crc kubenswrapper[4755]: I0317 02:37:15.792052 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_4abc0e8b-235e-48c1-8066-8958aa05a2a3/cinder-volume/0.log" Mar 17 02:37:15 crc kubenswrapper[4755]: I0317 02:37:15.854324 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-m4m5z_1b83398a-b089-4a14-9432-5154d7cd107c/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:37:16 crc kubenswrapper[4755]: I0317 02:37:16.048884 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-zqbx2_351689ec-5f29-4144-ab28-25abac57ccac/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:37:16 crc kubenswrapper[4755]: I0317 02:37:16.201327 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74cfff99f-cnbmp_e8c45f18-80d3-466b-9abe-ebb64d80c285/init/0.log" Mar 17 02:37:16 crc kubenswrapper[4755]: I0317 02:37:16.466673 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74cfff99f-cnbmp_e8c45f18-80d3-466b-9abe-ebb64d80c285/init/0.log" Mar 17 02:37:16 crc kubenswrapper[4755]: I0317 02:37:16.498287 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-74cfff99f-cnbmp_e8c45f18-80d3-466b-9abe-ebb64d80c285/dnsmasq-dns/0.log" Mar 17 02:37:16 crc kubenswrapper[4755]: I0317 02:37:16.563057 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d3953ffe-b583-483a-b3e4-8cb6393b09f7/glance-httpd/0.log" Mar 17 02:37:16 crc kubenswrapper[4755]: I0317 02:37:16.692138 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d3953ffe-b583-483a-b3e4-8cb6393b09f7/glance-log/0.log" Mar 17 02:37:16 crc kubenswrapper[4755]: I0317 02:37:16.793816 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3526ee99-7b67-44b5-8cc1-0d8731e68758/glance-log/0.log" Mar 17 02:37:16 crc kubenswrapper[4755]: I0317 02:37:16.797555 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3526ee99-7b67-44b5-8cc1-0d8731e68758/glance-httpd/0.log" Mar 17 02:37:17 crc kubenswrapper[4755]: I0317 02:37:17.353389 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-6dccf8ffb7-fvtwz_70e53650-f3d6-4ec4-9b49-bf34ec724c01/heat-engine/0.log" Mar 17 02:37:17 crc kubenswrapper[4755]: I0317 02:37:17.595512 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54d5b659cb-h7mw4_18055ce1-2e32-41f8-8985-75bda9d75b01/horizon/0.log" Mar 17 02:37:17 crc kubenswrapper[4755]: I0317 02:37:17.832289 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-d4zbz_768f1228-6ea3-4601-a0e4-93911d1d4fa1/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:37:18 crc kubenswrapper[4755]: I0317 02:37:18.086028 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-bnr2r_654af424-4add-4b0f-97a6-896204b03483/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:37:18 crc kubenswrapper[4755]: I0317 02:37:18.201084 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-5b57d6bfb7-dfq4n_a03991a5-be95-4757-a3d0-4ce2fff4fdf5/heat-api/0.log" Mar 17 02:37:18 crc kubenswrapper[4755]: I0317 02:37:18.240667 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-74f557fb5-t8sp4_59c5d35c-0a70-4965-b0b7-704028793d5e/heat-cfnapi/0.log" Mar 17 02:37:18 crc kubenswrapper[4755]: I0317 02:37:18.332681 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54d5b659cb-h7mw4_18055ce1-2e32-41f8-8985-75bda9d75b01/horizon-log/0.log" Mar 17 02:37:18 crc kubenswrapper[4755]: I0317 02:37:18.395283 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29561821-f7mn2_fa97c87e-b133-48bb-af65-092be28ffca7/keystone-cron/0.log" Mar 17 02:37:18 crc kubenswrapper[4755]: I0317 02:37:18.540002 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29561881-fpzd5_0e541f43-cda6-4951-a0cd-f77cb49018fd/keystone-cron/0.log" Mar 17 02:37:18 crc kubenswrapper[4755]: I0317 02:37:18.734493 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_107b2153-2013-45e0-ad48-0f16e97d6d7e/kube-state-metrics/0.log" Mar 17 02:37:18 crc kubenswrapper[4755]: I0317 02:37:18.743878 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-57d7fc6d98-smgzp_b8c11156-3bb6-45fb-aea6-c00316f50ef4/keystone-api/0.log" Mar 17 02:37:18 crc kubenswrapper[4755]: I0317 02:37:18.768907 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rjx85_e1b1ccf6-0ac6-4724-a948-b2a858b7cdf8/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 
02:37:18 crc kubenswrapper[4755]: I0317 02:37:18.967761 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-c5btn_7dc38b61-4933-487d-a05c-8ade6cd59270/logging-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:37:19 crc kubenswrapper[4755]: I0317 02:37:19.020506 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_763b47f1-98b0-4ebc-970c-adfcac1aee29/manila-api-log/0.log" Mar 17 02:37:19 crc kubenswrapper[4755]: I0317 02:37:19.145423 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_763b47f1-98b0-4ebc-970c-adfcac1aee29/manila-api/0.log" Mar 17 02:37:19 crc kubenswrapper[4755]: I0317 02:37:19.255588 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_725e1c02-2eca-44c3-8147-8976b9742412/probe/0.log" Mar 17 02:37:19 crc kubenswrapper[4755]: I0317 02:37:19.384561 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_725e1c02-2eca-44c3-8147-8976b9742412/manila-scheduler/0.log" Mar 17 02:37:19 crc kubenswrapper[4755]: I0317 02:37:19.623863 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_788939c2-92b3-482c-8271-08204a569e10/probe/0.log" Mar 17 02:37:19 crc kubenswrapper[4755]: I0317 02:37:19.661412 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_788939c2-92b3-482c-8271-08204a569e10/manila-share/0.log" Mar 17 02:37:19 crc kubenswrapper[4755]: I0317 02:37:19.789013 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_6fb69f1a-4500-4441-a103-843887d04772/mysqld-exporter/0.log" Mar 17 02:37:20 crc kubenswrapper[4755]: I0317 02:37:20.217691 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdj28_ed749590-0c9f-4ed1-876f-d6e28f1e98d2/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:37:20 crc kubenswrapper[4755]: I0317 02:37:20.245551 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54c4999fb9-bx48f_096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3/neutron-httpd/0.log" Mar 17 02:37:20 crc kubenswrapper[4755]: I0317 02:37:20.294101 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54c4999fb9-bx48f_096dd9e7-1e02-4e3f-9bd3-f5842cb4bea3/neutron-api/0.log" Mar 17 02:37:20 crc kubenswrapper[4755]: I0317 02:37:20.941566 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_49f91949-f009-4181-94d6-c07e2c7cc7fc/nova-cell0-conductor-conductor/0.log" Mar 17 02:37:21 crc kubenswrapper[4755]: I0317 02:37:21.276856 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_e7b5da46-21dd-4a2a-9a35-8b1f72e78ac8/nova-cell1-conductor-conductor/0.log" Mar 17 02:37:21 crc kubenswrapper[4755]: I0317 02:37:21.355852 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c316d4cb-fdc3-45e6-b679-14a04b2b32c1/nova-api-log/0.log" Mar 17 02:37:21 crc kubenswrapper[4755]: I0317 02:37:21.637904 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kg58n_d0c84d8b-60dc-4e23-a4be-83b81c52f10f/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:37:21 crc kubenswrapper[4755]: I0317 02:37:21.688108 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_56f6f5b8-c52e-4fa6-be5b-12510ca9348d/nova-cell1-novncproxy-novncproxy/0.log" Mar 17 02:37:22 crc kubenswrapper[4755]: I0317 02:37:22.027137 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_6b2b187e-bb8e-4934-a004-532ea37d2cf2/nova-metadata-log/0.log" Mar 17 02:37:22 crc kubenswrapper[4755]: I0317 02:37:22.586382 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c316d4cb-fdc3-45e6-b679-14a04b2b32c1/nova-api-api/0.log" Mar 17 02:37:22 crc kubenswrapper[4755]: I0317 02:37:22.611750 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b40ded55-f4e9-48b7-b8a6-16cda16d1c09/nova-scheduler-scheduler/0.log" Mar 17 02:37:22 crc kubenswrapper[4755]: I0317 02:37:22.806242 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dfea0511-e194-48c8-8795-58d07ada5d4c/mysql-bootstrap/0.log" Mar 17 02:37:23 crc kubenswrapper[4755]: I0317 02:37:23.122909 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dfea0511-e194-48c8-8795-58d07ada5d4c/mysql-bootstrap/0.log" Mar 17 02:37:23 crc kubenswrapper[4755]: I0317 02:37:23.165612 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_dfea0511-e194-48c8-8795-58d07ada5d4c/galera/0.log" Mar 17 02:37:23 crc kubenswrapper[4755]: I0317 02:37:23.270153 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6b2b187e-bb8e-4934-a004-532ea37d2cf2/nova-metadata-metadata/0.log" Mar 17 02:37:23 crc kubenswrapper[4755]: I0317 02:37:23.390674 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e48be2ab-6e3e-4a75-b47e-e700bd4126f1/mysql-bootstrap/0.log" Mar 17 02:37:23 crc kubenswrapper[4755]: I0317 02:37:23.591366 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e48be2ab-6e3e-4a75-b47e-e700bd4126f1/mysql-bootstrap/0.log" Mar 17 02:37:23 crc kubenswrapper[4755]: I0317 02:37:23.608235 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_e48be2ab-6e3e-4a75-b47e-e700bd4126f1/galera/0.log" Mar 17 02:37:23 crc kubenswrapper[4755]: I0317 02:37:23.662283 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_007a5062-42e0-47ac-9523-a4d486614f70/openstackclient/0.log" Mar 17 02:37:23 crc kubenswrapper[4755]: I0317 02:37:23.857635 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-dvvpc_a6eae7bd-5007-4389-b4ab-7f296d0fa9ce/ovn-controller/0.log" Mar 17 02:37:23 crc kubenswrapper[4755]: I0317 02:37:23.907636 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-lccfn_71d7e3dc-df60-416b-add1-b7f55fd74d2d/openstack-network-exporter/0.log" Mar 17 02:37:24 crc kubenswrapper[4755]: I0317 02:37:24.176455 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bdvbb_6baf9f03-ea25-4498-9999-2ae741ba0b3a/ovsdb-server-init/0.log" Mar 17 02:37:24 crc kubenswrapper[4755]: I0317 02:37:24.436531 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bdvbb_6baf9f03-ea25-4498-9999-2ae741ba0b3a/ovsdb-server/0.log" Mar 17 02:37:24 crc kubenswrapper[4755]: I0317 02:37:24.473568 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bdvbb_6baf9f03-ea25-4498-9999-2ae741ba0b3a/ovsdb-server-init/0.log" Mar 17 02:37:24 crc kubenswrapper[4755]: I0317 02:37:24.588168 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bdvbb_6baf9f03-ea25-4498-9999-2ae741ba0b3a/ovs-vswitchd/0.log" Mar 17 02:37:24 crc kubenswrapper[4755]: I0317 02:37:24.634242 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fx2nt_58680610-638b-4561-90d2-c13f1074a35b/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:37:24 crc kubenswrapper[4755]: I0317 02:37:24.737459 4755 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b9c6a29f-013e-40dc-958a-05f36cb4e626/openstack-network-exporter/0.log" Mar 17 02:37:24 crc kubenswrapper[4755]: I0317 02:37:24.824849 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b9c6a29f-013e-40dc-958a-05f36cb4e626/ovn-northd/0.log" Mar 17 02:37:24 crc kubenswrapper[4755]: I0317 02:37:24.959514 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_54c4fe64-c4f8-4e77-9029-946580816bf7/openstack-network-exporter/0.log" Mar 17 02:37:24 crc kubenswrapper[4755]: I0317 02:37:24.987963 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_54c4fe64-c4f8-4e77-9029-946580816bf7/ovsdbserver-nb/0.log" Mar 17 02:37:25 crc kubenswrapper[4755]: I0317 02:37:25.174892 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d6e8532d-a845-4882-a690-09c072e39311/openstack-network-exporter/0.log" Mar 17 02:37:25 crc kubenswrapper[4755]: I0317 02:37:25.211763 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d6e8532d-a845-4882-a690-09c072e39311/ovsdbserver-sb/0.log" Mar 17 02:37:25 crc kubenswrapper[4755]: I0317 02:37:25.571173 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5f66764f8d-z7959_abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a/placement-api/0.log" Mar 17 02:37:25 crc kubenswrapper[4755]: I0317 02:37:25.576384 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_c1ebdcce-406b-4668-a325-f1f4318b2d69/init-config-reloader/0.log" Mar 17 02:37:25 crc kubenswrapper[4755]: I0317 02:37:25.668708 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5f66764f8d-z7959_abf5592f-34fd-41f6-bc9f-b4bdb8ceff4a/placement-log/0.log" Mar 17 02:37:25 crc kubenswrapper[4755]: I0317 02:37:25.714857 4755 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_prometheus-metric-storage-0_c1ebdcce-406b-4668-a325-f1f4318b2d69/init-config-reloader/0.log" Mar 17 02:37:25 crc kubenswrapper[4755]: I0317 02:37:25.803231 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_c1ebdcce-406b-4668-a325-f1f4318b2d69/config-reloader/0.log" Mar 17 02:37:25 crc kubenswrapper[4755]: I0317 02:37:25.851659 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_c1ebdcce-406b-4668-a325-f1f4318b2d69/prometheus/0.log" Mar 17 02:37:25 crc kubenswrapper[4755]: I0317 02:37:25.928892 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_c1ebdcce-406b-4668-a325-f1f4318b2d69/thanos-sidecar/0.log" Mar 17 02:37:26 crc kubenswrapper[4755]: I0317 02:37:26.080821 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c98112b6-4653-4c2e-a16e-6ddbd29fe526/setup-container/0.log" Mar 17 02:37:26 crc kubenswrapper[4755]: I0317 02:37:26.520997 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c98112b6-4653-4c2e-a16e-6ddbd29fe526/rabbitmq/0.log" Mar 17 02:37:26 crc kubenswrapper[4755]: I0317 02:37:26.585354 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c98112b6-4653-4c2e-a16e-6ddbd29fe526/setup-container/0.log" Mar 17 02:37:26 crc kubenswrapper[4755]: I0317 02:37:26.617510 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_232eeb12-0802-4513-83e2-66cc0b1b398b/setup-container/0.log" Mar 17 02:37:26 crc kubenswrapper[4755]: I0317 02:37:26.883542 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_232eeb12-0802-4513-83e2-66cc0b1b398b/setup-container/0.log" Mar 17 02:37:26 crc kubenswrapper[4755]: I0317 02:37:26.948982 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-q2h5w_522fd7b5-ad67-4bb9-815e-239ab63e78c9/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:37:26 crc kubenswrapper[4755]: I0317 02:37:26.984512 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_232eeb12-0802-4513-83e2-66cc0b1b398b/rabbitmq/0.log" Mar 17 02:37:27 crc kubenswrapper[4755]: I0317 02:37:27.303416 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-dkvkg_d3df52cf-6c5b-4e10-b055-d00d52e09156/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:37:27 crc kubenswrapper[4755]: I0317 02:37:27.304240 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-r95fz_6f2a3043-b45c-43ea-a6fa-de300dee0390/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:37:27 crc kubenswrapper[4755]: I0317 02:37:27.489053 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-dbdqx_96c8e866-f764-4b94-b980-7b007ba5411c/ssh-known-hosts-edpm-deployment/0.log" Mar 17 02:37:27 crc kubenswrapper[4755]: I0317 02:37:27.706904 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5b9b5bb667-6pk7q_cfa93106-8e0c-4e7d-93cf-33d06c85d883/proxy-server/0.log" Mar 17 02:37:27 crc kubenswrapper[4755]: I0317 02:37:27.849381 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mh59s_9b280073-a793-4c35-a29b-d56ccf6037a7/swift-ring-rebalance/0.log" Mar 17 02:37:27 crc kubenswrapper[4755]: I0317 02:37:27.868352 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5b9b5bb667-6pk7q_cfa93106-8e0c-4e7d-93cf-33d06c85d883/proxy-httpd/0.log" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.019823 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/account-auditor/0.log" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.104350 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/account-reaper/0.log" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.134085 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/account-replicator/0.log" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.253012 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/container-auditor/0.log" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.261333 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/account-server/0.log" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.346124 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/container-replicator/0.log" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.365121 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/container-server/0.log" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.478243 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/container-updater/0.log" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.571186 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/object-auditor/0.log" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.577403 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/object-expirer/0.log" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.647701 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/object-replicator/0.log" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.665602 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.665656 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.665698 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.666564 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d"} pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.666614 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" 
containerID="cri-o://de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" gracePeriod=600 Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.690815 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/object-server/0.log" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.760292 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/rsync/0.log" Mar 17 02:37:28 crc kubenswrapper[4755]: E0317 02:37:28.788268 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.789979 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/object-updater/0.log" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.800685 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" exitCode=0 Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.800723 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d"} Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.800753 4755 scope.go:117] "RemoveContainer" containerID="d7d9c4401707c4eaff6887df0c70850a0f00f0fc35a9833a320255ba98464548" Mar 17 
02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.801376 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:37:28 crc kubenswrapper[4755]: E0317 02:37:28.801647 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:37:28 crc kubenswrapper[4755]: I0317 02:37:28.935528 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_80ee6df5-abef-4094-aabc-45b15e1ebfcf/swift-recon-cron/0.log" Mar 17 02:37:29 crc kubenswrapper[4755]: I0317 02:37:29.101428 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-gdz8h_dc4c9ea3-e8b5-4aec-9bd9-d07e7b377109/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:37:29 crc kubenswrapper[4755]: I0317 02:37:29.220265 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-xmwrf_03684d66-3e86-4168-9a3a-62e40ba5ddce/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:37:29 crc kubenswrapper[4755]: I0317 02:37:29.446893 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_b612dcdd-df72-4c24-827f-44d916531556/test-operator-logs-container/0.log" Mar 17 02:37:29 crc kubenswrapper[4755]: I0317 02:37:29.636307 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-dtk7p_d9b4d7d9-daed-448e-b3a8-4f528207e319/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 02:37:30 crc kubenswrapper[4755]: I0317 02:37:30.261361 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d2dadd71-3cf5-4c5b-a27d-d3f7523ca37d/tempest-tests-tempest-tests-runner/0.log" Mar 17 02:37:40 crc kubenswrapper[4755]: I0317 02:37:40.251091 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:37:40 crc kubenswrapper[4755]: E0317 02:37:40.251915 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:37:42 crc kubenswrapper[4755]: I0317 02:37:42.565654 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_cf824bbf-6a94-4505-a9cb-67e9e394f2e1/memcached/0.log" Mar 17 02:37:54 crc kubenswrapper[4755]: I0317 02:37:54.248807 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:37:54 crc kubenswrapper[4755]: E0317 02:37:54.249754 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:38:00 crc 
kubenswrapper[4755]: I0317 02:38:00.159952 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561918-c5tpt"] Mar 17 02:38:00 crc kubenswrapper[4755]: E0317 02:38:00.161089 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5109517-910d-4f76-899d-2749000ed633" containerName="oc" Mar 17 02:38:00 crc kubenswrapper[4755]: I0317 02:38:00.161106 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5109517-910d-4f76-899d-2749000ed633" containerName="oc" Mar 17 02:38:00 crc kubenswrapper[4755]: E0317 02:38:00.161150 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b3b1f0-b2ec-4893-8dd7-b93751c4d314" containerName="container-00" Mar 17 02:38:00 crc kubenswrapper[4755]: I0317 02:38:00.161159 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b3b1f0-b2ec-4893-8dd7-b93751c4d314" containerName="container-00" Mar 17 02:38:00 crc kubenswrapper[4755]: I0317 02:38:00.161417 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5109517-910d-4f76-899d-2749000ed633" containerName="oc" Mar 17 02:38:00 crc kubenswrapper[4755]: I0317 02:38:00.161471 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9b3b1f0-b2ec-4893-8dd7-b93751c4d314" containerName="container-00" Mar 17 02:38:00 crc kubenswrapper[4755]: I0317 02:38:00.162468 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561918-c5tpt" Mar 17 02:38:00 crc kubenswrapper[4755]: I0317 02:38:00.166601 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:38:00 crc kubenswrapper[4755]: I0317 02:38:00.168061 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:38:00 crc kubenswrapper[4755]: I0317 02:38:00.168631 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:38:00 crc kubenswrapper[4755]: I0317 02:38:00.207485 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561918-c5tpt"] Mar 17 02:38:00 crc kubenswrapper[4755]: I0317 02:38:00.236510 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmw9f\" (UniqueName: \"kubernetes.io/projected/ba3a2142-4abd-4077-906f-eaf5b1707618-kube-api-access-bmw9f\") pod \"auto-csr-approver-29561918-c5tpt\" (UID: \"ba3a2142-4abd-4077-906f-eaf5b1707618\") " pod="openshift-infra/auto-csr-approver-29561918-c5tpt" Mar 17 02:38:00 crc kubenswrapper[4755]: I0317 02:38:00.339221 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmw9f\" (UniqueName: \"kubernetes.io/projected/ba3a2142-4abd-4077-906f-eaf5b1707618-kube-api-access-bmw9f\") pod \"auto-csr-approver-29561918-c5tpt\" (UID: \"ba3a2142-4abd-4077-906f-eaf5b1707618\") " pod="openshift-infra/auto-csr-approver-29561918-c5tpt" Mar 17 02:38:00 crc kubenswrapper[4755]: I0317 02:38:00.363056 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmw9f\" (UniqueName: \"kubernetes.io/projected/ba3a2142-4abd-4077-906f-eaf5b1707618-kube-api-access-bmw9f\") pod \"auto-csr-approver-29561918-c5tpt\" (UID: \"ba3a2142-4abd-4077-906f-eaf5b1707618\") " 
pod="openshift-infra/auto-csr-approver-29561918-c5tpt" Mar 17 02:38:00 crc kubenswrapper[4755]: I0317 02:38:00.513708 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561918-c5tpt" Mar 17 02:38:01 crc kubenswrapper[4755]: I0317 02:38:01.763251 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561918-c5tpt"] Mar 17 02:38:01 crc kubenswrapper[4755]: I0317 02:38:01.775864 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 02:38:02 crc kubenswrapper[4755]: I0317 02:38:02.181241 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561918-c5tpt" event={"ID":"ba3a2142-4abd-4077-906f-eaf5b1707618","Type":"ContainerStarted","Data":"2a8d5587b30f06b56d21820233579d5321de457f9c2bd22024cab3c52aa2ac6a"} Mar 17 02:38:02 crc kubenswrapper[4755]: I0317 02:38:02.609554 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q_27588406-a74c-454c-84be-38da41fe4737/util/0.log" Mar 17 02:38:03 crc kubenswrapper[4755]: I0317 02:38:03.104865 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q_27588406-a74c-454c-84be-38da41fe4737/pull/0.log" Mar 17 02:38:03 crc kubenswrapper[4755]: I0317 02:38:03.128195 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q_27588406-a74c-454c-84be-38da41fe4737/util/0.log" Mar 17 02:38:03 crc kubenswrapper[4755]: I0317 02:38:03.164466 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q_27588406-a74c-454c-84be-38da41fe4737/pull/0.log" Mar 17 02:38:03 crc kubenswrapper[4755]: I0317 
02:38:03.189880 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561918-c5tpt" event={"ID":"ba3a2142-4abd-4077-906f-eaf5b1707618","Type":"ContainerStarted","Data":"f8bd95add40326f4123ff28276618a1bcb5868b9303aee04c19d8a373ca006e4"} Mar 17 02:38:03 crc kubenswrapper[4755]: I0317 02:38:03.207995 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561918-c5tpt" podStartSLOduration=2.236192413 podStartE2EDuration="3.207978438s" podCreationTimestamp="2026-03-17 02:38:00 +0000 UTC" firstStartedPulling="2026-03-17 02:38:01.774454846 +0000 UTC m=+8156.533907129" lastFinishedPulling="2026-03-17 02:38:02.746240871 +0000 UTC m=+8157.505693154" observedRunningTime="2026-03-17 02:38:03.20249274 +0000 UTC m=+8157.961945033" watchObservedRunningTime="2026-03-17 02:38:03.207978438 +0000 UTC m=+8157.967430721" Mar 17 02:38:03 crc kubenswrapper[4755]: I0317 02:38:03.284871 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q_27588406-a74c-454c-84be-38da41fe4737/util/0.log" Mar 17 02:38:03 crc kubenswrapper[4755]: I0317 02:38:03.339739 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q_27588406-a74c-454c-84be-38da41fe4737/pull/0.log" Mar 17 02:38:03 crc kubenswrapper[4755]: I0317 02:38:03.359810 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8139fc1d8a8a4e0dc9f03677e7fe03e6c617ad01ee8db207c77477336e4dx9q_27588406-a74c-454c-84be-38da41fe4737/extract/0.log" Mar 17 02:38:03 crc kubenswrapper[4755]: I0317 02:38:03.603475 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-r84f9_3adfd998-aade-4343-8952-50b0eba8b510/manager/0.log" Mar 17 02:38:03 crc kubenswrapper[4755]: I0317 
02:38:03.808554 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-fmtx9_00c5d701-8e74-44a0-9880-257001cb0062/manager/0.log" Mar 17 02:38:03 crc kubenswrapper[4755]: I0317 02:38:03.998567 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-x7lgv_e0809421-a91c-42c6-af2f-c8dc2ae7e856/manager/0.log" Mar 17 02:38:04 crc kubenswrapper[4755]: I0317 02:38:04.167485 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-ztp6j_13a9a76d-7b33-40eb-a7ec-5e5ff3c27705/manager/0.log" Mar 17 02:38:04 crc kubenswrapper[4755]: I0317 02:38:04.199353 4755 generic.go:334] "Generic (PLEG): container finished" podID="ba3a2142-4abd-4077-906f-eaf5b1707618" containerID="f8bd95add40326f4123ff28276618a1bcb5868b9303aee04c19d8a373ca006e4" exitCode=0 Mar 17 02:38:04 crc kubenswrapper[4755]: I0317 02:38:04.199388 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561918-c5tpt" event={"ID":"ba3a2142-4abd-4077-906f-eaf5b1707618","Type":"ContainerDied","Data":"f8bd95add40326f4123ff28276618a1bcb5868b9303aee04c19d8a373ca006e4"} Mar 17 02:38:04 crc kubenswrapper[4755]: I0317 02:38:04.328651 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-8sd9w_a67ee100-af6d-492d-9a50-40fa8c59256b/manager/0.log" Mar 17 02:38:04 crc kubenswrapper[4755]: I0317 02:38:04.681361 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-zg9n4_9984519e-49f3-4af4-9c3b-d11af473a940/manager/0.log" Mar 17 02:38:04 crc kubenswrapper[4755]: I0317 02:38:04.908135 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-hx685_fea3f6b7-c840-4795-8ca2-9dba15a49df1/manager/0.log" Mar 17 02:38:05 crc kubenswrapper[4755]: I0317 02:38:05.036549 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-wvrth_4fb668bf-a188-428e-b9cc-0f3ff55070fd/manager/0.log" Mar 17 02:38:05 crc kubenswrapper[4755]: I0317 02:38:05.248768 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:38:05 crc kubenswrapper[4755]: E0317 02:38:05.249367 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:38:05 crc kubenswrapper[4755]: I0317 02:38:05.264978 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-kkr5x_ec169260-a79f-4a21-b78f-41fba2f8956e/manager/0.log" Mar 17 02:38:05 crc kubenswrapper[4755]: I0317 02:38:05.436068 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-bcm7p_a9352105-fdd9-4cf9-b073-89a6eda036ab/manager/0.log" Mar 17 02:38:05 crc kubenswrapper[4755]: I0317 02:38:05.610808 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-csd45_a95d67b8-819c-481e-9e68-87276454b88a/manager/0.log" Mar 17 02:38:05 crc kubenswrapper[4755]: I0317 02:38:05.662694 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561918-c5tpt" Mar 17 02:38:05 crc kubenswrapper[4755]: I0317 02:38:05.750656 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmw9f\" (UniqueName: \"kubernetes.io/projected/ba3a2142-4abd-4077-906f-eaf5b1707618-kube-api-access-bmw9f\") pod \"ba3a2142-4abd-4077-906f-eaf5b1707618\" (UID: \"ba3a2142-4abd-4077-906f-eaf5b1707618\") " Mar 17 02:38:05 crc kubenswrapper[4755]: I0317 02:38:05.761069 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3a2142-4abd-4077-906f-eaf5b1707618-kube-api-access-bmw9f" (OuterVolumeSpecName: "kube-api-access-bmw9f") pod "ba3a2142-4abd-4077-906f-eaf5b1707618" (UID: "ba3a2142-4abd-4077-906f-eaf5b1707618"). InnerVolumeSpecName "kube-api-access-bmw9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:38:05 crc kubenswrapper[4755]: I0317 02:38:05.853115 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmw9f\" (UniqueName: \"kubernetes.io/projected/ba3a2142-4abd-4077-906f-eaf5b1707618-kube-api-access-bmw9f\") on node \"crc\" DevicePath \"\"" Mar 17 02:38:05 crc kubenswrapper[4755]: I0317 02:38:05.936390 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-8s5l4_3e3f09b9-2108-4341-9a51-6efee784ca0e/manager/0.log" Mar 17 02:38:05 crc kubenswrapper[4755]: I0317 02:38:05.970676 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-8gwqk_0f81862c-c403-445b-8030-083e914d31a7/manager/0.log" Mar 17 02:38:05 crc kubenswrapper[4755]: I0317 02:38:05.996345 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-c9nzt_b5e695a0-9a52-46f0-8aae-3a4353bb3345/manager/0.log" Mar 17 02:38:06 crc kubenswrapper[4755]: I0317 
02:38:06.149072 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-kvfkt_c2f21978-13ea-4441-ba13-2be2beec2f0a/manager/0.log" Mar 17 02:38:06 crc kubenswrapper[4755]: I0317 02:38:06.245357 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561918-c5tpt" event={"ID":"ba3a2142-4abd-4077-906f-eaf5b1707618","Type":"ContainerDied","Data":"2a8d5587b30f06b56d21820233579d5321de457f9c2bd22024cab3c52aa2ac6a"} Mar 17 02:38:06 crc kubenswrapper[4755]: I0317 02:38:06.245392 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a8d5587b30f06b56d21820233579d5321de457f9c2bd22024cab3c52aa2ac6a" Mar 17 02:38:06 crc kubenswrapper[4755]: I0317 02:38:06.245451 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561918-c5tpt" Mar 17 02:38:06 crc kubenswrapper[4755]: I0317 02:38:06.278665 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561912-48pm5"] Mar 17 02:38:06 crc kubenswrapper[4755]: I0317 02:38:06.293826 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561912-48pm5"] Mar 17 02:38:06 crc kubenswrapper[4755]: I0317 02:38:06.296086 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-66cdd7cf4d-snfvp_ca5fc922-63bc-4052-844e-96e4a60e7ed4/operator/0.log" Mar 17 02:38:06 crc kubenswrapper[4755]: I0317 02:38:06.529006 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-h4b7d_d22580ef-38c7-4b1a-95a3-c6a7507ba05a/registry-server/0.log" Mar 17 02:38:06 crc kubenswrapper[4755]: I0317 02:38:06.718145 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-bttcs_cea62bda-461f-4bb3-870b-51b767dd2585/manager/0.log" Mar 17 02:38:07 crc kubenswrapper[4755]: I0317 02:38:07.028797 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-6pqsg_35b061e4-ec9c-46e3-828c-d787922370f9/manager/0.log" Mar 17 02:38:07 crc kubenswrapper[4755]: I0317 02:38:07.293889 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-blgfr_2d644d3f-351b-49ae-b1d6-c5ee0482ca29/operator/0.log" Mar 17 02:38:07 crc kubenswrapper[4755]: I0317 02:38:07.306982 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-4jtxc_eb7e1883-c95a-4d25-894d-be127f5d4cf3/manager/0.log" Mar 17 02:38:07 crc kubenswrapper[4755]: I0317 02:38:07.742872 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6d474745d9-7q6lg_eeea77af-84df-4778-8fe7-ddde0c1cda76/manager/0.log" Mar 17 02:38:07 crc kubenswrapper[4755]: I0317 02:38:07.971120 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-zlwf9_5d879770-dc2d-4c14-a5e1-c80879235d96/manager/0.log" Mar 17 02:38:08 crc kubenswrapper[4755]: I0317 02:38:08.026904 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-549b96fcbd-bklr6_f8d91ffa-2ac3-4935-95bf-45f6ac41e030/manager/0.log" Mar 17 02:38:08 crc kubenswrapper[4755]: I0317 02:38:08.126182 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-54tw8_599a37c9-2c1a-4b46-8cfd-1e8c5ea709a4/manager/0.log" Mar 17 02:38:08 crc kubenswrapper[4755]: I0317 02:38:08.260812 4755 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="b04f4bda-ca2e-46f8-bdc5-249520e25aca" path="/var/lib/kubelet/pods/b04f4bda-ca2e-46f8-bdc5-249520e25aca/volumes" Mar 17 02:38:18 crc kubenswrapper[4755]: I0317 02:38:18.248155 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:38:18 crc kubenswrapper[4755]: E0317 02:38:18.248919 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:38:22 crc kubenswrapper[4755]: I0317 02:38:22.804450 4755 scope.go:117] "RemoveContainer" containerID="5a0ea8ce9306207b75d73f362a6cdd29242a2414219249cf20f776f201abadb5" Mar 17 02:38:30 crc kubenswrapper[4755]: I0317 02:38:30.354069 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dl89g_08e826b2-3275-49e6-b833-5494037aac5b/control-plane-machine-set-operator/0.log" Mar 17 02:38:30 crc kubenswrapper[4755]: I0317 02:38:30.512411 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8rqm6_9e12abed-0865-4f85-b563-ff72e5a05722/kube-rbac-proxy/0.log" Mar 17 02:38:30 crc kubenswrapper[4755]: I0317 02:38:30.575952 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8rqm6_9e12abed-0865-4f85-b563-ff72e5a05722/machine-api-operator/0.log" Mar 17 02:38:33 crc kubenswrapper[4755]: I0317 02:38:33.248451 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:38:33 crc 
kubenswrapper[4755]: E0317 02:38:33.249225 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:38:46 crc kubenswrapper[4755]: I0317 02:38:46.771029 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-7fzpr_1d430575-aa06-4c37-8262-d01a1d1766b7/cert-manager-controller/0.log" Mar 17 02:38:47 crc kubenswrapper[4755]: I0317 02:38:47.032817 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-676tx_2d8af759-0406-4291-b488-291e4db0f5ff/cert-manager-cainjector/0.log" Mar 17 02:38:47 crc kubenswrapper[4755]: I0317 02:38:47.098206 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-vtlkz_6725f6d6-96db-4e53-b4b7-b14e32c3160d/cert-manager-webhook/0.log" Mar 17 02:38:48 crc kubenswrapper[4755]: I0317 02:38:48.252966 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:38:48 crc kubenswrapper[4755]: E0317 02:38:48.253554 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:39:00 crc kubenswrapper[4755]: I0317 02:39:00.248927 4755 scope.go:117] "RemoveContainer" 
containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:39:00 crc kubenswrapper[4755]: E0317 02:39:00.249947 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:39:02 crc kubenswrapper[4755]: I0317 02:39:02.969750 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-6f24p_33b5469d-e555-44f7-8f84-dcea89debdae/nmstate-console-plugin/0.log" Mar 17 02:39:03 crc kubenswrapper[4755]: I0317 02:39:03.124251 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-wvzsg_74f876f9-73dd-42eb-bc3c-8aa4e6dc854c/nmstate-handler/0.log" Mar 17 02:39:03 crc kubenswrapper[4755]: I0317 02:39:03.195392 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-5ml2z_1632175c-4118-4c14-b3ef-59472c846d04/kube-rbac-proxy/0.log" Mar 17 02:39:03 crc kubenswrapper[4755]: I0317 02:39:03.257329 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-5ml2z_1632175c-4118-4c14-b3ef-59472c846d04/nmstate-metrics/0.log" Mar 17 02:39:03 crc kubenswrapper[4755]: I0317 02:39:03.391597 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-qkhgq_8905f140-6bfd-4be5-89dd-3db46bdcc933/nmstate-operator/0.log" Mar 17 02:39:03 crc kubenswrapper[4755]: I0317 02:39:03.499223 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-q892t_ae6a47ef-9bd5-4c94-8ae3-966b11c8506b/nmstate-webhook/0.log" Mar 17 02:39:13 crc kubenswrapper[4755]: I0317 02:39:13.249260 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:39:13 crc kubenswrapper[4755]: E0317 02:39:13.250130 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:39:18 crc kubenswrapper[4755]: I0317 02:39:18.614790 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7c87b9bff5-zjj4w_9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a/kube-rbac-proxy/0.log" Mar 17 02:39:18 crc kubenswrapper[4755]: I0317 02:39:18.679054 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7c87b9bff5-zjj4w_9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a/manager/0.log" Mar 17 02:39:24 crc kubenswrapper[4755]: I0317 02:39:24.248397 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:39:24 crc kubenswrapper[4755]: E0317 02:39:24.249225 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 
17 02:39:33 crc kubenswrapper[4755]: I0317 02:39:33.162270 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-bvjlr_26acf5e2-72ee-4d4c-b25b-9d641f0a42df/prometheus-operator/0.log" Mar 17 02:39:33 crc kubenswrapper[4755]: I0317 02:39:33.296092 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-645f745c74-5xmqs_9d5530d5-e196-42d5-b0b9-c089b13d97a8/prometheus-operator-admission-webhook/0.log" Mar 17 02:39:33 crc kubenswrapper[4755]: I0317 02:39:33.375222 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-645f745c74-l8tqk_a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1/prometheus-operator-admission-webhook/0.log" Mar 17 02:39:33 crc kubenswrapper[4755]: I0317 02:39:33.495405 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-4w5t6_24b6289e-88b6-4958-9ce1-539cecddbd1f/operator/0.log" Mar 17 02:39:33 crc kubenswrapper[4755]: I0317 02:39:33.580679 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-2vdc8_96eaea54-65d7-475c-8d91-45ba95bd547a/observability-ui-dashboards/0.log" Mar 17 02:39:33 crc kubenswrapper[4755]: I0317 02:39:33.703764 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-czwsf_70c01555-4d7f-426f-a9a5-fd21462252dc/perses-operator/0.log" Mar 17 02:39:38 crc kubenswrapper[4755]: I0317 02:39:38.248215 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:39:38 crc kubenswrapper[4755]: E0317 02:39:38.249197 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:39:49 crc kubenswrapper[4755]: I0317 02:39:49.793897 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-66689c4bbf-vb7pt_c40a3b37-e723-4031-be06-728785655b37/cluster-logging-operator/0.log" Mar 17 02:39:50 crc kubenswrapper[4755]: I0317 02:39:50.053153 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-tqf58_50d6f059-2e1c-4ac4-9952-dcbab62b23db/collector/0.log" Mar 17 02:39:50 crc kubenswrapper[4755]: I0317 02:39:50.105382 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_66514db5-2205-445e-b424-b55fb9910be3/loki-compactor/0.log" Mar 17 02:39:50 crc kubenswrapper[4755]: I0317 02:39:50.206815 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-9c6b6d984-crzs4_c413d841-c2b9-4757-bbe4-ebd965553d29/loki-distributor/0.log" Mar 17 02:39:50 crc kubenswrapper[4755]: I0317 02:39:50.308700 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-cdf4b6b4d-f2g7b_ad7de1cc-717a-4e3e-81f7-43c677c2db13/gateway/0.log" Mar 17 02:39:50 crc kubenswrapper[4755]: I0317 02:39:50.364032 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-cdf4b6b4d-f2g7b_ad7de1cc-717a-4e3e-81f7-43c677c2db13/opa/0.log" Mar 17 02:39:50 crc kubenswrapper[4755]: I0317 02:39:50.466275 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-cdf4b6b4d-jfm48_c1fe9206-5f28-4707-b175-12ba0fadb400/gateway/0.log" Mar 17 02:39:50 crc kubenswrapper[4755]: I0317 02:39:50.542566 4755 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-logging_logging-loki-gateway-cdf4b6b4d-jfm48_c1fe9206-5f28-4707-b175-12ba0fadb400/opa/0.log" Mar 17 02:39:50 crc kubenswrapper[4755]: I0317 02:39:50.557558 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_93847e12-81c9-4ae4-8090-e7df4bd5f9a7/loki-index-gateway/0.log" Mar 17 02:39:50 crc kubenswrapper[4755]: I0317 02:39:50.780147 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_c516523b-4c3b-4083-a8f5-18c9061c7032/loki-ingester/0.log" Mar 17 02:39:50 crc kubenswrapper[4755]: I0317 02:39:50.800044 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-6dcbdf8bb8-bh66s_85ac7711-fd0b-4598-93dd-6c591a532bac/loki-querier/0.log" Mar 17 02:39:50 crc kubenswrapper[4755]: I0317 02:39:50.928662 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-ff66c4dc9-95f86_296e81ca-7bf1-44f2-b1a8-bfb13a563134/loki-query-frontend/0.log" Mar 17 02:39:52 crc kubenswrapper[4755]: I0317 02:39:52.247915 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:39:52 crc kubenswrapper[4755]: E0317 02:39:52.248178 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:40:00 crc kubenswrapper[4755]: I0317 02:40:00.147051 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561920-r2bcw"] Mar 17 02:40:00 crc kubenswrapper[4755]: E0317 02:40:00.148900 4755 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3a2142-4abd-4077-906f-eaf5b1707618" containerName="oc" Mar 17 02:40:00 crc kubenswrapper[4755]: I0317 02:40:00.148926 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3a2142-4abd-4077-906f-eaf5b1707618" containerName="oc" Mar 17 02:40:00 crc kubenswrapper[4755]: I0317 02:40:00.149246 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3a2142-4abd-4077-906f-eaf5b1707618" containerName="oc" Mar 17 02:40:00 crc kubenswrapper[4755]: I0317 02:40:00.150470 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561920-r2bcw" Mar 17 02:40:00 crc kubenswrapper[4755]: I0317 02:40:00.152435 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:40:00 crc kubenswrapper[4755]: I0317 02:40:00.153006 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:40:00 crc kubenswrapper[4755]: I0317 02:40:00.153069 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:40:00 crc kubenswrapper[4755]: I0317 02:40:00.159554 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561920-r2bcw"] Mar 17 02:40:00 crc kubenswrapper[4755]: I0317 02:40:00.320990 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8nkw\" (UniqueName: \"kubernetes.io/projected/c74e61c4-d0a5-4394-a8c7-c2f68174056f-kube-api-access-k8nkw\") pod \"auto-csr-approver-29561920-r2bcw\" (UID: \"c74e61c4-d0a5-4394-a8c7-c2f68174056f\") " pod="openshift-infra/auto-csr-approver-29561920-r2bcw" Mar 17 02:40:00 crc kubenswrapper[4755]: I0317 02:40:00.424583 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8nkw\" 
(UniqueName: \"kubernetes.io/projected/c74e61c4-d0a5-4394-a8c7-c2f68174056f-kube-api-access-k8nkw\") pod \"auto-csr-approver-29561920-r2bcw\" (UID: \"c74e61c4-d0a5-4394-a8c7-c2f68174056f\") " pod="openshift-infra/auto-csr-approver-29561920-r2bcw" Mar 17 02:40:00 crc kubenswrapper[4755]: I0317 02:40:00.455118 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8nkw\" (UniqueName: \"kubernetes.io/projected/c74e61c4-d0a5-4394-a8c7-c2f68174056f-kube-api-access-k8nkw\") pod \"auto-csr-approver-29561920-r2bcw\" (UID: \"c74e61c4-d0a5-4394-a8c7-c2f68174056f\") " pod="openshift-infra/auto-csr-approver-29561920-r2bcw" Mar 17 02:40:00 crc kubenswrapper[4755]: I0317 02:40:00.483753 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561920-r2bcw" Mar 17 02:40:01 crc kubenswrapper[4755]: I0317 02:40:01.159771 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561920-r2bcw"] Mar 17 02:40:02 crc kubenswrapper[4755]: I0317 02:40:02.135357 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561920-r2bcw" event={"ID":"c74e61c4-d0a5-4394-a8c7-c2f68174056f","Type":"ContainerStarted","Data":"987a6f674b3d9f211398deecee4e4da33f563018d24a09011cbc89d188439f14"} Mar 17 02:40:03 crc kubenswrapper[4755]: I0317 02:40:03.150125 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561920-r2bcw" event={"ID":"c74e61c4-d0a5-4394-a8c7-c2f68174056f","Type":"ContainerStarted","Data":"7f47ea6e82d9e727aafd24f95611fb487215f3bcdfef25135b8cee21509b55aa"} Mar 17 02:40:03 crc kubenswrapper[4755]: I0317 02:40:03.171747 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561920-r2bcw" podStartSLOduration=2.021148469 podStartE2EDuration="3.171719397s" podCreationTimestamp="2026-03-17 02:40:00 +0000 UTC" 
firstStartedPulling="2026-03-17 02:40:01.181363224 +0000 UTC m=+8275.940815517" lastFinishedPulling="2026-03-17 02:40:02.331934152 +0000 UTC m=+8277.091386445" observedRunningTime="2026-03-17 02:40:03.165322765 +0000 UTC m=+8277.924775058" watchObservedRunningTime="2026-03-17 02:40:03.171719397 +0000 UTC m=+8277.931171710" Mar 17 02:40:04 crc kubenswrapper[4755]: I0317 02:40:04.165478 4755 generic.go:334] "Generic (PLEG): container finished" podID="c74e61c4-d0a5-4394-a8c7-c2f68174056f" containerID="7f47ea6e82d9e727aafd24f95611fb487215f3bcdfef25135b8cee21509b55aa" exitCode=0 Mar 17 02:40:04 crc kubenswrapper[4755]: I0317 02:40:04.165624 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561920-r2bcw" event={"ID":"c74e61c4-d0a5-4394-a8c7-c2f68174056f","Type":"ContainerDied","Data":"7f47ea6e82d9e727aafd24f95611fb487215f3bcdfef25135b8cee21509b55aa"} Mar 17 02:40:05 crc kubenswrapper[4755]: I0317 02:40:05.640155 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561920-r2bcw" Mar 17 02:40:05 crc kubenswrapper[4755]: I0317 02:40:05.743697 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8nkw\" (UniqueName: \"kubernetes.io/projected/c74e61c4-d0a5-4394-a8c7-c2f68174056f-kube-api-access-k8nkw\") pod \"c74e61c4-d0a5-4394-a8c7-c2f68174056f\" (UID: \"c74e61c4-d0a5-4394-a8c7-c2f68174056f\") " Mar 17 02:40:05 crc kubenswrapper[4755]: I0317 02:40:05.755789 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c74e61c4-d0a5-4394-a8c7-c2f68174056f-kube-api-access-k8nkw" (OuterVolumeSpecName: "kube-api-access-k8nkw") pod "c74e61c4-d0a5-4394-a8c7-c2f68174056f" (UID: "c74e61c4-d0a5-4394-a8c7-c2f68174056f"). InnerVolumeSpecName "kube-api-access-k8nkw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:40:05 crc kubenswrapper[4755]: I0317 02:40:05.846776 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8nkw\" (UniqueName: \"kubernetes.io/projected/c74e61c4-d0a5-4394-a8c7-c2f68174056f-kube-api-access-k8nkw\") on node \"crc\" DevicePath \"\"" Mar 17 02:40:06 crc kubenswrapper[4755]: I0317 02:40:06.191949 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561920-r2bcw" event={"ID":"c74e61c4-d0a5-4394-a8c7-c2f68174056f","Type":"ContainerDied","Data":"987a6f674b3d9f211398deecee4e4da33f563018d24a09011cbc89d188439f14"} Mar 17 02:40:06 crc kubenswrapper[4755]: I0317 02:40:06.191987 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="987a6f674b3d9f211398deecee4e4da33f563018d24a09011cbc89d188439f14" Mar 17 02:40:06 crc kubenswrapper[4755]: I0317 02:40:06.192043 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561920-r2bcw" Mar 17 02:40:06 crc kubenswrapper[4755]: I0317 02:40:06.299471 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561914-pj7hp"] Mar 17 02:40:06 crc kubenswrapper[4755]: I0317 02:40:06.299510 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561914-pj7hp"] Mar 17 02:40:06 crc kubenswrapper[4755]: I0317 02:40:06.989738 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-hfls9_c2bae47d-2436-490a-8998-6d1f1c59ff6d/kube-rbac-proxy/0.log" Mar 17 02:40:07 crc kubenswrapper[4755]: I0317 02:40:07.037367 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-hfls9_c2bae47d-2436-490a-8998-6d1f1c59ff6d/controller/0.log" Mar 17 02:40:07 crc kubenswrapper[4755]: I0317 02:40:07.186810 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-frr-files/0.log" Mar 17 02:40:07 crc kubenswrapper[4755]: I0317 02:40:07.248158 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:40:07 crc kubenswrapper[4755]: E0317 02:40:07.248486 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:40:07 crc kubenswrapper[4755]: I0317 02:40:07.408298 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-reloader/0.log" Mar 17 02:40:07 crc kubenswrapper[4755]: I0317 02:40:07.409737 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-frr-files/0.log" Mar 17 02:40:07 crc kubenswrapper[4755]: I0317 02:40:07.475770 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-reloader/0.log" Mar 17 02:40:07 crc kubenswrapper[4755]: I0317 02:40:07.490661 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-metrics/0.log" Mar 17 02:40:07 crc kubenswrapper[4755]: I0317 02:40:07.641650 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-metrics/0.log" Mar 17 02:40:07 crc kubenswrapper[4755]: I0317 02:40:07.664703 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-frr-files/0.log" Mar 17 02:40:07 crc kubenswrapper[4755]: I0317 02:40:07.671589 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-metrics/0.log" Mar 17 02:40:07 crc kubenswrapper[4755]: I0317 02:40:07.682521 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-reloader/0.log" Mar 17 02:40:07 crc kubenswrapper[4755]: I0317 02:40:07.823606 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-frr-files/0.log" Mar 17 02:40:07 crc kubenswrapper[4755]: I0317 02:40:07.827617 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-metrics/0.log" Mar 17 02:40:07 crc kubenswrapper[4755]: I0317 02:40:07.899004 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/cp-reloader/0.log" Mar 17 02:40:07 crc kubenswrapper[4755]: I0317 02:40:07.914035 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/controller/0.log" Mar 17 02:40:07 crc kubenswrapper[4755]: I0317 02:40:07.987237 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/frr-metrics/0.log" Mar 17 02:40:08 crc kubenswrapper[4755]: I0317 02:40:08.067077 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/kube-rbac-proxy/0.log" Mar 17 02:40:08 crc kubenswrapper[4755]: I0317 02:40:08.172640 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/kube-rbac-proxy-frr/0.log" Mar 17 02:40:08 crc kubenswrapper[4755]: I0317 02:40:08.195771 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/reloader/0.log" Mar 17 02:40:08 crc kubenswrapper[4755]: I0317 02:40:08.262359 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d86c2427-5a6a-418d-891c-3fb21f54b8c5" path="/var/lib/kubelet/pods/d86c2427-5a6a-418d-891c-3fb21f54b8c5/volumes" Mar 17 02:40:08 crc kubenswrapper[4755]: I0317 02:40:08.365091 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-gv4fl_7a47b75a-e6b5-493f-9ec6-8843b2724a32/frr-k8s-webhook-server/0.log" Mar 17 02:40:08 crc kubenswrapper[4755]: I0317 02:40:08.523008 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7fc56c47d5-cmtzx_4d52b0c9-7534-41b2-b8a7-0f02ef08e1c9/manager/0.log" Mar 17 02:40:08 crc kubenswrapper[4755]: I0317 02:40:08.706798 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-578dbc6b-286wk_356ac706-4ec4-49b8-b270-6c8fa35b7d72/webhook-server/0.log" Mar 17 02:40:08 crc kubenswrapper[4755]: I0317 02:40:08.820932 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ct56p_07bf4fdd-648b-425f-8b00-7ad303c2b77f/kube-rbac-proxy/0.log" Mar 17 02:40:09 crc kubenswrapper[4755]: I0317 02:40:09.532578 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ct56p_07bf4fdd-648b-425f-8b00-7ad303c2b77f/speaker/0.log" Mar 17 02:40:10 crc kubenswrapper[4755]: I0317 02:40:10.673635 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-58c5n_5b9e844c-82cd-4654-9fc4-cf5eb901df30/frr/0.log" Mar 17 02:40:18 crc kubenswrapper[4755]: I0317 
02:40:18.248730 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:40:18 crc kubenswrapper[4755]: E0317 02:40:18.249854 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:40:22 crc kubenswrapper[4755]: I0317 02:40:22.949198 4755 scope.go:117] "RemoveContainer" containerID="914629b879981692b45abfa7ea4a2400c4277408ada9f2172d34a62eb8d2bdf8" Mar 17 02:40:23 crc kubenswrapper[4755]: I0317 02:40:23.960739 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22_53a160ea-9625-4d20-82ab-cae78c0c4911/util/0.log" Mar 17 02:40:24 crc kubenswrapper[4755]: I0317 02:40:24.168015 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22_53a160ea-9625-4d20-82ab-cae78c0c4911/util/0.log" Mar 17 02:40:24 crc kubenswrapper[4755]: I0317 02:40:24.193145 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22_53a160ea-9625-4d20-82ab-cae78c0c4911/pull/0.log" Mar 17 02:40:24 crc kubenswrapper[4755]: I0317 02:40:24.267686 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22_53a160ea-9625-4d20-82ab-cae78c0c4911/pull/0.log" Mar 17 02:40:24 crc kubenswrapper[4755]: I0317 02:40:24.391607 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22_53a160ea-9625-4d20-82ab-cae78c0c4911/util/0.log" Mar 17 02:40:24 crc kubenswrapper[4755]: I0317 02:40:24.409451 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22_53a160ea-9625-4d20-82ab-cae78c0c4911/pull/0.log" Mar 17 02:40:24 crc kubenswrapper[4755]: I0317 02:40:24.410203 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746xh22_53a160ea-9625-4d20-82ab-cae78c0c4911/extract/0.log" Mar 17 02:40:24 crc kubenswrapper[4755]: I0317 02:40:24.567005 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk_92740d40-a460-41c9-9f94-eaac2999c3f7/util/0.log" Mar 17 02:40:24 crc kubenswrapper[4755]: I0317 02:40:24.755623 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk_92740d40-a460-41c9-9f94-eaac2999c3f7/util/0.log" Mar 17 02:40:24 crc kubenswrapper[4755]: I0317 02:40:24.787058 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk_92740d40-a460-41c9-9f94-eaac2999c3f7/pull/0.log" Mar 17 02:40:24 crc kubenswrapper[4755]: I0317 02:40:24.806876 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk_92740d40-a460-41c9-9f94-eaac2999c3f7/pull/0.log" Mar 17 02:40:24 crc kubenswrapper[4755]: I0317 02:40:24.965277 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk_92740d40-a460-41c9-9f94-eaac2999c3f7/pull/0.log" Mar 17 
02:40:24 crc kubenswrapper[4755]: I0317 02:40:24.996320 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk_92740d40-a460-41c9-9f94-eaac2999c3f7/extract/0.log" Mar 17 02:40:25 crc kubenswrapper[4755]: I0317 02:40:25.006758 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ftwwk_92740d40-a460-41c9-9f94-eaac2999c3f7/util/0.log" Mar 17 02:40:25 crc kubenswrapper[4755]: I0317 02:40:25.162990 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt_fb6ad00e-5b17-4cb6-898f-278fc16e8f31/util/0.log" Mar 17 02:40:25 crc kubenswrapper[4755]: I0317 02:40:25.337079 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt_fb6ad00e-5b17-4cb6-898f-278fc16e8f31/pull/0.log" Mar 17 02:40:25 crc kubenswrapper[4755]: I0317 02:40:25.338712 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt_fb6ad00e-5b17-4cb6-898f-278fc16e8f31/pull/0.log" Mar 17 02:40:25 crc kubenswrapper[4755]: I0317 02:40:25.340817 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt_fb6ad00e-5b17-4cb6-898f-278fc16e8f31/util/0.log" Mar 17 02:40:25 crc kubenswrapper[4755]: I0317 02:40:25.504201 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt_fb6ad00e-5b17-4cb6-898f-278fc16e8f31/util/0.log" Mar 17 02:40:25 crc kubenswrapper[4755]: I0317 02:40:25.505206 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt_fb6ad00e-5b17-4cb6-898f-278fc16e8f31/pull/0.log" Mar 17 02:40:25 crc kubenswrapper[4755]: I0317 02:40:25.534797 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5pnltt_fb6ad00e-5b17-4cb6-898f-278fc16e8f31/extract/0.log" Mar 17 02:40:25 crc kubenswrapper[4755]: I0317 02:40:25.690558 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd_44b52216-5e23-4dd1-8a3a-32973449c58c/util/0.log" Mar 17 02:40:25 crc kubenswrapper[4755]: I0317 02:40:25.863073 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd_44b52216-5e23-4dd1-8a3a-32973449c58c/pull/0.log" Mar 17 02:40:25 crc kubenswrapper[4755]: I0317 02:40:25.868785 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd_44b52216-5e23-4dd1-8a3a-32973449c58c/util/0.log" Mar 17 02:40:25 crc kubenswrapper[4755]: I0317 02:40:25.890749 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd_44b52216-5e23-4dd1-8a3a-32973449c58c/pull/0.log" Mar 17 02:40:26 crc kubenswrapper[4755]: I0317 02:40:26.072730 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd_44b52216-5e23-4dd1-8a3a-32973449c58c/pull/0.log" Mar 17 02:40:26 crc kubenswrapper[4755]: I0317 02:40:26.102744 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd_44b52216-5e23-4dd1-8a3a-32973449c58c/util/0.log" Mar 17 
02:40:26 crc kubenswrapper[4755]: I0317 02:40:26.119208 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cff8wd_44b52216-5e23-4dd1-8a3a-32973449c58c/extract/0.log" Mar 17 02:40:26 crc kubenswrapper[4755]: I0317 02:40:26.259342 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c_f6050c97-e228-485e-9b2e-e04588fff1aa/util/0.log" Mar 17 02:40:26 crc kubenswrapper[4755]: I0317 02:40:26.460534 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c_f6050c97-e228-485e-9b2e-e04588fff1aa/pull/0.log" Mar 17 02:40:26 crc kubenswrapper[4755]: I0317 02:40:26.482762 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c_f6050c97-e228-485e-9b2e-e04588fff1aa/pull/0.log" Mar 17 02:40:26 crc kubenswrapper[4755]: I0317 02:40:26.484989 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c_f6050c97-e228-485e-9b2e-e04588fff1aa/util/0.log" Mar 17 02:40:26 crc kubenswrapper[4755]: I0317 02:40:26.663761 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c_f6050c97-e228-485e-9b2e-e04588fff1aa/util/0.log" Mar 17 02:40:26 crc kubenswrapper[4755]: I0317 02:40:26.695411 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c_f6050c97-e228-485e-9b2e-e04588fff1aa/pull/0.log" Mar 17 02:40:26 crc kubenswrapper[4755]: I0317 02:40:26.740770 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lzz6c_f6050c97-e228-485e-9b2e-e04588fff1aa/extract/0.log" Mar 17 02:40:26 crc kubenswrapper[4755]: I0317 02:40:26.854543 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-54fgj_77550708-f4d6-4bd2-901a-bad2b1813e2b/extract-utilities/0.log" Mar 17 02:40:27 crc kubenswrapper[4755]: I0317 02:40:27.009594 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-54fgj_77550708-f4d6-4bd2-901a-bad2b1813e2b/extract-content/0.log" Mar 17 02:40:27 crc kubenswrapper[4755]: I0317 02:40:27.031463 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-54fgj_77550708-f4d6-4bd2-901a-bad2b1813e2b/extract-content/0.log" Mar 17 02:40:27 crc kubenswrapper[4755]: I0317 02:40:27.071516 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-54fgj_77550708-f4d6-4bd2-901a-bad2b1813e2b/extract-utilities/0.log" Mar 17 02:40:27 crc kubenswrapper[4755]: I0317 02:40:27.293256 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-54fgj_77550708-f4d6-4bd2-901a-bad2b1813e2b/extract-content/0.log" Mar 17 02:40:27 crc kubenswrapper[4755]: I0317 02:40:27.350814 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-54fgj_77550708-f4d6-4bd2-901a-bad2b1813e2b/extract-utilities/0.log" Mar 17 02:40:27 crc kubenswrapper[4755]: I0317 02:40:27.556407 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4xgxq_30c7719a-2835-48a7-a0d7-6fc05b2f0e99/extract-utilities/0.log" Mar 17 02:40:27 crc kubenswrapper[4755]: I0317 02:40:27.695327 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-54fgj_77550708-f4d6-4bd2-901a-bad2b1813e2b/registry-server/0.log" Mar 17 02:40:27 crc kubenswrapper[4755]: I0317 02:40:27.700478 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4xgxq_30c7719a-2835-48a7-a0d7-6fc05b2f0e99/extract-utilities/0.log" Mar 17 02:40:27 crc kubenswrapper[4755]: I0317 02:40:27.737097 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4xgxq_30c7719a-2835-48a7-a0d7-6fc05b2f0e99/extract-content/0.log" Mar 17 02:40:27 crc kubenswrapper[4755]: I0317 02:40:27.797161 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4xgxq_30c7719a-2835-48a7-a0d7-6fc05b2f0e99/extract-content/0.log" Mar 17 02:40:27 crc kubenswrapper[4755]: I0317 02:40:27.858449 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4xgxq_30c7719a-2835-48a7-a0d7-6fc05b2f0e99/extract-utilities/0.log" Mar 17 02:40:27 crc kubenswrapper[4755]: I0317 02:40:27.882761 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4xgxq_30c7719a-2835-48a7-a0d7-6fc05b2f0e99/extract-content/0.log" Mar 17 02:40:28 crc kubenswrapper[4755]: I0317 02:40:28.018123 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-mvsj8_277ca8b8-67f5-4fdb-ad34-648ad653fa5d/marketplace-operator/0.log" Mar 17 02:40:28 crc kubenswrapper[4755]: I0317 02:40:28.081324 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p727d_ac12f55c-136b-4cf3-aae6-dca7f5353189/extract-utilities/0.log" Mar 17 02:40:28 crc kubenswrapper[4755]: I0317 02:40:28.240658 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-4xgxq_30c7719a-2835-48a7-a0d7-6fc05b2f0e99/registry-server/0.log" Mar 17 02:40:28 crc kubenswrapper[4755]: I0317 02:40:28.290721 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p727d_ac12f55c-136b-4cf3-aae6-dca7f5353189/extract-content/0.log" Mar 17 02:40:28 crc kubenswrapper[4755]: I0317 02:40:28.322937 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p727d_ac12f55c-136b-4cf3-aae6-dca7f5353189/extract-utilities/0.log" Mar 17 02:40:28 crc kubenswrapper[4755]: I0317 02:40:28.348974 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p727d_ac12f55c-136b-4cf3-aae6-dca7f5353189/extract-content/0.log" Mar 17 02:40:28 crc kubenswrapper[4755]: I0317 02:40:28.472295 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p727d_ac12f55c-136b-4cf3-aae6-dca7f5353189/extract-content/0.log" Mar 17 02:40:28 crc kubenswrapper[4755]: I0317 02:40:28.474752 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p727d_ac12f55c-136b-4cf3-aae6-dca7f5353189/extract-utilities/0.log" Mar 17 02:40:28 crc kubenswrapper[4755]: I0317 02:40:28.578759 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chf29_1add56c9-cfd8-4fa3-b532-e4b952f36683/extract-utilities/0.log" Mar 17 02:40:28 crc kubenswrapper[4755]: I0317 02:40:28.727883 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p727d_ac12f55c-136b-4cf3-aae6-dca7f5353189/registry-server/0.log" Mar 17 02:40:28 crc kubenswrapper[4755]: I0317 02:40:28.774487 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-chf29_1add56c9-cfd8-4fa3-b532-e4b952f36683/extract-content/0.log" Mar 17 02:40:28 crc kubenswrapper[4755]: I0317 02:40:28.778360 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chf29_1add56c9-cfd8-4fa3-b532-e4b952f36683/extract-utilities/0.log" Mar 17 02:40:28 crc kubenswrapper[4755]: I0317 02:40:28.782933 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chf29_1add56c9-cfd8-4fa3-b532-e4b952f36683/extract-content/0.log" Mar 17 02:40:28 crc kubenswrapper[4755]: I0317 02:40:28.984972 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chf29_1add56c9-cfd8-4fa3-b532-e4b952f36683/extract-utilities/0.log" Mar 17 02:40:29 crc kubenswrapper[4755]: I0317 02:40:29.050246 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chf29_1add56c9-cfd8-4fa3-b532-e4b952f36683/extract-content/0.log" Mar 17 02:40:29 crc kubenswrapper[4755]: I0317 02:40:29.617723 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-chf29_1add56c9-cfd8-4fa3-b532-e4b952f36683/registry-server/0.log" Mar 17 02:40:30 crc kubenswrapper[4755]: I0317 02:40:30.248263 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:40:30 crc kubenswrapper[4755]: E0317 02:40:30.248700 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:40:43 crc 
kubenswrapper[4755]: I0317 02:40:43.249005 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:40:43 crc kubenswrapper[4755]: E0317 02:40:43.249885 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:40:43 crc kubenswrapper[4755]: I0317 02:40:43.548516 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-bvjlr_26acf5e2-72ee-4d4c-b25b-9d641f0a42df/prometheus-operator/0.log" Mar 17 02:40:43 crc kubenswrapper[4755]: I0317 02:40:43.591684 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-645f745c74-5xmqs_9d5530d5-e196-42d5-b0b9-c089b13d97a8/prometheus-operator-admission-webhook/0.log" Mar 17 02:40:43 crc kubenswrapper[4755]: I0317 02:40:43.612697 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-645f745c74-l8tqk_a24f3f6d-abc7-4fc8-b0c1-609a0fdb55c1/prometheus-operator-admission-webhook/0.log" Mar 17 02:40:43 crc kubenswrapper[4755]: I0317 02:40:43.789741 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-2vdc8_96eaea54-65d7-475c-8d91-45ba95bd547a/observability-ui-dashboards/0.log" Mar 17 02:40:43 crc kubenswrapper[4755]: I0317 02:40:43.791177 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-czwsf_70c01555-4d7f-426f-a9a5-fd21462252dc/perses-operator/0.log" Mar 17 02:40:43 
crc kubenswrapper[4755]: I0317 02:40:43.804726 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-4w5t6_24b6289e-88b6-4958-9ce1-539cecddbd1f/operator/0.log" Mar 17 02:40:54 crc kubenswrapper[4755]: I0317 02:40:54.248375 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:40:54 crc kubenswrapper[4755]: E0317 02:40:54.249527 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:40:57 crc kubenswrapper[4755]: I0317 02:40:57.953857 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7c87b9bff5-zjj4w_9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a/kube-rbac-proxy/0.log" Mar 17 02:40:58 crc kubenswrapper[4755]: I0317 02:40:58.066514 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7c87b9bff5-zjj4w_9f91fd3f-25cb-4a6d-b57e-5d1f85be7b1a/manager/0.log" Mar 17 02:41:07 crc kubenswrapper[4755]: I0317 02:41:07.248670 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:41:07 crc kubenswrapper[4755]: E0317 02:41:07.249473 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:41:10 crc kubenswrapper[4755]: I0317 02:41:10.793723 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dwlqn"] Mar 17 02:41:10 crc kubenswrapper[4755]: E0317 02:41:10.795314 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74e61c4-d0a5-4394-a8c7-c2f68174056f" containerName="oc" Mar 17 02:41:10 crc kubenswrapper[4755]: I0317 02:41:10.795333 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74e61c4-d0a5-4394-a8c7-c2f68174056f" containerName="oc" Mar 17 02:41:10 crc kubenswrapper[4755]: I0317 02:41:10.795937 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c74e61c4-d0a5-4394-a8c7-c2f68174056f" containerName="oc" Mar 17 02:41:10 crc kubenswrapper[4755]: I0317 02:41:10.828310 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwlqn" Mar 17 02:41:10 crc kubenswrapper[4755]: I0317 02:41:10.849656 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwlqn"] Mar 17 02:41:10 crc kubenswrapper[4755]: I0317 02:41:10.862566 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c20eeb4-c756-423b-bf25-9e989e5e746c-utilities\") pod \"redhat-marketplace-dwlqn\" (UID: \"7c20eeb4-c756-423b-bf25-9e989e5e746c\") " pod="openshift-marketplace/redhat-marketplace-dwlqn" Mar 17 02:41:10 crc kubenswrapper[4755]: I0317 02:41:10.862705 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c20eeb4-c756-423b-bf25-9e989e5e746c-catalog-content\") pod \"redhat-marketplace-dwlqn\" (UID: \"7c20eeb4-c756-423b-bf25-9e989e5e746c\") " 
pod="openshift-marketplace/redhat-marketplace-dwlqn" Mar 17 02:41:10 crc kubenswrapper[4755]: I0317 02:41:10.862782 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt97v\" (UniqueName: \"kubernetes.io/projected/7c20eeb4-c756-423b-bf25-9e989e5e746c-kube-api-access-lt97v\") pod \"redhat-marketplace-dwlqn\" (UID: \"7c20eeb4-c756-423b-bf25-9e989e5e746c\") " pod="openshift-marketplace/redhat-marketplace-dwlqn" Mar 17 02:41:10 crc kubenswrapper[4755]: I0317 02:41:10.964409 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt97v\" (UniqueName: \"kubernetes.io/projected/7c20eeb4-c756-423b-bf25-9e989e5e746c-kube-api-access-lt97v\") pod \"redhat-marketplace-dwlqn\" (UID: \"7c20eeb4-c756-423b-bf25-9e989e5e746c\") " pod="openshift-marketplace/redhat-marketplace-dwlqn" Mar 17 02:41:10 crc kubenswrapper[4755]: I0317 02:41:10.964716 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c20eeb4-c756-423b-bf25-9e989e5e746c-utilities\") pod \"redhat-marketplace-dwlqn\" (UID: \"7c20eeb4-c756-423b-bf25-9e989e5e746c\") " pod="openshift-marketplace/redhat-marketplace-dwlqn" Mar 17 02:41:10 crc kubenswrapper[4755]: I0317 02:41:10.964881 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c20eeb4-c756-423b-bf25-9e989e5e746c-catalog-content\") pod \"redhat-marketplace-dwlqn\" (UID: \"7c20eeb4-c756-423b-bf25-9e989e5e746c\") " pod="openshift-marketplace/redhat-marketplace-dwlqn" Mar 17 02:41:10 crc kubenswrapper[4755]: I0317 02:41:10.966265 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c20eeb4-c756-423b-bf25-9e989e5e746c-utilities\") pod \"redhat-marketplace-dwlqn\" (UID: \"7c20eeb4-c756-423b-bf25-9e989e5e746c\") " 
pod="openshift-marketplace/redhat-marketplace-dwlqn" Mar 17 02:41:10 crc kubenswrapper[4755]: I0317 02:41:10.969692 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c20eeb4-c756-423b-bf25-9e989e5e746c-catalog-content\") pod \"redhat-marketplace-dwlqn\" (UID: \"7c20eeb4-c756-423b-bf25-9e989e5e746c\") " pod="openshift-marketplace/redhat-marketplace-dwlqn" Mar 17 02:41:10 crc kubenswrapper[4755]: I0317 02:41:10.989841 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt97v\" (UniqueName: \"kubernetes.io/projected/7c20eeb4-c756-423b-bf25-9e989e5e746c-kube-api-access-lt97v\") pod \"redhat-marketplace-dwlqn\" (UID: \"7c20eeb4-c756-423b-bf25-9e989e5e746c\") " pod="openshift-marketplace/redhat-marketplace-dwlqn" Mar 17 02:41:11 crc kubenswrapper[4755]: I0317 02:41:11.164667 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwlqn" Mar 17 02:41:11 crc kubenswrapper[4755]: I0317 02:41:11.851674 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwlqn"] Mar 17 02:41:11 crc kubenswrapper[4755]: I0317 02:41:11.951203 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwlqn" event={"ID":"7c20eeb4-c756-423b-bf25-9e989e5e746c","Type":"ContainerStarted","Data":"51ccad026aa564865ff6f90d926531132fa6308a1ab61499f0005c3fbe8c79f0"} Mar 17 02:41:12 crc kubenswrapper[4755]: I0317 02:41:12.962845 4755 generic.go:334] "Generic (PLEG): container finished" podID="7c20eeb4-c756-423b-bf25-9e989e5e746c" containerID="24c5ba00f1ec785954f2a42d9ce8abcd20ed2e25b5a7c8b0b7bb8f639c2eb934" exitCode=0 Mar 17 02:41:12 crc kubenswrapper[4755]: I0317 02:41:12.963256 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwlqn" 
event={"ID":"7c20eeb4-c756-423b-bf25-9e989e5e746c","Type":"ContainerDied","Data":"24c5ba00f1ec785954f2a42d9ce8abcd20ed2e25b5a7c8b0b7bb8f639c2eb934"} Mar 17 02:41:13 crc kubenswrapper[4755]: I0317 02:41:13.973576 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwlqn" event={"ID":"7c20eeb4-c756-423b-bf25-9e989e5e746c","Type":"ContainerStarted","Data":"9a36412a27b2ddefe272d3a961d3c6af85d7384a0454a2a290f61b0ba4fd5acd"} Mar 17 02:41:14 crc kubenswrapper[4755]: I0317 02:41:14.983227 4755 generic.go:334] "Generic (PLEG): container finished" podID="7c20eeb4-c756-423b-bf25-9e989e5e746c" containerID="9a36412a27b2ddefe272d3a961d3c6af85d7384a0454a2a290f61b0ba4fd5acd" exitCode=0 Mar 17 02:41:14 crc kubenswrapper[4755]: I0317 02:41:14.983379 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwlqn" event={"ID":"7c20eeb4-c756-423b-bf25-9e989e5e746c","Type":"ContainerDied","Data":"9a36412a27b2ddefe272d3a961d3c6af85d7384a0454a2a290f61b0ba4fd5acd"} Mar 17 02:41:16 crc kubenswrapper[4755]: I0317 02:41:16.000380 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwlqn" event={"ID":"7c20eeb4-c756-423b-bf25-9e989e5e746c","Type":"ContainerStarted","Data":"d7c5031145d48ad7e3377c4bd8329f3b9f68a0e404a78c931ecea138bcc5f0c2"} Mar 17 02:41:16 crc kubenswrapper[4755]: I0317 02:41:16.026826 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dwlqn" podStartSLOduration=3.50538699 podStartE2EDuration="6.026810581s" podCreationTimestamp="2026-03-17 02:41:10 +0000 UTC" firstStartedPulling="2026-03-17 02:41:12.966943975 +0000 UTC m=+8347.726396258" lastFinishedPulling="2026-03-17 02:41:15.488367566 +0000 UTC m=+8350.247819849" observedRunningTime="2026-03-17 02:41:16.014966732 +0000 UTC m=+8350.774419015" watchObservedRunningTime="2026-03-17 02:41:16.026810581 +0000 UTC 
m=+8350.786262854" Mar 17 02:41:16 crc kubenswrapper[4755]: E0317 02:41:16.250696 4755 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.32:50066->38.102.83.32:36119: write tcp 38.102.83.32:50066->38.102.83.32:36119: write: broken pipe Mar 17 02:41:19 crc kubenswrapper[4755]: I0317 02:41:19.249716 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:41:19 crc kubenswrapper[4755]: E0317 02:41:19.251712 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:41:21 crc kubenswrapper[4755]: I0317 02:41:21.164924 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dwlqn" Mar 17 02:41:21 crc kubenswrapper[4755]: I0317 02:41:21.166100 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dwlqn" Mar 17 02:41:21 crc kubenswrapper[4755]: I0317 02:41:21.221879 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dwlqn" Mar 17 02:41:22 crc kubenswrapper[4755]: I0317 02:41:22.125364 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dwlqn" Mar 17 02:41:22 crc kubenswrapper[4755]: I0317 02:41:22.178246 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwlqn"] Mar 17 02:41:24 crc kubenswrapper[4755]: I0317 02:41:24.085693 4755 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-marketplace-dwlqn" podUID="7c20eeb4-c756-423b-bf25-9e989e5e746c" containerName="registry-server" containerID="cri-o://d7c5031145d48ad7e3377c4bd8329f3b9f68a0e404a78c931ecea138bcc5f0c2" gracePeriod=2 Mar 17 02:41:24 crc kubenswrapper[4755]: I0317 02:41:24.743803 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwlqn" Mar 17 02:41:24 crc kubenswrapper[4755]: I0317 02:41:24.915679 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt97v\" (UniqueName: \"kubernetes.io/projected/7c20eeb4-c756-423b-bf25-9e989e5e746c-kube-api-access-lt97v\") pod \"7c20eeb4-c756-423b-bf25-9e989e5e746c\" (UID: \"7c20eeb4-c756-423b-bf25-9e989e5e746c\") " Mar 17 02:41:24 crc kubenswrapper[4755]: I0317 02:41:24.916693 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c20eeb4-c756-423b-bf25-9e989e5e746c-catalog-content\") pod \"7c20eeb4-c756-423b-bf25-9e989e5e746c\" (UID: \"7c20eeb4-c756-423b-bf25-9e989e5e746c\") " Mar 17 02:41:24 crc kubenswrapper[4755]: I0317 02:41:24.916825 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c20eeb4-c756-423b-bf25-9e989e5e746c-utilities\") pod \"7c20eeb4-c756-423b-bf25-9e989e5e746c\" (UID: \"7c20eeb4-c756-423b-bf25-9e989e5e746c\") " Mar 17 02:41:24 crc kubenswrapper[4755]: I0317 02:41:24.917514 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c20eeb4-c756-423b-bf25-9e989e5e746c-utilities" (OuterVolumeSpecName: "utilities") pod "7c20eeb4-c756-423b-bf25-9e989e5e746c" (UID: "7c20eeb4-c756-423b-bf25-9e989e5e746c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:41:24 crc kubenswrapper[4755]: I0317 02:41:24.917896 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c20eeb4-c756-423b-bf25-9e989e5e746c-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:41:24 crc kubenswrapper[4755]: I0317 02:41:24.933043 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c20eeb4-c756-423b-bf25-9e989e5e746c-kube-api-access-lt97v" (OuterVolumeSpecName: "kube-api-access-lt97v") pod "7c20eeb4-c756-423b-bf25-9e989e5e746c" (UID: "7c20eeb4-c756-423b-bf25-9e989e5e746c"). InnerVolumeSpecName "kube-api-access-lt97v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:41:24 crc kubenswrapper[4755]: I0317 02:41:24.979859 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c20eeb4-c756-423b-bf25-9e989e5e746c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c20eeb4-c756-423b-bf25-9e989e5e746c" (UID: "7c20eeb4-c756-423b-bf25-9e989e5e746c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:41:25 crc kubenswrapper[4755]: I0317 02:41:25.021225 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt97v\" (UniqueName: \"kubernetes.io/projected/7c20eeb4-c756-423b-bf25-9e989e5e746c-kube-api-access-lt97v\") on node \"crc\" DevicePath \"\"" Mar 17 02:41:25 crc kubenswrapper[4755]: I0317 02:41:25.021282 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c20eeb4-c756-423b-bf25-9e989e5e746c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:41:25 crc kubenswrapper[4755]: I0317 02:41:25.103662 4755 generic.go:334] "Generic (PLEG): container finished" podID="7c20eeb4-c756-423b-bf25-9e989e5e746c" containerID="d7c5031145d48ad7e3377c4bd8329f3b9f68a0e404a78c931ecea138bcc5f0c2" exitCode=0 Mar 17 02:41:25 crc kubenswrapper[4755]: I0317 02:41:25.103730 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwlqn" event={"ID":"7c20eeb4-c756-423b-bf25-9e989e5e746c","Type":"ContainerDied","Data":"d7c5031145d48ad7e3377c4bd8329f3b9f68a0e404a78c931ecea138bcc5f0c2"} Mar 17 02:41:25 crc kubenswrapper[4755]: I0317 02:41:25.103768 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwlqn" event={"ID":"7c20eeb4-c756-423b-bf25-9e989e5e746c","Type":"ContainerDied","Data":"51ccad026aa564865ff6f90d926531132fa6308a1ab61499f0005c3fbe8c79f0"} Mar 17 02:41:25 crc kubenswrapper[4755]: I0317 02:41:25.103792 4755 scope.go:117] "RemoveContainer" containerID="d7c5031145d48ad7e3377c4bd8329f3b9f68a0e404a78c931ecea138bcc5f0c2" Mar 17 02:41:25 crc kubenswrapper[4755]: I0317 02:41:25.103982 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwlqn" Mar 17 02:41:25 crc kubenswrapper[4755]: I0317 02:41:25.153936 4755 scope.go:117] "RemoveContainer" containerID="9a36412a27b2ddefe272d3a961d3c6af85d7384a0454a2a290f61b0ba4fd5acd" Mar 17 02:41:25 crc kubenswrapper[4755]: I0317 02:41:25.162776 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwlqn"] Mar 17 02:41:25 crc kubenswrapper[4755]: I0317 02:41:25.186466 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwlqn"] Mar 17 02:41:25 crc kubenswrapper[4755]: I0317 02:41:25.210742 4755 scope.go:117] "RemoveContainer" containerID="24c5ba00f1ec785954f2a42d9ce8abcd20ed2e25b5a7c8b0b7bb8f639c2eb934" Mar 17 02:41:25 crc kubenswrapper[4755]: I0317 02:41:25.263784 4755 scope.go:117] "RemoveContainer" containerID="d7c5031145d48ad7e3377c4bd8329f3b9f68a0e404a78c931ecea138bcc5f0c2" Mar 17 02:41:25 crc kubenswrapper[4755]: E0317 02:41:25.265149 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c5031145d48ad7e3377c4bd8329f3b9f68a0e404a78c931ecea138bcc5f0c2\": container with ID starting with d7c5031145d48ad7e3377c4bd8329f3b9f68a0e404a78c931ecea138bcc5f0c2 not found: ID does not exist" containerID="d7c5031145d48ad7e3377c4bd8329f3b9f68a0e404a78c931ecea138bcc5f0c2" Mar 17 02:41:25 crc kubenswrapper[4755]: I0317 02:41:25.265186 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c5031145d48ad7e3377c4bd8329f3b9f68a0e404a78c931ecea138bcc5f0c2"} err="failed to get container status \"d7c5031145d48ad7e3377c4bd8329f3b9f68a0e404a78c931ecea138bcc5f0c2\": rpc error: code = NotFound desc = could not find container \"d7c5031145d48ad7e3377c4bd8329f3b9f68a0e404a78c931ecea138bcc5f0c2\": container with ID starting with d7c5031145d48ad7e3377c4bd8329f3b9f68a0e404a78c931ecea138bcc5f0c2 not found: 
ID does not exist" Mar 17 02:41:25 crc kubenswrapper[4755]: I0317 02:41:25.265210 4755 scope.go:117] "RemoveContainer" containerID="9a36412a27b2ddefe272d3a961d3c6af85d7384a0454a2a290f61b0ba4fd5acd" Mar 17 02:41:25 crc kubenswrapper[4755]: E0317 02:41:25.265772 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a36412a27b2ddefe272d3a961d3c6af85d7384a0454a2a290f61b0ba4fd5acd\": container with ID starting with 9a36412a27b2ddefe272d3a961d3c6af85d7384a0454a2a290f61b0ba4fd5acd not found: ID does not exist" containerID="9a36412a27b2ddefe272d3a961d3c6af85d7384a0454a2a290f61b0ba4fd5acd" Mar 17 02:41:25 crc kubenswrapper[4755]: I0317 02:41:25.265799 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a36412a27b2ddefe272d3a961d3c6af85d7384a0454a2a290f61b0ba4fd5acd"} err="failed to get container status \"9a36412a27b2ddefe272d3a961d3c6af85d7384a0454a2a290f61b0ba4fd5acd\": rpc error: code = NotFound desc = could not find container \"9a36412a27b2ddefe272d3a961d3c6af85d7384a0454a2a290f61b0ba4fd5acd\": container with ID starting with 9a36412a27b2ddefe272d3a961d3c6af85d7384a0454a2a290f61b0ba4fd5acd not found: ID does not exist" Mar 17 02:41:25 crc kubenswrapper[4755]: I0317 02:41:25.265813 4755 scope.go:117] "RemoveContainer" containerID="24c5ba00f1ec785954f2a42d9ce8abcd20ed2e25b5a7c8b0b7bb8f639c2eb934" Mar 17 02:41:25 crc kubenswrapper[4755]: E0317 02:41:25.266024 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24c5ba00f1ec785954f2a42d9ce8abcd20ed2e25b5a7c8b0b7bb8f639c2eb934\": container with ID starting with 24c5ba00f1ec785954f2a42d9ce8abcd20ed2e25b5a7c8b0b7bb8f639c2eb934 not found: ID does not exist" containerID="24c5ba00f1ec785954f2a42d9ce8abcd20ed2e25b5a7c8b0b7bb8f639c2eb934" Mar 17 02:41:25 crc kubenswrapper[4755]: I0317 02:41:25.266049 4755 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24c5ba00f1ec785954f2a42d9ce8abcd20ed2e25b5a7c8b0b7bb8f639c2eb934"} err="failed to get container status \"24c5ba00f1ec785954f2a42d9ce8abcd20ed2e25b5a7c8b0b7bb8f639c2eb934\": rpc error: code = NotFound desc = could not find container \"24c5ba00f1ec785954f2a42d9ce8abcd20ed2e25b5a7c8b0b7bb8f639c2eb934\": container with ID starting with 24c5ba00f1ec785954f2a42d9ce8abcd20ed2e25b5a7c8b0b7bb8f639c2eb934 not found: ID does not exist" Mar 17 02:41:26 crc kubenswrapper[4755]: I0317 02:41:26.278736 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c20eeb4-c756-423b-bf25-9e989e5e746c" path="/var/lib/kubelet/pods/7c20eeb4-c756-423b-bf25-9e989e5e746c/volumes" Mar 17 02:41:32 crc kubenswrapper[4755]: I0317 02:41:32.249973 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:41:32 crc kubenswrapper[4755]: E0317 02:41:32.250956 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:41:48 crc kubenswrapper[4755]: I0317 02:41:48.249188 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:41:48 crc kubenswrapper[4755]: E0317 02:41:48.250105 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:42:00 crc kubenswrapper[4755]: I0317 02:42:00.169162 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561922-nhvmb"] Mar 17 02:42:00 crc kubenswrapper[4755]: E0317 02:42:00.170533 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c20eeb4-c756-423b-bf25-9e989e5e746c" containerName="extract-content" Mar 17 02:42:00 crc kubenswrapper[4755]: I0317 02:42:00.170555 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c20eeb4-c756-423b-bf25-9e989e5e746c" containerName="extract-content" Mar 17 02:42:00 crc kubenswrapper[4755]: E0317 02:42:00.170571 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c20eeb4-c756-423b-bf25-9e989e5e746c" containerName="extract-utilities" Mar 17 02:42:00 crc kubenswrapper[4755]: I0317 02:42:00.170583 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c20eeb4-c756-423b-bf25-9e989e5e746c" containerName="extract-utilities" Mar 17 02:42:00 crc kubenswrapper[4755]: E0317 02:42:00.170640 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c20eeb4-c756-423b-bf25-9e989e5e746c" containerName="registry-server" Mar 17 02:42:00 crc kubenswrapper[4755]: I0317 02:42:00.170653 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c20eeb4-c756-423b-bf25-9e989e5e746c" containerName="registry-server" Mar 17 02:42:00 crc kubenswrapper[4755]: I0317 02:42:00.171017 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c20eeb4-c756-423b-bf25-9e989e5e746c" containerName="registry-server" Mar 17 02:42:00 crc kubenswrapper[4755]: I0317 02:42:00.172548 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561922-nhvmb" Mar 17 02:42:00 crc kubenswrapper[4755]: I0317 02:42:00.175512 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:42:00 crc kubenswrapper[4755]: I0317 02:42:00.175699 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:42:00 crc kubenswrapper[4755]: I0317 02:42:00.178523 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:42:00 crc kubenswrapper[4755]: I0317 02:42:00.185474 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561922-nhvmb"] Mar 17 02:42:00 crc kubenswrapper[4755]: I0317 02:42:00.290979 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzwx2\" (UniqueName: \"kubernetes.io/projected/66a000db-1924-4816-a78c-cd801e89ab54-kube-api-access-pzwx2\") pod \"auto-csr-approver-29561922-nhvmb\" (UID: \"66a000db-1924-4816-a78c-cd801e89ab54\") " pod="openshift-infra/auto-csr-approver-29561922-nhvmb" Mar 17 02:42:00 crc kubenswrapper[4755]: I0317 02:42:00.393197 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzwx2\" (UniqueName: \"kubernetes.io/projected/66a000db-1924-4816-a78c-cd801e89ab54-kube-api-access-pzwx2\") pod \"auto-csr-approver-29561922-nhvmb\" (UID: \"66a000db-1924-4816-a78c-cd801e89ab54\") " pod="openshift-infra/auto-csr-approver-29561922-nhvmb" Mar 17 02:42:00 crc kubenswrapper[4755]: I0317 02:42:00.420903 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzwx2\" (UniqueName: \"kubernetes.io/projected/66a000db-1924-4816-a78c-cd801e89ab54-kube-api-access-pzwx2\") pod \"auto-csr-approver-29561922-nhvmb\" (UID: \"66a000db-1924-4816-a78c-cd801e89ab54\") " 
pod="openshift-infra/auto-csr-approver-29561922-nhvmb" Mar 17 02:42:00 crc kubenswrapper[4755]: I0317 02:42:00.529758 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561922-nhvmb" Mar 17 02:42:01 crc kubenswrapper[4755]: I0317 02:42:01.047162 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561922-nhvmb"] Mar 17 02:42:01 crc kubenswrapper[4755]: I0317 02:42:01.591576 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561922-nhvmb" event={"ID":"66a000db-1924-4816-a78c-cd801e89ab54","Type":"ContainerStarted","Data":"cb02b597276ed8a174d7d8820e0b940cb73137510cfeda53a167028e3d0399d6"} Mar 17 02:42:03 crc kubenswrapper[4755]: I0317 02:42:03.248655 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:42:03 crc kubenswrapper[4755]: E0317 02:42:03.249223 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:42:03 crc kubenswrapper[4755]: I0317 02:42:03.623591 4755 generic.go:334] "Generic (PLEG): container finished" podID="66a000db-1924-4816-a78c-cd801e89ab54" containerID="e214e09ab664778244192f8ad5c1d852a353f936fec97a8c4ba8cf42de4ae713" exitCode=0 Mar 17 02:42:03 crc kubenswrapper[4755]: I0317 02:42:03.623803 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561922-nhvmb" event={"ID":"66a000db-1924-4816-a78c-cd801e89ab54","Type":"ContainerDied","Data":"e214e09ab664778244192f8ad5c1d852a353f936fec97a8c4ba8cf42de4ae713"} 
Mar 17 02:42:05 crc kubenswrapper[4755]: I0317 02:42:05.127267 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561922-nhvmb" Mar 17 02:42:05 crc kubenswrapper[4755]: I0317 02:42:05.216143 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzwx2\" (UniqueName: \"kubernetes.io/projected/66a000db-1924-4816-a78c-cd801e89ab54-kube-api-access-pzwx2\") pod \"66a000db-1924-4816-a78c-cd801e89ab54\" (UID: \"66a000db-1924-4816-a78c-cd801e89ab54\") " Mar 17 02:42:05 crc kubenswrapper[4755]: I0317 02:42:05.224988 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66a000db-1924-4816-a78c-cd801e89ab54-kube-api-access-pzwx2" (OuterVolumeSpecName: "kube-api-access-pzwx2") pod "66a000db-1924-4816-a78c-cd801e89ab54" (UID: "66a000db-1924-4816-a78c-cd801e89ab54"). InnerVolumeSpecName "kube-api-access-pzwx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:42:05 crc kubenswrapper[4755]: I0317 02:42:05.318984 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzwx2\" (UniqueName: \"kubernetes.io/projected/66a000db-1924-4816-a78c-cd801e89ab54-kube-api-access-pzwx2\") on node \"crc\" DevicePath \"\"" Mar 17 02:42:05 crc kubenswrapper[4755]: I0317 02:42:05.649553 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561922-nhvmb" event={"ID":"66a000db-1924-4816-a78c-cd801e89ab54","Type":"ContainerDied","Data":"cb02b597276ed8a174d7d8820e0b940cb73137510cfeda53a167028e3d0399d6"} Mar 17 02:42:05 crc kubenswrapper[4755]: I0317 02:42:05.649863 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb02b597276ed8a174d7d8820e0b940cb73137510cfeda53a167028e3d0399d6" Mar 17 02:42:05 crc kubenswrapper[4755]: I0317 02:42:05.649637 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561922-nhvmb" Mar 17 02:42:06 crc kubenswrapper[4755]: I0317 02:42:06.217053 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561916-7qb2v"] Mar 17 02:42:06 crc kubenswrapper[4755]: I0317 02:42:06.229323 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561916-7qb2v"] Mar 17 02:42:06 crc kubenswrapper[4755]: I0317 02:42:06.270977 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5109517-910d-4f76-899d-2749000ed633" path="/var/lib/kubelet/pods/f5109517-910d-4f76-899d-2749000ed633/volumes" Mar 17 02:42:14 crc kubenswrapper[4755]: I0317 02:42:14.249126 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:42:14 crc kubenswrapper[4755]: E0317 02:42:14.249820 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:42:23 crc kubenswrapper[4755]: I0317 02:42:23.073352 4755 scope.go:117] "RemoveContainer" containerID="39e6aaab3cfd985876d65e000bb307cf4d2cfc95e2fb0fd822cc699a2832d1d0" Mar 17 02:42:28 crc kubenswrapper[4755]: I0317 02:42:28.252176 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:42:28 crc kubenswrapper[4755]: E0317 02:42:28.252930 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bhh2x_openshift-machine-config-operator(2de863ac-0be1-45c8-9e03-56aa0fe9a23d)\"" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" Mar 17 02:42:37 crc kubenswrapper[4755]: I0317 02:42:37.061558 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lvnjj"] Mar 17 02:42:37 crc kubenswrapper[4755]: E0317 02:42:37.062919 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a000db-1924-4816-a78c-cd801e89ab54" containerName="oc" Mar 17 02:42:37 crc kubenswrapper[4755]: I0317 02:42:37.062937 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a000db-1924-4816-a78c-cd801e89ab54" containerName="oc" Mar 17 02:42:37 crc kubenswrapper[4755]: I0317 02:42:37.063407 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="66a000db-1924-4816-a78c-cd801e89ab54" containerName="oc" Mar 17 02:42:37 crc kubenswrapper[4755]: I0317 02:42:37.065426 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lvnjj" Mar 17 02:42:37 crc kubenswrapper[4755]: I0317 02:42:37.075296 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e10c1f27-e219-42bd-afba-088e303df08a-catalog-content\") pod \"community-operators-lvnjj\" (UID: \"e10c1f27-e219-42bd-afba-088e303df08a\") " pod="openshift-marketplace/community-operators-lvnjj" Mar 17 02:42:37 crc kubenswrapper[4755]: I0317 02:42:37.075354 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e10c1f27-e219-42bd-afba-088e303df08a-utilities\") pod \"community-operators-lvnjj\" (UID: \"e10c1f27-e219-42bd-afba-088e303df08a\") " pod="openshift-marketplace/community-operators-lvnjj" Mar 17 02:42:37 crc kubenswrapper[4755]: I0317 02:42:37.075688 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qtd2\" (UniqueName: \"kubernetes.io/projected/e10c1f27-e219-42bd-afba-088e303df08a-kube-api-access-8qtd2\") pod \"community-operators-lvnjj\" (UID: \"e10c1f27-e219-42bd-afba-088e303df08a\") " pod="openshift-marketplace/community-operators-lvnjj" Mar 17 02:42:37 crc kubenswrapper[4755]: I0317 02:42:37.084169 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lvnjj"] Mar 17 02:42:37 crc kubenswrapper[4755]: I0317 02:42:37.176554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qtd2\" (UniqueName: \"kubernetes.io/projected/e10c1f27-e219-42bd-afba-088e303df08a-kube-api-access-8qtd2\") pod \"community-operators-lvnjj\" (UID: \"e10c1f27-e219-42bd-afba-088e303df08a\") " pod="openshift-marketplace/community-operators-lvnjj" Mar 17 02:42:37 crc kubenswrapper[4755]: I0317 02:42:37.176649 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e10c1f27-e219-42bd-afba-088e303df08a-catalog-content\") pod \"community-operators-lvnjj\" (UID: \"e10c1f27-e219-42bd-afba-088e303df08a\") " pod="openshift-marketplace/community-operators-lvnjj" Mar 17 02:42:37 crc kubenswrapper[4755]: I0317 02:42:37.176678 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e10c1f27-e219-42bd-afba-088e303df08a-utilities\") pod \"community-operators-lvnjj\" (UID: \"e10c1f27-e219-42bd-afba-088e303df08a\") " pod="openshift-marketplace/community-operators-lvnjj" Mar 17 02:42:37 crc kubenswrapper[4755]: I0317 02:42:37.177325 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e10c1f27-e219-42bd-afba-088e303df08a-utilities\") pod \"community-operators-lvnjj\" (UID: \"e10c1f27-e219-42bd-afba-088e303df08a\") " pod="openshift-marketplace/community-operators-lvnjj" Mar 17 02:42:37 crc kubenswrapper[4755]: I0317 02:42:37.177572 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e10c1f27-e219-42bd-afba-088e303df08a-catalog-content\") pod \"community-operators-lvnjj\" (UID: \"e10c1f27-e219-42bd-afba-088e303df08a\") " pod="openshift-marketplace/community-operators-lvnjj" Mar 17 02:42:37 crc kubenswrapper[4755]: I0317 02:42:37.200117 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qtd2\" (UniqueName: \"kubernetes.io/projected/e10c1f27-e219-42bd-afba-088e303df08a-kube-api-access-8qtd2\") pod \"community-operators-lvnjj\" (UID: \"e10c1f27-e219-42bd-afba-088e303df08a\") " pod="openshift-marketplace/community-operators-lvnjj" Mar 17 02:42:37 crc kubenswrapper[4755]: I0317 02:42:37.387479 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lvnjj" Mar 17 02:42:37 crc kubenswrapper[4755]: I0317 02:42:37.951144 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lvnjj"] Mar 17 02:42:38 crc kubenswrapper[4755]: I0317 02:42:38.193574 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvnjj" event={"ID":"e10c1f27-e219-42bd-afba-088e303df08a","Type":"ContainerStarted","Data":"f37922572d997671ca332918aa1f0ad15f7079a1c512737c8fcd4c5d9ebdba20"} Mar 17 02:42:38 crc kubenswrapper[4755]: I0317 02:42:38.193885 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvnjj" event={"ID":"e10c1f27-e219-42bd-afba-088e303df08a","Type":"ContainerStarted","Data":"9885d4bf8901f155e044e055a88dae717c3b6996f91f0697014931f09299decc"} Mar 17 02:42:39 crc kubenswrapper[4755]: I0317 02:42:39.204110 4755 generic.go:334] "Generic (PLEG): container finished" podID="e10c1f27-e219-42bd-afba-088e303df08a" containerID="f37922572d997671ca332918aa1f0ad15f7079a1c512737c8fcd4c5d9ebdba20" exitCode=0 Mar 17 02:42:39 crc kubenswrapper[4755]: I0317 02:42:39.204153 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvnjj" event={"ID":"e10c1f27-e219-42bd-afba-088e303df08a","Type":"ContainerDied","Data":"f37922572d997671ca332918aa1f0ad15f7079a1c512737c8fcd4c5d9ebdba20"} Mar 17 02:42:40 crc kubenswrapper[4755]: I0317 02:42:40.249262 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:42:41 crc kubenswrapper[4755]: I0317 02:42:41.227970 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"03cd9e2c8bd05e274a5a543edf9c7fe26cd5d64050000f43dd1cfb6968e42b17"} Mar 
17 02:42:41 crc kubenswrapper[4755]: I0317 02:42:41.233962 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvnjj" event={"ID":"e10c1f27-e219-42bd-afba-088e303df08a","Type":"ContainerStarted","Data":"94f67b85ce069a7a23e0752a940530581159b9b53cb60ad651d7b0cf78702262"} Mar 17 02:42:43 crc kubenswrapper[4755]: I0317 02:42:43.360701 4755 generic.go:334] "Generic (PLEG): container finished" podID="e10c1f27-e219-42bd-afba-088e303df08a" containerID="94f67b85ce069a7a23e0752a940530581159b9b53cb60ad651d7b0cf78702262" exitCode=0 Mar 17 02:42:43 crc kubenswrapper[4755]: I0317 02:42:43.361155 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvnjj" event={"ID":"e10c1f27-e219-42bd-afba-088e303df08a","Type":"ContainerDied","Data":"94f67b85ce069a7a23e0752a940530581159b9b53cb60ad651d7b0cf78702262"} Mar 17 02:42:44 crc kubenswrapper[4755]: I0317 02:42:44.382305 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvnjj" event={"ID":"e10c1f27-e219-42bd-afba-088e303df08a","Type":"ContainerStarted","Data":"4b99088eefd10db862698380d7a5e3613b02a21d72c5cf67d67e361ac97d51e8"} Mar 17 02:42:44 crc kubenswrapper[4755]: I0317 02:42:44.415193 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lvnjj" podStartSLOduration=2.727839883 podStartE2EDuration="7.415156153s" podCreationTimestamp="2026-03-17 02:42:37 +0000 UTC" firstStartedPulling="2026-03-17 02:42:39.207640351 +0000 UTC m=+8433.967092634" lastFinishedPulling="2026-03-17 02:42:43.894956591 +0000 UTC m=+8438.654408904" observedRunningTime="2026-03-17 02:42:44.399285346 +0000 UTC m=+8439.158737669" watchObservedRunningTime="2026-03-17 02:42:44.415156153 +0000 UTC m=+8439.174608486" Mar 17 02:42:47 crc kubenswrapper[4755]: I0317 02:42:47.388629 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-lvnjj" Mar 17 02:42:47 crc kubenswrapper[4755]: I0317 02:42:47.389074 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lvnjj" Mar 17 02:42:48 crc kubenswrapper[4755]: I0317 02:42:48.465262 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-lvnjj" podUID="e10c1f27-e219-42bd-afba-088e303df08a" containerName="registry-server" probeResult="failure" output=< Mar 17 02:42:48 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:42:48 crc kubenswrapper[4755]: > Mar 17 02:42:51 crc kubenswrapper[4755]: I0317 02:42:51.457159 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jvvmx"] Mar 17 02:42:51 crc kubenswrapper[4755]: I0317 02:42:51.460011 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jvvmx" Mar 17 02:42:51 crc kubenswrapper[4755]: I0317 02:42:51.481862 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jvvmx"] Mar 17 02:42:51 crc kubenswrapper[4755]: I0317 02:42:51.563793 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-catalog-content\") pod \"redhat-operators-jvvmx\" (UID: \"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0\") " pod="openshift-marketplace/redhat-operators-jvvmx" Mar 17 02:42:51 crc kubenswrapper[4755]: I0317 02:42:51.563859 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxckw\" (UniqueName: \"kubernetes.io/projected/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-kube-api-access-zxckw\") pod \"redhat-operators-jvvmx\" (UID: \"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0\") " 
pod="openshift-marketplace/redhat-operators-jvvmx" Mar 17 02:42:51 crc kubenswrapper[4755]: I0317 02:42:51.564066 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-utilities\") pod \"redhat-operators-jvvmx\" (UID: \"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0\") " pod="openshift-marketplace/redhat-operators-jvvmx" Mar 17 02:42:51 crc kubenswrapper[4755]: I0317 02:42:51.666308 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxckw\" (UniqueName: \"kubernetes.io/projected/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-kube-api-access-zxckw\") pod \"redhat-operators-jvvmx\" (UID: \"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0\") " pod="openshift-marketplace/redhat-operators-jvvmx" Mar 17 02:42:51 crc kubenswrapper[4755]: I0317 02:42:51.666507 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-utilities\") pod \"redhat-operators-jvvmx\" (UID: \"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0\") " pod="openshift-marketplace/redhat-operators-jvvmx" Mar 17 02:42:51 crc kubenswrapper[4755]: I0317 02:42:51.666590 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-catalog-content\") pod \"redhat-operators-jvvmx\" (UID: \"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0\") " pod="openshift-marketplace/redhat-operators-jvvmx" Mar 17 02:42:51 crc kubenswrapper[4755]: I0317 02:42:51.667005 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-catalog-content\") pod \"redhat-operators-jvvmx\" (UID: \"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0\") " 
pod="openshift-marketplace/redhat-operators-jvvmx" Mar 17 02:42:51 crc kubenswrapper[4755]: I0317 02:42:51.667504 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-utilities\") pod \"redhat-operators-jvvmx\" (UID: \"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0\") " pod="openshift-marketplace/redhat-operators-jvvmx" Mar 17 02:42:51 crc kubenswrapper[4755]: I0317 02:42:51.694653 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxckw\" (UniqueName: \"kubernetes.io/projected/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-kube-api-access-zxckw\") pod \"redhat-operators-jvvmx\" (UID: \"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0\") " pod="openshift-marketplace/redhat-operators-jvvmx" Mar 17 02:42:51 crc kubenswrapper[4755]: I0317 02:42:51.780673 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jvvmx" Mar 17 02:42:52 crc kubenswrapper[4755]: I0317 02:42:52.269551 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jvvmx"] Mar 17 02:42:52 crc kubenswrapper[4755]: I0317 02:42:52.484088 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvvmx" event={"ID":"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0","Type":"ContainerStarted","Data":"19632d7808df51b19c5a2e48e782edd0e38cf99721aeaf490b31c8e548a0a647"} Mar 17 02:42:52 crc kubenswrapper[4755]: I0317 02:42:52.484461 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvvmx" event={"ID":"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0","Type":"ContainerStarted","Data":"96dd8ebc4344690124bd4bb302dbcc6f5c71166ea64e0fe68037cfa2ca58c0f5"} Mar 17 02:42:53 crc kubenswrapper[4755]: I0317 02:42:53.507000 4755 generic.go:334] "Generic (PLEG): container finished" podID="3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" 
containerID="19632d7808df51b19c5a2e48e782edd0e38cf99721aeaf490b31c8e548a0a647" exitCode=0 Mar 17 02:42:53 crc kubenswrapper[4755]: I0317 02:42:53.507286 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvvmx" event={"ID":"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0","Type":"ContainerDied","Data":"19632d7808df51b19c5a2e48e782edd0e38cf99721aeaf490b31c8e548a0a647"} Mar 17 02:42:54 crc kubenswrapper[4755]: I0317 02:42:54.526515 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvvmx" event={"ID":"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0","Type":"ContainerStarted","Data":"1a2f94d86691487e2ced3750fd2ea63325f5988d2457ce7211de91229c87189c"} Mar 17 02:42:57 crc kubenswrapper[4755]: I0317 02:42:57.468927 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lvnjj" Mar 17 02:42:57 crc kubenswrapper[4755]: I0317 02:42:57.531680 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lvnjj" Mar 17 02:42:57 crc kubenswrapper[4755]: I0317 02:42:57.829368 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lvnjj"] Mar 17 02:42:58 crc kubenswrapper[4755]: I0317 02:42:58.582508 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lvnjj" podUID="e10c1f27-e219-42bd-afba-088e303df08a" containerName="registry-server" containerID="cri-o://4b99088eefd10db862698380d7a5e3613b02a21d72c5cf67d67e361ac97d51e8" gracePeriod=2 Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.378116 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lvnjj" Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.453097 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qtd2\" (UniqueName: \"kubernetes.io/projected/e10c1f27-e219-42bd-afba-088e303df08a-kube-api-access-8qtd2\") pod \"e10c1f27-e219-42bd-afba-088e303df08a\" (UID: \"e10c1f27-e219-42bd-afba-088e303df08a\") " Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.453187 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e10c1f27-e219-42bd-afba-088e303df08a-utilities\") pod \"e10c1f27-e219-42bd-afba-088e303df08a\" (UID: \"e10c1f27-e219-42bd-afba-088e303df08a\") " Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.453334 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e10c1f27-e219-42bd-afba-088e303df08a-catalog-content\") pod \"e10c1f27-e219-42bd-afba-088e303df08a\" (UID: \"e10c1f27-e219-42bd-afba-088e303df08a\") " Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.454494 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e10c1f27-e219-42bd-afba-088e303df08a-utilities" (OuterVolumeSpecName: "utilities") pod "e10c1f27-e219-42bd-afba-088e303df08a" (UID: "e10c1f27-e219-42bd-afba-088e303df08a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.465543 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e10c1f27-e219-42bd-afba-088e303df08a-kube-api-access-8qtd2" (OuterVolumeSpecName: "kube-api-access-8qtd2") pod "e10c1f27-e219-42bd-afba-088e303df08a" (UID: "e10c1f27-e219-42bd-afba-088e303df08a"). InnerVolumeSpecName "kube-api-access-8qtd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.523746 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e10c1f27-e219-42bd-afba-088e303df08a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e10c1f27-e219-42bd-afba-088e303df08a" (UID: "e10c1f27-e219-42bd-afba-088e303df08a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.561121 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qtd2\" (UniqueName: \"kubernetes.io/projected/e10c1f27-e219-42bd-afba-088e303df08a-kube-api-access-8qtd2\") on node \"crc\" DevicePath \"\"" Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.561163 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e10c1f27-e219-42bd-afba-088e303df08a-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.561176 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e10c1f27-e219-42bd-afba-088e303df08a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.596520 4755 generic.go:334] "Generic (PLEG): container finished" podID="e10c1f27-e219-42bd-afba-088e303df08a" containerID="4b99088eefd10db862698380d7a5e3613b02a21d72c5cf67d67e361ac97d51e8" exitCode=0 Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.596566 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvnjj" event={"ID":"e10c1f27-e219-42bd-afba-088e303df08a","Type":"ContainerDied","Data":"4b99088eefd10db862698380d7a5e3613b02a21d72c5cf67d67e361ac97d51e8"} Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.596586 4755 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-lvnjj" Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.596600 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvnjj" event={"ID":"e10c1f27-e219-42bd-afba-088e303df08a","Type":"ContainerDied","Data":"9885d4bf8901f155e044e055a88dae717c3b6996f91f0697014931f09299decc"} Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.596619 4755 scope.go:117] "RemoveContainer" containerID="4b99088eefd10db862698380d7a5e3613b02a21d72c5cf67d67e361ac97d51e8" Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.632260 4755 scope.go:117] "RemoveContainer" containerID="94f67b85ce069a7a23e0752a940530581159b9b53cb60ad651d7b0cf78702262" Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.639527 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lvnjj"] Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.652052 4755 scope.go:117] "RemoveContainer" containerID="f37922572d997671ca332918aa1f0ad15f7079a1c512737c8fcd4c5d9ebdba20" Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.674738 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lvnjj"] Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.708651 4755 scope.go:117] "RemoveContainer" containerID="4b99088eefd10db862698380d7a5e3613b02a21d72c5cf67d67e361ac97d51e8" Mar 17 02:42:59 crc kubenswrapper[4755]: E0317 02:42:59.709143 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b99088eefd10db862698380d7a5e3613b02a21d72c5cf67d67e361ac97d51e8\": container with ID starting with 4b99088eefd10db862698380d7a5e3613b02a21d72c5cf67d67e361ac97d51e8 not found: ID does not exist" containerID="4b99088eefd10db862698380d7a5e3613b02a21d72c5cf67d67e361ac97d51e8" Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.709190 
4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b99088eefd10db862698380d7a5e3613b02a21d72c5cf67d67e361ac97d51e8"} err="failed to get container status \"4b99088eefd10db862698380d7a5e3613b02a21d72c5cf67d67e361ac97d51e8\": rpc error: code = NotFound desc = could not find container \"4b99088eefd10db862698380d7a5e3613b02a21d72c5cf67d67e361ac97d51e8\": container with ID starting with 4b99088eefd10db862698380d7a5e3613b02a21d72c5cf67d67e361ac97d51e8 not found: ID does not exist" Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.709217 4755 scope.go:117] "RemoveContainer" containerID="94f67b85ce069a7a23e0752a940530581159b9b53cb60ad651d7b0cf78702262" Mar 17 02:42:59 crc kubenswrapper[4755]: E0317 02:42:59.709659 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94f67b85ce069a7a23e0752a940530581159b9b53cb60ad651d7b0cf78702262\": container with ID starting with 94f67b85ce069a7a23e0752a940530581159b9b53cb60ad651d7b0cf78702262 not found: ID does not exist" containerID="94f67b85ce069a7a23e0752a940530581159b9b53cb60ad651d7b0cf78702262" Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.709705 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94f67b85ce069a7a23e0752a940530581159b9b53cb60ad651d7b0cf78702262"} err="failed to get container status \"94f67b85ce069a7a23e0752a940530581159b9b53cb60ad651d7b0cf78702262\": rpc error: code = NotFound desc = could not find container \"94f67b85ce069a7a23e0752a940530581159b9b53cb60ad651d7b0cf78702262\": container with ID starting with 94f67b85ce069a7a23e0752a940530581159b9b53cb60ad651d7b0cf78702262 not found: ID does not exist" Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.709736 4755 scope.go:117] "RemoveContainer" containerID="f37922572d997671ca332918aa1f0ad15f7079a1c512737c8fcd4c5d9ebdba20" Mar 17 02:42:59 crc kubenswrapper[4755]: E0317 
02:42:59.710126 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f37922572d997671ca332918aa1f0ad15f7079a1c512737c8fcd4c5d9ebdba20\": container with ID starting with f37922572d997671ca332918aa1f0ad15f7079a1c512737c8fcd4c5d9ebdba20 not found: ID does not exist" containerID="f37922572d997671ca332918aa1f0ad15f7079a1c512737c8fcd4c5d9ebdba20" Mar 17 02:42:59 crc kubenswrapper[4755]: I0317 02:42:59.710149 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f37922572d997671ca332918aa1f0ad15f7079a1c512737c8fcd4c5d9ebdba20"} err="failed to get container status \"f37922572d997671ca332918aa1f0ad15f7079a1c512737c8fcd4c5d9ebdba20\": rpc error: code = NotFound desc = could not find container \"f37922572d997671ca332918aa1f0ad15f7079a1c512737c8fcd4c5d9ebdba20\": container with ID starting with f37922572d997671ca332918aa1f0ad15f7079a1c512737c8fcd4c5d9ebdba20 not found: ID does not exist" Mar 17 02:43:00 crc kubenswrapper[4755]: I0317 02:43:00.265428 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e10c1f27-e219-42bd-afba-088e303df08a" path="/var/lib/kubelet/pods/e10c1f27-e219-42bd-afba-088e303df08a/volumes" Mar 17 02:43:00 crc kubenswrapper[4755]: I0317 02:43:00.614659 4755 generic.go:334] "Generic (PLEG): container finished" podID="3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" containerID="1a2f94d86691487e2ced3750fd2ea63325f5988d2457ce7211de91229c87189c" exitCode=0 Mar 17 02:43:00 crc kubenswrapper[4755]: I0317 02:43:00.614743 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvvmx" event={"ID":"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0","Type":"ContainerDied","Data":"1a2f94d86691487e2ced3750fd2ea63325f5988d2457ce7211de91229c87189c"} Mar 17 02:43:01 crc kubenswrapper[4755]: I0317 02:43:01.635419 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-jvvmx" event={"ID":"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0","Type":"ContainerStarted","Data":"0fe82d2dcab3ae61d67f313470060c57c0e136111f5fcc32e125a08448b77617"} Mar 17 02:43:01 crc kubenswrapper[4755]: I0317 02:43:01.667377 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jvvmx" podStartSLOduration=3.050799383 podStartE2EDuration="10.667352434s" podCreationTimestamp="2026-03-17 02:42:51 +0000 UTC" firstStartedPulling="2026-03-17 02:42:53.509352016 +0000 UTC m=+8448.268804299" lastFinishedPulling="2026-03-17 02:43:01.125905067 +0000 UTC m=+8455.885357350" observedRunningTime="2026-03-17 02:43:01.656543272 +0000 UTC m=+8456.415995595" watchObservedRunningTime="2026-03-17 02:43:01.667352434 +0000 UTC m=+8456.426804737" Mar 17 02:43:01 crc kubenswrapper[4755]: I0317 02:43:01.780858 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jvvmx" Mar 17 02:43:01 crc kubenswrapper[4755]: I0317 02:43:01.780944 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jvvmx" Mar 17 02:43:02 crc kubenswrapper[4755]: I0317 02:43:02.875070 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jvvmx" podUID="3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" containerName="registry-server" probeResult="failure" output=< Mar 17 02:43:02 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:43:02 crc kubenswrapper[4755]: > Mar 17 02:43:07 crc kubenswrapper[4755]: I0317 02:43:07.699146 4755 generic.go:334] "Generic (PLEG): container finished" podID="17133d6e-4462-478a-a0d9-7be42c11c490" containerID="35e6db7febbabb1242443666d05f762e456247461125f4fc50ca7024a5f1ce06" exitCode=0 Mar 17 02:43:07 crc kubenswrapper[4755]: I0317 02:43:07.699195 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-4ksgn/must-gather-mgjmq" event={"ID":"17133d6e-4462-478a-a0d9-7be42c11c490","Type":"ContainerDied","Data":"35e6db7febbabb1242443666d05f762e456247461125f4fc50ca7024a5f1ce06"} Mar 17 02:43:07 crc kubenswrapper[4755]: I0317 02:43:07.699970 4755 scope.go:117] "RemoveContainer" containerID="35e6db7febbabb1242443666d05f762e456247461125f4fc50ca7024a5f1ce06" Mar 17 02:43:07 crc kubenswrapper[4755]: I0317 02:43:07.943086 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4ksgn_must-gather-mgjmq_17133d6e-4462-478a-a0d9-7be42c11c490/gather/0.log" Mar 17 02:43:12 crc kubenswrapper[4755]: I0317 02:43:12.848832 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jvvmx" podUID="3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" containerName="registry-server" probeResult="failure" output=< Mar 17 02:43:12 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:43:12 crc kubenswrapper[4755]: > Mar 17 02:43:22 crc kubenswrapper[4755]: I0317 02:43:22.831850 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jvvmx" podUID="3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" containerName="registry-server" probeResult="failure" output=< Mar 17 02:43:22 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:43:22 crc kubenswrapper[4755]: > Mar 17 02:43:24 crc kubenswrapper[4755]: I0317 02:43:24.929364 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4ksgn/must-gather-mgjmq"] Mar 17 02:43:24 crc kubenswrapper[4755]: I0317 02:43:24.930777 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4ksgn/must-gather-mgjmq" podUID="17133d6e-4462-478a-a0d9-7be42c11c490" containerName="copy" containerID="cri-o://749fe593a88d46c123f10658aa07ad773bb716beffaedffd889c093fd9d81d79" gracePeriod=2 Mar 17 02:43:24 crc 
kubenswrapper[4755]: I0317 02:43:24.947013 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4ksgn/must-gather-mgjmq"] Mar 17 02:43:25 crc kubenswrapper[4755]: I0317 02:43:25.712053 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4ksgn_must-gather-mgjmq_17133d6e-4462-478a-a0d9-7be42c11c490/copy/0.log" Mar 17 02:43:25 crc kubenswrapper[4755]: I0317 02:43:25.713303 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4ksgn/must-gather-mgjmq" Mar 17 02:43:25 crc kubenswrapper[4755]: I0317 02:43:25.849000 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwm8q\" (UniqueName: \"kubernetes.io/projected/17133d6e-4462-478a-a0d9-7be42c11c490-kube-api-access-cwm8q\") pod \"17133d6e-4462-478a-a0d9-7be42c11c490\" (UID: \"17133d6e-4462-478a-a0d9-7be42c11c490\") " Mar 17 02:43:25 crc kubenswrapper[4755]: I0317 02:43:25.849065 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/17133d6e-4462-478a-a0d9-7be42c11c490-must-gather-output\") pod \"17133d6e-4462-478a-a0d9-7be42c11c490\" (UID: \"17133d6e-4462-478a-a0d9-7be42c11c490\") " Mar 17 02:43:25 crc kubenswrapper[4755]: I0317 02:43:25.855301 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17133d6e-4462-478a-a0d9-7be42c11c490-kube-api-access-cwm8q" (OuterVolumeSpecName: "kube-api-access-cwm8q") pod "17133d6e-4462-478a-a0d9-7be42c11c490" (UID: "17133d6e-4462-478a-a0d9-7be42c11c490"). InnerVolumeSpecName "kube-api-access-cwm8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:43:25 crc kubenswrapper[4755]: I0317 02:43:25.904051 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4ksgn_must-gather-mgjmq_17133d6e-4462-478a-a0d9-7be42c11c490/copy/0.log" Mar 17 02:43:25 crc kubenswrapper[4755]: I0317 02:43:25.904392 4755 generic.go:334] "Generic (PLEG): container finished" podID="17133d6e-4462-478a-a0d9-7be42c11c490" containerID="749fe593a88d46c123f10658aa07ad773bb716beffaedffd889c093fd9d81d79" exitCode=143 Mar 17 02:43:25 crc kubenswrapper[4755]: I0317 02:43:25.904456 4755 scope.go:117] "RemoveContainer" containerID="749fe593a88d46c123f10658aa07ad773bb716beffaedffd889c093fd9d81d79" Mar 17 02:43:25 crc kubenswrapper[4755]: I0317 02:43:25.904580 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4ksgn/must-gather-mgjmq" Mar 17 02:43:25 crc kubenswrapper[4755]: I0317 02:43:25.928097 4755 scope.go:117] "RemoveContainer" containerID="35e6db7febbabb1242443666d05f762e456247461125f4fc50ca7024a5f1ce06" Mar 17 02:43:25 crc kubenswrapper[4755]: I0317 02:43:25.952504 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwm8q\" (UniqueName: \"kubernetes.io/projected/17133d6e-4462-478a-a0d9-7be42c11c490-kube-api-access-cwm8q\") on node \"crc\" DevicePath \"\"" Mar 17 02:43:25 crc kubenswrapper[4755]: I0317 02:43:25.989401 4755 scope.go:117] "RemoveContainer" containerID="749fe593a88d46c123f10658aa07ad773bb716beffaedffd889c093fd9d81d79" Mar 17 02:43:25 crc kubenswrapper[4755]: E0317 02:43:25.990297 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"749fe593a88d46c123f10658aa07ad773bb716beffaedffd889c093fd9d81d79\": container with ID starting with 749fe593a88d46c123f10658aa07ad773bb716beffaedffd889c093fd9d81d79 not found: ID does not exist" 
containerID="749fe593a88d46c123f10658aa07ad773bb716beffaedffd889c093fd9d81d79" Mar 17 02:43:25 crc kubenswrapper[4755]: I0317 02:43:25.990331 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749fe593a88d46c123f10658aa07ad773bb716beffaedffd889c093fd9d81d79"} err="failed to get container status \"749fe593a88d46c123f10658aa07ad773bb716beffaedffd889c093fd9d81d79\": rpc error: code = NotFound desc = could not find container \"749fe593a88d46c123f10658aa07ad773bb716beffaedffd889c093fd9d81d79\": container with ID starting with 749fe593a88d46c123f10658aa07ad773bb716beffaedffd889c093fd9d81d79 not found: ID does not exist" Mar 17 02:43:25 crc kubenswrapper[4755]: I0317 02:43:25.990361 4755 scope.go:117] "RemoveContainer" containerID="35e6db7febbabb1242443666d05f762e456247461125f4fc50ca7024a5f1ce06" Mar 17 02:43:25 crc kubenswrapper[4755]: E0317 02:43:25.991555 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35e6db7febbabb1242443666d05f762e456247461125f4fc50ca7024a5f1ce06\": container with ID starting with 35e6db7febbabb1242443666d05f762e456247461125f4fc50ca7024a5f1ce06 not found: ID does not exist" containerID="35e6db7febbabb1242443666d05f762e456247461125f4fc50ca7024a5f1ce06" Mar 17 02:43:25 crc kubenswrapper[4755]: I0317 02:43:25.991579 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35e6db7febbabb1242443666d05f762e456247461125f4fc50ca7024a5f1ce06"} err="failed to get container status \"35e6db7febbabb1242443666d05f762e456247461125f4fc50ca7024a5f1ce06\": rpc error: code = NotFound desc = could not find container \"35e6db7febbabb1242443666d05f762e456247461125f4fc50ca7024a5f1ce06\": container with ID starting with 35e6db7febbabb1242443666d05f762e456247461125f4fc50ca7024a5f1ce06 not found: ID does not exist" Mar 17 02:43:26 crc kubenswrapper[4755]: I0317 02:43:26.079104 4755 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17133d6e-4462-478a-a0d9-7be42c11c490-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "17133d6e-4462-478a-a0d9-7be42c11c490" (UID: "17133d6e-4462-478a-a0d9-7be42c11c490"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:43:26 crc kubenswrapper[4755]: I0317 02:43:26.157144 4755 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/17133d6e-4462-478a-a0d9-7be42c11c490-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 17 02:43:26 crc kubenswrapper[4755]: I0317 02:43:26.258925 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17133d6e-4462-478a-a0d9-7be42c11c490" path="/var/lib/kubelet/pods/17133d6e-4462-478a-a0d9-7be42c11c490/volumes" Mar 17 02:43:32 crc kubenswrapper[4755]: I0317 02:43:32.838479 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jvvmx" podUID="3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" containerName="registry-server" probeResult="failure" output=< Mar 17 02:43:32 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 17 02:43:32 crc kubenswrapper[4755]: > Mar 17 02:43:41 crc kubenswrapper[4755]: I0317 02:43:41.855827 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jvvmx" Mar 17 02:43:41 crc kubenswrapper[4755]: I0317 02:43:41.932844 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jvvmx" Mar 17 02:43:42 crc kubenswrapper[4755]: I0317 02:43:42.112961 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jvvmx"] Mar 17 02:43:43 crc kubenswrapper[4755]: I0317 02:43:43.174029 4755 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-operators-jvvmx" podUID="3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" containerName="registry-server" containerID="cri-o://0fe82d2dcab3ae61d67f313470060c57c0e136111f5fcc32e125a08448b77617" gracePeriod=2 Mar 17 02:43:43 crc kubenswrapper[4755]: I0317 02:43:43.735373 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jvvmx" Mar 17 02:43:43 crc kubenswrapper[4755]: I0317 02:43:43.872948 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-catalog-content\") pod \"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0\" (UID: \"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0\") " Mar 17 02:43:43 crc kubenswrapper[4755]: I0317 02:43:43.873104 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxckw\" (UniqueName: \"kubernetes.io/projected/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-kube-api-access-zxckw\") pod \"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0\" (UID: \"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0\") " Mar 17 02:43:43 crc kubenswrapper[4755]: I0317 02:43:43.873384 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-utilities\") pod \"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0\" (UID: \"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0\") " Mar 17 02:43:43 crc kubenswrapper[4755]: I0317 02:43:43.873993 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-utilities" (OuterVolumeSpecName: "utilities") pod "3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" (UID: "3092f303-ca8a-4ceb-a0a8-c2a616f47fc0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:43:43 crc kubenswrapper[4755]: I0317 02:43:43.874735 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:43:43 crc kubenswrapper[4755]: I0317 02:43:43.881760 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-kube-api-access-zxckw" (OuterVolumeSpecName: "kube-api-access-zxckw") pod "3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" (UID: "3092f303-ca8a-4ceb-a0a8-c2a616f47fc0"). InnerVolumeSpecName "kube-api-access-zxckw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:43:43 crc kubenswrapper[4755]: I0317 02:43:43.976012 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxckw\" (UniqueName: \"kubernetes.io/projected/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-kube-api-access-zxckw\") on node \"crc\" DevicePath \"\"" Mar 17 02:43:43 crc kubenswrapper[4755]: I0317 02:43:43.994041 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" (UID: "3092f303-ca8a-4ceb-a0a8-c2a616f47fc0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:43:44 crc kubenswrapper[4755]: I0317 02:43:44.079315 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:43:44 crc kubenswrapper[4755]: I0317 02:43:44.187357 4755 generic.go:334] "Generic (PLEG): container finished" podID="3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" containerID="0fe82d2dcab3ae61d67f313470060c57c0e136111f5fcc32e125a08448b77617" exitCode=0 Mar 17 02:43:44 crc kubenswrapper[4755]: I0317 02:43:44.187404 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvvmx" event={"ID":"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0","Type":"ContainerDied","Data":"0fe82d2dcab3ae61d67f313470060c57c0e136111f5fcc32e125a08448b77617"} Mar 17 02:43:44 crc kubenswrapper[4755]: I0317 02:43:44.187433 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvvmx" event={"ID":"3092f303-ca8a-4ceb-a0a8-c2a616f47fc0","Type":"ContainerDied","Data":"96dd8ebc4344690124bd4bb302dbcc6f5c71166ea64e0fe68037cfa2ca58c0f5"} Mar 17 02:43:44 crc kubenswrapper[4755]: I0317 02:43:44.187459 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvvmx" Mar 17 02:43:44 crc kubenswrapper[4755]: I0317 02:43:44.187470 4755 scope.go:117] "RemoveContainer" containerID="0fe82d2dcab3ae61d67f313470060c57c0e136111f5fcc32e125a08448b77617" Mar 17 02:43:44 crc kubenswrapper[4755]: I0317 02:43:44.219740 4755 scope.go:117] "RemoveContainer" containerID="1a2f94d86691487e2ced3750fd2ea63325f5988d2457ce7211de91229c87189c" Mar 17 02:43:44 crc kubenswrapper[4755]: I0317 02:43:44.240039 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jvvmx"] Mar 17 02:43:44 crc kubenswrapper[4755]: I0317 02:43:44.244801 4755 scope.go:117] "RemoveContainer" containerID="19632d7808df51b19c5a2e48e782edd0e38cf99721aeaf490b31c8e548a0a647" Mar 17 02:43:44 crc kubenswrapper[4755]: I0317 02:43:44.299330 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jvvmx"] Mar 17 02:43:44 crc kubenswrapper[4755]: I0317 02:43:44.313143 4755 scope.go:117] "RemoveContainer" containerID="0fe82d2dcab3ae61d67f313470060c57c0e136111f5fcc32e125a08448b77617" Mar 17 02:43:44 crc kubenswrapper[4755]: E0317 02:43:44.313690 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fe82d2dcab3ae61d67f313470060c57c0e136111f5fcc32e125a08448b77617\": container with ID starting with 0fe82d2dcab3ae61d67f313470060c57c0e136111f5fcc32e125a08448b77617 not found: ID does not exist" containerID="0fe82d2dcab3ae61d67f313470060c57c0e136111f5fcc32e125a08448b77617" Mar 17 02:43:44 crc kubenswrapper[4755]: I0317 02:43:44.313746 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fe82d2dcab3ae61d67f313470060c57c0e136111f5fcc32e125a08448b77617"} err="failed to get container status \"0fe82d2dcab3ae61d67f313470060c57c0e136111f5fcc32e125a08448b77617\": rpc error: code = NotFound desc = could not find container 
\"0fe82d2dcab3ae61d67f313470060c57c0e136111f5fcc32e125a08448b77617\": container with ID starting with 0fe82d2dcab3ae61d67f313470060c57c0e136111f5fcc32e125a08448b77617 not found: ID does not exist" Mar 17 02:43:44 crc kubenswrapper[4755]: I0317 02:43:44.313778 4755 scope.go:117] "RemoveContainer" containerID="1a2f94d86691487e2ced3750fd2ea63325f5988d2457ce7211de91229c87189c" Mar 17 02:43:44 crc kubenswrapper[4755]: E0317 02:43:44.314150 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a2f94d86691487e2ced3750fd2ea63325f5988d2457ce7211de91229c87189c\": container with ID starting with 1a2f94d86691487e2ced3750fd2ea63325f5988d2457ce7211de91229c87189c not found: ID does not exist" containerID="1a2f94d86691487e2ced3750fd2ea63325f5988d2457ce7211de91229c87189c" Mar 17 02:43:44 crc kubenswrapper[4755]: I0317 02:43:44.314189 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a2f94d86691487e2ced3750fd2ea63325f5988d2457ce7211de91229c87189c"} err="failed to get container status \"1a2f94d86691487e2ced3750fd2ea63325f5988d2457ce7211de91229c87189c\": rpc error: code = NotFound desc = could not find container \"1a2f94d86691487e2ced3750fd2ea63325f5988d2457ce7211de91229c87189c\": container with ID starting with 1a2f94d86691487e2ced3750fd2ea63325f5988d2457ce7211de91229c87189c not found: ID does not exist" Mar 17 02:43:44 crc kubenswrapper[4755]: I0317 02:43:44.314215 4755 scope.go:117] "RemoveContainer" containerID="19632d7808df51b19c5a2e48e782edd0e38cf99721aeaf490b31c8e548a0a647" Mar 17 02:43:44 crc kubenswrapper[4755]: E0317 02:43:44.314485 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19632d7808df51b19c5a2e48e782edd0e38cf99721aeaf490b31c8e548a0a647\": container with ID starting with 19632d7808df51b19c5a2e48e782edd0e38cf99721aeaf490b31c8e548a0a647 not found: ID does not exist" 
containerID="19632d7808df51b19c5a2e48e782edd0e38cf99721aeaf490b31c8e548a0a647" Mar 17 02:43:44 crc kubenswrapper[4755]: I0317 02:43:44.314542 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19632d7808df51b19c5a2e48e782edd0e38cf99721aeaf490b31c8e548a0a647"} err="failed to get container status \"19632d7808df51b19c5a2e48e782edd0e38cf99721aeaf490b31c8e548a0a647\": rpc error: code = NotFound desc = could not find container \"19632d7808df51b19c5a2e48e782edd0e38cf99721aeaf490b31c8e548a0a647\": container with ID starting with 19632d7808df51b19c5a2e48e782edd0e38cf99721aeaf490b31c8e548a0a647 not found: ID does not exist" Mar 17 02:43:46 crc kubenswrapper[4755]: I0317 02:43:46.275476 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" path="/var/lib/kubelet/pods/3092f303-ca8a-4ceb-a0a8-c2a616f47fc0/volumes" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.177849 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561924-sdrw2"] Mar 17 02:44:00 crc kubenswrapper[4755]: E0317 02:44:00.178986 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" containerName="extract-content" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.179002 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" containerName="extract-content" Mar 17 02:44:00 crc kubenswrapper[4755]: E0317 02:44:00.179022 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" containerName="extract-utilities" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.179029 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" containerName="extract-utilities" Mar 17 02:44:00 crc kubenswrapper[4755]: E0317 02:44:00.179045 4755 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e10c1f27-e219-42bd-afba-088e303df08a" containerName="registry-server" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.179054 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e10c1f27-e219-42bd-afba-088e303df08a" containerName="registry-server" Mar 17 02:44:00 crc kubenswrapper[4755]: E0317 02:44:00.179066 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e10c1f27-e219-42bd-afba-088e303df08a" containerName="extract-utilities" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.179073 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e10c1f27-e219-42bd-afba-088e303df08a" containerName="extract-utilities" Mar 17 02:44:00 crc kubenswrapper[4755]: E0317 02:44:00.179090 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" containerName="registry-server" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.179098 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" containerName="registry-server" Mar 17 02:44:00 crc kubenswrapper[4755]: E0317 02:44:00.179116 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17133d6e-4462-478a-a0d9-7be42c11c490" containerName="copy" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.179124 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="17133d6e-4462-478a-a0d9-7be42c11c490" containerName="copy" Mar 17 02:44:00 crc kubenswrapper[4755]: E0317 02:44:00.179139 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17133d6e-4462-478a-a0d9-7be42c11c490" containerName="gather" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.179147 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="17133d6e-4462-478a-a0d9-7be42c11c490" containerName="gather" Mar 17 02:44:00 crc kubenswrapper[4755]: E0317 02:44:00.179171 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e10c1f27-e219-42bd-afba-088e303df08a" containerName="extract-content" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.179179 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e10c1f27-e219-42bd-afba-088e303df08a" containerName="extract-content" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.179485 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3092f303-ca8a-4ceb-a0a8-c2a616f47fc0" containerName="registry-server" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.179509 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e10c1f27-e219-42bd-afba-088e303df08a" containerName="registry-server" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.179532 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="17133d6e-4462-478a-a0d9-7be42c11c490" containerName="copy" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.179551 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="17133d6e-4462-478a-a0d9-7be42c11c490" containerName="gather" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.180504 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561924-sdrw2" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.183085 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.184630 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.184859 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.192054 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g564h\" (UniqueName: \"kubernetes.io/projected/99c2104b-ed1b-4b64-ad98-87fd4d3217f5-kube-api-access-g564h\") pod \"auto-csr-approver-29561924-sdrw2\" (UID: \"99c2104b-ed1b-4b64-ad98-87fd4d3217f5\") " pod="openshift-infra/auto-csr-approver-29561924-sdrw2" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.202451 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561924-sdrw2"] Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.294869 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g564h\" (UniqueName: \"kubernetes.io/projected/99c2104b-ed1b-4b64-ad98-87fd4d3217f5-kube-api-access-g564h\") pod \"auto-csr-approver-29561924-sdrw2\" (UID: \"99c2104b-ed1b-4b64-ad98-87fd4d3217f5\") " pod="openshift-infra/auto-csr-approver-29561924-sdrw2" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.337119 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g564h\" (UniqueName: \"kubernetes.io/projected/99c2104b-ed1b-4b64-ad98-87fd4d3217f5-kube-api-access-g564h\") pod \"auto-csr-approver-29561924-sdrw2\" (UID: \"99c2104b-ed1b-4b64-ad98-87fd4d3217f5\") " 
pod="openshift-infra/auto-csr-approver-29561924-sdrw2" Mar 17 02:44:00 crc kubenswrapper[4755]: I0317 02:44:00.502826 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561924-sdrw2" Mar 17 02:44:01 crc kubenswrapper[4755]: I0317 02:44:01.032870 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 02:44:01 crc kubenswrapper[4755]: I0317 02:44:01.036985 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561924-sdrw2"] Mar 17 02:44:01 crc kubenswrapper[4755]: I0317 02:44:01.396608 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561924-sdrw2" event={"ID":"99c2104b-ed1b-4b64-ad98-87fd4d3217f5","Type":"ContainerStarted","Data":"ddc4f0055aa5ca5a6fd42107e531badc1f288de6113884bda67bf0f1fd5a0c80"} Mar 17 02:44:03 crc kubenswrapper[4755]: I0317 02:44:03.429113 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561924-sdrw2" event={"ID":"99c2104b-ed1b-4b64-ad98-87fd4d3217f5","Type":"ContainerStarted","Data":"77f5e94f73ffd14e3259fadce01c8e3c4eef747a04a8ec2f2fd106d43a874824"} Mar 17 02:44:03 crc kubenswrapper[4755]: I0317 02:44:03.465781 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29561924-sdrw2" podStartSLOduration=2.613428472 podStartE2EDuration="3.465753394s" podCreationTimestamp="2026-03-17 02:44:00 +0000 UTC" firstStartedPulling="2026-03-17 02:44:01.031631309 +0000 UTC m=+8515.791083632" lastFinishedPulling="2026-03-17 02:44:01.883956271 +0000 UTC m=+8516.643408554" observedRunningTime="2026-03-17 02:44:03.443858644 +0000 UTC m=+8518.203310937" watchObservedRunningTime="2026-03-17 02:44:03.465753394 +0000 UTC m=+8518.225205707" Mar 17 02:44:04 crc kubenswrapper[4755]: I0317 02:44:04.445380 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="99c2104b-ed1b-4b64-ad98-87fd4d3217f5" containerID="77f5e94f73ffd14e3259fadce01c8e3c4eef747a04a8ec2f2fd106d43a874824" exitCode=0 Mar 17 02:44:04 crc kubenswrapper[4755]: I0317 02:44:04.445614 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561924-sdrw2" event={"ID":"99c2104b-ed1b-4b64-ad98-87fd4d3217f5","Type":"ContainerDied","Data":"77f5e94f73ffd14e3259fadce01c8e3c4eef747a04a8ec2f2fd106d43a874824"} Mar 17 02:44:05 crc kubenswrapper[4755]: I0317 02:44:05.994724 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561924-sdrw2" Mar 17 02:44:06 crc kubenswrapper[4755]: I0317 02:44:06.141246 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g564h\" (UniqueName: \"kubernetes.io/projected/99c2104b-ed1b-4b64-ad98-87fd4d3217f5-kube-api-access-g564h\") pod \"99c2104b-ed1b-4b64-ad98-87fd4d3217f5\" (UID: \"99c2104b-ed1b-4b64-ad98-87fd4d3217f5\") " Mar 17 02:44:06 crc kubenswrapper[4755]: I0317 02:44:06.147797 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99c2104b-ed1b-4b64-ad98-87fd4d3217f5-kube-api-access-g564h" (OuterVolumeSpecName: "kube-api-access-g564h") pod "99c2104b-ed1b-4b64-ad98-87fd4d3217f5" (UID: "99c2104b-ed1b-4b64-ad98-87fd4d3217f5"). InnerVolumeSpecName "kube-api-access-g564h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:44:06 crc kubenswrapper[4755]: I0317 02:44:06.244019 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g564h\" (UniqueName: \"kubernetes.io/projected/99c2104b-ed1b-4b64-ad98-87fd4d3217f5-kube-api-access-g564h\") on node \"crc\" DevicePath \"\"" Mar 17 02:44:06 crc kubenswrapper[4755]: I0317 02:44:06.549184 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561924-sdrw2" event={"ID":"99c2104b-ed1b-4b64-ad98-87fd4d3217f5","Type":"ContainerDied","Data":"ddc4f0055aa5ca5a6fd42107e531badc1f288de6113884bda67bf0f1fd5a0c80"} Mar 17 02:44:06 crc kubenswrapper[4755]: I0317 02:44:06.549488 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddc4f0055aa5ca5a6fd42107e531badc1f288de6113884bda67bf0f1fd5a0c80" Mar 17 02:44:06 crc kubenswrapper[4755]: I0317 02:44:06.549640 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561924-sdrw2" Mar 17 02:44:06 crc kubenswrapper[4755]: I0317 02:44:06.589585 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561918-c5tpt"] Mar 17 02:44:06 crc kubenswrapper[4755]: I0317 02:44:06.596504 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561918-c5tpt"] Mar 17 02:44:08 crc kubenswrapper[4755]: I0317 02:44:08.270551 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba3a2142-4abd-4077-906f-eaf5b1707618" path="/var/lib/kubelet/pods/ba3a2142-4abd-4077-906f-eaf5b1707618/volumes" Mar 17 02:44:23 crc kubenswrapper[4755]: I0317 02:44:23.273196 4755 scope.go:117] "RemoveContainer" containerID="f8bd95add40326f4123ff28276618a1bcb5868b9303aee04c19d8a373ca006e4" Mar 17 02:44:58 crc kubenswrapper[4755]: I0317 02:44:58.665080 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:44:58 crc kubenswrapper[4755]: I0317 02:44:58.665870 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:45:00 crc kubenswrapper[4755]: I0317 02:45:00.155724 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm"] Mar 17 02:45:00 crc kubenswrapper[4755]: E0317 02:45:00.156808 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c2104b-ed1b-4b64-ad98-87fd4d3217f5" containerName="oc" Mar 17 02:45:00 crc kubenswrapper[4755]: I0317 02:45:00.156840 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c2104b-ed1b-4b64-ad98-87fd4d3217f5" containerName="oc" Mar 17 02:45:00 crc kubenswrapper[4755]: I0317 02:45:00.157269 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="99c2104b-ed1b-4b64-ad98-87fd4d3217f5" containerName="oc" Mar 17 02:45:00 crc kubenswrapper[4755]: I0317 02:45:00.158649 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" Mar 17 02:45:00 crc kubenswrapper[4755]: I0317 02:45:00.161416 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 02:45:00 crc kubenswrapper[4755]: I0317 02:45:00.161632 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 02:45:00 crc kubenswrapper[4755]: I0317 02:45:00.165246 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdk8g\" (UniqueName: \"kubernetes.io/projected/e2acacb0-149d-4243-8b1c-3a74e09604e2-kube-api-access-vdk8g\") pod \"collect-profiles-29561925-xmrnm\" (UID: \"e2acacb0-149d-4243-8b1c-3a74e09604e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" Mar 17 02:45:00 crc kubenswrapper[4755]: I0317 02:45:00.165487 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2acacb0-149d-4243-8b1c-3a74e09604e2-secret-volume\") pod \"collect-profiles-29561925-xmrnm\" (UID: \"e2acacb0-149d-4243-8b1c-3a74e09604e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" Mar 17 02:45:00 crc kubenswrapper[4755]: I0317 02:45:00.165612 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2acacb0-149d-4243-8b1c-3a74e09604e2-config-volume\") pod \"collect-profiles-29561925-xmrnm\" (UID: \"e2acacb0-149d-4243-8b1c-3a74e09604e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" Mar 17 02:45:00 crc kubenswrapper[4755]: I0317 02:45:00.170662 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm"] Mar 17 02:45:00 crc kubenswrapper[4755]: I0317 02:45:00.267502 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdk8g\" (UniqueName: \"kubernetes.io/projected/e2acacb0-149d-4243-8b1c-3a74e09604e2-kube-api-access-vdk8g\") pod \"collect-profiles-29561925-xmrnm\" (UID: \"e2acacb0-149d-4243-8b1c-3a74e09604e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" Mar 17 02:45:00 crc kubenswrapper[4755]: I0317 02:45:00.267630 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2acacb0-149d-4243-8b1c-3a74e09604e2-secret-volume\") pod \"collect-profiles-29561925-xmrnm\" (UID: \"e2acacb0-149d-4243-8b1c-3a74e09604e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" Mar 17 02:45:00 crc kubenswrapper[4755]: I0317 02:45:00.267706 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2acacb0-149d-4243-8b1c-3a74e09604e2-config-volume\") pod \"collect-profiles-29561925-xmrnm\" (UID: \"e2acacb0-149d-4243-8b1c-3a74e09604e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" Mar 17 02:45:00 crc kubenswrapper[4755]: I0317 02:45:00.270066 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2acacb0-149d-4243-8b1c-3a74e09604e2-config-volume\") pod \"collect-profiles-29561925-xmrnm\" (UID: \"e2acacb0-149d-4243-8b1c-3a74e09604e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" Mar 17 02:45:00 crc kubenswrapper[4755]: I0317 02:45:00.276800 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e2acacb0-149d-4243-8b1c-3a74e09604e2-secret-volume\") pod \"collect-profiles-29561925-xmrnm\" (UID: \"e2acacb0-149d-4243-8b1c-3a74e09604e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" Mar 17 02:45:00 crc kubenswrapper[4755]: I0317 02:45:00.286654 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdk8g\" (UniqueName: \"kubernetes.io/projected/e2acacb0-149d-4243-8b1c-3a74e09604e2-kube-api-access-vdk8g\") pod \"collect-profiles-29561925-xmrnm\" (UID: \"e2acacb0-149d-4243-8b1c-3a74e09604e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" Mar 17 02:45:00 crc kubenswrapper[4755]: I0317 02:45:00.487667 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" Mar 17 02:45:01 crc kubenswrapper[4755]: I0317 02:45:01.037333 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm"] Mar 17 02:45:01 crc kubenswrapper[4755]: I0317 02:45:01.323293 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" event={"ID":"e2acacb0-149d-4243-8b1c-3a74e09604e2","Type":"ContainerStarted","Data":"f243d4b2b78fe7db41bb7dd38382f53ade742eeb52231d25ea819459bed36d05"} Mar 17 02:45:01 crc kubenswrapper[4755]: I0317 02:45:01.323744 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" event={"ID":"e2acacb0-149d-4243-8b1c-3a74e09604e2","Type":"ContainerStarted","Data":"5c63d44c2fdaca2d9f0660d604979368311dc245b2e7da936b48857b3015d860"} Mar 17 02:45:01 crc kubenswrapper[4755]: I0317 02:45:01.353806 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" 
podStartSLOduration=1.3537844319999999 podStartE2EDuration="1.353784432s" podCreationTimestamp="2026-03-17 02:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 02:45:01.3410788 +0000 UTC m=+8576.100531163" watchObservedRunningTime="2026-03-17 02:45:01.353784432 +0000 UTC m=+8576.113236725" Mar 17 02:45:02 crc kubenswrapper[4755]: I0317 02:45:02.339373 4755 generic.go:334] "Generic (PLEG): container finished" podID="e2acacb0-149d-4243-8b1c-3a74e09604e2" containerID="f243d4b2b78fe7db41bb7dd38382f53ade742eeb52231d25ea819459bed36d05" exitCode=0 Mar 17 02:45:02 crc kubenswrapper[4755]: I0317 02:45:02.339467 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" event={"ID":"e2acacb0-149d-4243-8b1c-3a74e09604e2","Type":"ContainerDied","Data":"f243d4b2b78fe7db41bb7dd38382f53ade742eeb52231d25ea819459bed36d05"} Mar 17 02:45:03 crc kubenswrapper[4755]: I0317 02:45:03.857867 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" Mar 17 02:45:03 crc kubenswrapper[4755]: I0317 02:45:03.961824 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdk8g\" (UniqueName: \"kubernetes.io/projected/e2acacb0-149d-4243-8b1c-3a74e09604e2-kube-api-access-vdk8g\") pod \"e2acacb0-149d-4243-8b1c-3a74e09604e2\" (UID: \"e2acacb0-149d-4243-8b1c-3a74e09604e2\") " Mar 17 02:45:03 crc kubenswrapper[4755]: I0317 02:45:03.962069 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2acacb0-149d-4243-8b1c-3a74e09604e2-secret-volume\") pod \"e2acacb0-149d-4243-8b1c-3a74e09604e2\" (UID: \"e2acacb0-149d-4243-8b1c-3a74e09604e2\") " Mar 17 02:45:03 crc kubenswrapper[4755]: I0317 02:45:03.962116 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2acacb0-149d-4243-8b1c-3a74e09604e2-config-volume\") pod \"e2acacb0-149d-4243-8b1c-3a74e09604e2\" (UID: \"e2acacb0-149d-4243-8b1c-3a74e09604e2\") " Mar 17 02:45:03 crc kubenswrapper[4755]: I0317 02:45:03.963323 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2acacb0-149d-4243-8b1c-3a74e09604e2-config-volume" (OuterVolumeSpecName: "config-volume") pod "e2acacb0-149d-4243-8b1c-3a74e09604e2" (UID: "e2acacb0-149d-4243-8b1c-3a74e09604e2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 02:45:03 crc kubenswrapper[4755]: I0317 02:45:03.969652 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2acacb0-149d-4243-8b1c-3a74e09604e2-kube-api-access-vdk8g" (OuterVolumeSpecName: "kube-api-access-vdk8g") pod "e2acacb0-149d-4243-8b1c-3a74e09604e2" (UID: "e2acacb0-149d-4243-8b1c-3a74e09604e2"). 
InnerVolumeSpecName "kube-api-access-vdk8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:45:03 crc kubenswrapper[4755]: I0317 02:45:03.969814 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2acacb0-149d-4243-8b1c-3a74e09604e2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e2acacb0-149d-4243-8b1c-3a74e09604e2" (UID: "e2acacb0-149d-4243-8b1c-3a74e09604e2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 02:45:04 crc kubenswrapper[4755]: I0317 02:45:04.066792 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdk8g\" (UniqueName: \"kubernetes.io/projected/e2acacb0-149d-4243-8b1c-3a74e09604e2-kube-api-access-vdk8g\") on node \"crc\" DevicePath \"\"" Mar 17 02:45:04 crc kubenswrapper[4755]: I0317 02:45:04.066888 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2acacb0-149d-4243-8b1c-3a74e09604e2-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 02:45:04 crc kubenswrapper[4755]: I0317 02:45:04.066946 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2acacb0-149d-4243-8b1c-3a74e09604e2-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 02:45:04 crc kubenswrapper[4755]: I0317 02:45:04.374802 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" event={"ID":"e2acacb0-149d-4243-8b1c-3a74e09604e2","Type":"ContainerDied","Data":"5c63d44c2fdaca2d9f0660d604979368311dc245b2e7da936b48857b3015d860"} Mar 17 02:45:04 crc kubenswrapper[4755]: I0317 02:45:04.375278 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c63d44c2fdaca2d9f0660d604979368311dc245b2e7da936b48857b3015d860" Mar 17 02:45:04 crc kubenswrapper[4755]: I0317 02:45:04.374877 4755 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29561925-xmrnm" Mar 17 02:45:04 crc kubenswrapper[4755]: I0317 02:45:04.468206 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l"] Mar 17 02:45:04 crc kubenswrapper[4755]: I0317 02:45:04.483065 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29561880-v4w6l"] Mar 17 02:45:06 crc kubenswrapper[4755]: I0317 02:45:06.326837 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c179a95-a9d8-47f1-8710-e596da586065" path="/var/lib/kubelet/pods/4c179a95-a9d8-47f1-8710-e596da586065/volumes" Mar 17 02:45:23 crc kubenswrapper[4755]: I0317 02:45:23.424582 4755 scope.go:117] "RemoveContainer" containerID="bc8482eb39a4fc35d1e51ee84f6325e003c59e92a0a12c3b1b98eef959b90bf2" Mar 17 02:45:28 crc kubenswrapper[4755]: I0317 02:45:28.665206 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:45:28 crc kubenswrapper[4755]: I0317 02:45:28.666484 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:45:42 crc kubenswrapper[4755]: I0317 02:45:42.388015 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p67d7"] Mar 17 02:45:42 crc kubenswrapper[4755]: E0317 02:45:42.388989 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e2acacb0-149d-4243-8b1c-3a74e09604e2" containerName="collect-profiles" Mar 17 02:45:42 crc kubenswrapper[4755]: I0317 02:45:42.389003 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2acacb0-149d-4243-8b1c-3a74e09604e2" containerName="collect-profiles" Mar 17 02:45:42 crc kubenswrapper[4755]: I0317 02:45:42.389269 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2acacb0-149d-4243-8b1c-3a74e09604e2" containerName="collect-profiles" Mar 17 02:45:42 crc kubenswrapper[4755]: I0317 02:45:42.391217 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p67d7" Mar 17 02:45:42 crc kubenswrapper[4755]: I0317 02:45:42.404849 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p67d7"] Mar 17 02:45:42 crc kubenswrapper[4755]: I0317 02:45:42.483543 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2c7b\" (UniqueName: \"kubernetes.io/projected/65284db5-5e0a-49f3-b8b3-4f138b95633c-kube-api-access-g2c7b\") pod \"certified-operators-p67d7\" (UID: \"65284db5-5e0a-49f3-b8b3-4f138b95633c\") " pod="openshift-marketplace/certified-operators-p67d7" Mar 17 02:45:42 crc kubenswrapper[4755]: I0317 02:45:42.483633 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65284db5-5e0a-49f3-b8b3-4f138b95633c-utilities\") pod \"certified-operators-p67d7\" (UID: \"65284db5-5e0a-49f3-b8b3-4f138b95633c\") " pod="openshift-marketplace/certified-operators-p67d7" Mar 17 02:45:42 crc kubenswrapper[4755]: I0317 02:45:42.483814 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65284db5-5e0a-49f3-b8b3-4f138b95633c-catalog-content\") pod \"certified-operators-p67d7\" (UID: 
\"65284db5-5e0a-49f3-b8b3-4f138b95633c\") " pod="openshift-marketplace/certified-operators-p67d7" Mar 17 02:45:42 crc kubenswrapper[4755]: I0317 02:45:42.586789 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65284db5-5e0a-49f3-b8b3-4f138b95633c-catalog-content\") pod \"certified-operators-p67d7\" (UID: \"65284db5-5e0a-49f3-b8b3-4f138b95633c\") " pod="openshift-marketplace/certified-operators-p67d7" Mar 17 02:45:42 crc kubenswrapper[4755]: I0317 02:45:42.587171 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2c7b\" (UniqueName: \"kubernetes.io/projected/65284db5-5e0a-49f3-b8b3-4f138b95633c-kube-api-access-g2c7b\") pod \"certified-operators-p67d7\" (UID: \"65284db5-5e0a-49f3-b8b3-4f138b95633c\") " pod="openshift-marketplace/certified-operators-p67d7" Mar 17 02:45:42 crc kubenswrapper[4755]: I0317 02:45:42.587238 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65284db5-5e0a-49f3-b8b3-4f138b95633c-utilities\") pod \"certified-operators-p67d7\" (UID: \"65284db5-5e0a-49f3-b8b3-4f138b95633c\") " pod="openshift-marketplace/certified-operators-p67d7" Mar 17 02:45:42 crc kubenswrapper[4755]: I0317 02:45:42.587811 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65284db5-5e0a-49f3-b8b3-4f138b95633c-catalog-content\") pod \"certified-operators-p67d7\" (UID: \"65284db5-5e0a-49f3-b8b3-4f138b95633c\") " pod="openshift-marketplace/certified-operators-p67d7" Mar 17 02:45:42 crc kubenswrapper[4755]: I0317 02:45:42.587829 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65284db5-5e0a-49f3-b8b3-4f138b95633c-utilities\") pod \"certified-operators-p67d7\" (UID: \"65284db5-5e0a-49f3-b8b3-4f138b95633c\") 
" pod="openshift-marketplace/certified-operators-p67d7" Mar 17 02:45:42 crc kubenswrapper[4755]: I0317 02:45:42.614776 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2c7b\" (UniqueName: \"kubernetes.io/projected/65284db5-5e0a-49f3-b8b3-4f138b95633c-kube-api-access-g2c7b\") pod \"certified-operators-p67d7\" (UID: \"65284db5-5e0a-49f3-b8b3-4f138b95633c\") " pod="openshift-marketplace/certified-operators-p67d7" Mar 17 02:45:42 crc kubenswrapper[4755]: I0317 02:45:42.741853 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p67d7" Mar 17 02:45:43 crc kubenswrapper[4755]: I0317 02:45:43.274323 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p67d7"] Mar 17 02:45:43 crc kubenswrapper[4755]: I0317 02:45:43.915293 4755 generic.go:334] "Generic (PLEG): container finished" podID="65284db5-5e0a-49f3-b8b3-4f138b95633c" containerID="4f43b2cf05cf46de717bc8a787e92a5e8cdbc808b40fe43512c60b0150208e0a" exitCode=0 Mar 17 02:45:43 crc kubenswrapper[4755]: I0317 02:45:43.915369 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p67d7" event={"ID":"65284db5-5e0a-49f3-b8b3-4f138b95633c","Type":"ContainerDied","Data":"4f43b2cf05cf46de717bc8a787e92a5e8cdbc808b40fe43512c60b0150208e0a"} Mar 17 02:45:43 crc kubenswrapper[4755]: I0317 02:45:43.915896 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p67d7" event={"ID":"65284db5-5e0a-49f3-b8b3-4f138b95633c","Type":"ContainerStarted","Data":"1b6e8da132d2277b014a8ff93c637b830930af9b20c83755ac09cb37bab108f0"} Mar 17 02:45:44 crc kubenswrapper[4755]: I0317 02:45:44.932234 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p67d7" 
event={"ID":"65284db5-5e0a-49f3-b8b3-4f138b95633c","Type":"ContainerStarted","Data":"865d3c49863b29a1ce8f887370011970eb8e64cddd0d391093e9205a50701d05"} Mar 17 02:45:46 crc kubenswrapper[4755]: I0317 02:45:46.955883 4755 generic.go:334] "Generic (PLEG): container finished" podID="65284db5-5e0a-49f3-b8b3-4f138b95633c" containerID="865d3c49863b29a1ce8f887370011970eb8e64cddd0d391093e9205a50701d05" exitCode=0 Mar 17 02:45:46 crc kubenswrapper[4755]: I0317 02:45:46.956016 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p67d7" event={"ID":"65284db5-5e0a-49f3-b8b3-4f138b95633c","Type":"ContainerDied","Data":"865d3c49863b29a1ce8f887370011970eb8e64cddd0d391093e9205a50701d05"} Mar 17 02:45:47 crc kubenswrapper[4755]: I0317 02:45:47.974266 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p67d7" event={"ID":"65284db5-5e0a-49f3-b8b3-4f138b95633c","Type":"ContainerStarted","Data":"c0cb4ee60b9efd18cef61ddf1f444070136a34710d7cfe62a1777e160c45830a"} Mar 17 02:45:48 crc kubenswrapper[4755]: I0317 02:45:48.000984 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p67d7" podStartSLOduration=2.454746601 podStartE2EDuration="6.000960658s" podCreationTimestamp="2026-03-17 02:45:42 +0000 UTC" firstStartedPulling="2026-03-17 02:45:43.923521191 +0000 UTC m=+8618.682973474" lastFinishedPulling="2026-03-17 02:45:47.469735238 +0000 UTC m=+8622.229187531" observedRunningTime="2026-03-17 02:45:47.997549697 +0000 UTC m=+8622.757002020" watchObservedRunningTime="2026-03-17 02:45:48.000960658 +0000 UTC m=+8622.760412981" Mar 17 02:45:52 crc kubenswrapper[4755]: I0317 02:45:52.742045 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p67d7" Mar 17 02:45:52 crc kubenswrapper[4755]: I0317 02:45:52.742655 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-p67d7" Mar 17 02:45:52 crc kubenswrapper[4755]: I0317 02:45:52.808878 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p67d7" Mar 17 02:45:53 crc kubenswrapper[4755]: I0317 02:45:53.123478 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p67d7" Mar 17 02:45:53 crc kubenswrapper[4755]: I0317 02:45:53.194576 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p67d7"] Mar 17 02:45:55 crc kubenswrapper[4755]: I0317 02:45:55.066597 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p67d7" podUID="65284db5-5e0a-49f3-b8b3-4f138b95633c" containerName="registry-server" containerID="cri-o://c0cb4ee60b9efd18cef61ddf1f444070136a34710d7cfe62a1777e160c45830a" gracePeriod=2 Mar 17 02:45:55 crc kubenswrapper[4755]: I0317 02:45:55.650974 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p67d7" Mar 17 02:45:55 crc kubenswrapper[4755]: I0317 02:45:55.742946 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65284db5-5e0a-49f3-b8b3-4f138b95633c-utilities\") pod \"65284db5-5e0a-49f3-b8b3-4f138b95633c\" (UID: \"65284db5-5e0a-49f3-b8b3-4f138b95633c\") " Mar 17 02:45:55 crc kubenswrapper[4755]: I0317 02:45:55.743355 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2c7b\" (UniqueName: \"kubernetes.io/projected/65284db5-5e0a-49f3-b8b3-4f138b95633c-kube-api-access-g2c7b\") pod \"65284db5-5e0a-49f3-b8b3-4f138b95633c\" (UID: \"65284db5-5e0a-49f3-b8b3-4f138b95633c\") " Mar 17 02:45:55 crc kubenswrapper[4755]: I0317 02:45:55.744358 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65284db5-5e0a-49f3-b8b3-4f138b95633c-catalog-content\") pod \"65284db5-5e0a-49f3-b8b3-4f138b95633c\" (UID: \"65284db5-5e0a-49f3-b8b3-4f138b95633c\") " Mar 17 02:45:55 crc kubenswrapper[4755]: I0317 02:45:55.744681 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65284db5-5e0a-49f3-b8b3-4f138b95633c-utilities" (OuterVolumeSpecName: "utilities") pod "65284db5-5e0a-49f3-b8b3-4f138b95633c" (UID: "65284db5-5e0a-49f3-b8b3-4f138b95633c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:45:55 crc kubenswrapper[4755]: I0317 02:45:55.745214 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65284db5-5e0a-49f3-b8b3-4f138b95633c-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 02:45:55 crc kubenswrapper[4755]: I0317 02:45:55.754019 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65284db5-5e0a-49f3-b8b3-4f138b95633c-kube-api-access-g2c7b" (OuterVolumeSpecName: "kube-api-access-g2c7b") pod "65284db5-5e0a-49f3-b8b3-4f138b95633c" (UID: "65284db5-5e0a-49f3-b8b3-4f138b95633c"). InnerVolumeSpecName "kube-api-access-g2c7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:45:55 crc kubenswrapper[4755]: I0317 02:45:55.817532 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65284db5-5e0a-49f3-b8b3-4f138b95633c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65284db5-5e0a-49f3-b8b3-4f138b95633c" (UID: "65284db5-5e0a-49f3-b8b3-4f138b95633c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 02:45:55 crc kubenswrapper[4755]: I0317 02:45:55.847290 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65284db5-5e0a-49f3-b8b3-4f138b95633c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 02:45:55 crc kubenswrapper[4755]: I0317 02:45:55.847323 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2c7b\" (UniqueName: \"kubernetes.io/projected/65284db5-5e0a-49f3-b8b3-4f138b95633c-kube-api-access-g2c7b\") on node \"crc\" DevicePath \"\"" Mar 17 02:45:56 crc kubenswrapper[4755]: I0317 02:45:56.088198 4755 generic.go:334] "Generic (PLEG): container finished" podID="65284db5-5e0a-49f3-b8b3-4f138b95633c" containerID="c0cb4ee60b9efd18cef61ddf1f444070136a34710d7cfe62a1777e160c45830a" exitCode=0 Mar 17 02:45:56 crc kubenswrapper[4755]: I0317 02:45:56.088277 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p67d7" Mar 17 02:45:56 crc kubenswrapper[4755]: I0317 02:45:56.088315 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p67d7" event={"ID":"65284db5-5e0a-49f3-b8b3-4f138b95633c","Type":"ContainerDied","Data":"c0cb4ee60b9efd18cef61ddf1f444070136a34710d7cfe62a1777e160c45830a"} Mar 17 02:45:56 crc kubenswrapper[4755]: I0317 02:45:56.088942 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p67d7" event={"ID":"65284db5-5e0a-49f3-b8b3-4f138b95633c","Type":"ContainerDied","Data":"1b6e8da132d2277b014a8ff93c637b830930af9b20c83755ac09cb37bab108f0"} Mar 17 02:45:56 crc kubenswrapper[4755]: I0317 02:45:56.088997 4755 scope.go:117] "RemoveContainer" containerID="c0cb4ee60b9efd18cef61ddf1f444070136a34710d7cfe62a1777e160c45830a" Mar 17 02:45:56 crc kubenswrapper[4755]: I0317 02:45:56.124304 4755 scope.go:117] "RemoveContainer" 
containerID="865d3c49863b29a1ce8f887370011970eb8e64cddd0d391093e9205a50701d05" Mar 17 02:45:56 crc kubenswrapper[4755]: I0317 02:45:56.156677 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p67d7"] Mar 17 02:45:56 crc kubenswrapper[4755]: I0317 02:45:56.175560 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p67d7"] Mar 17 02:45:56 crc kubenswrapper[4755]: I0317 02:45:56.181683 4755 scope.go:117] "RemoveContainer" containerID="4f43b2cf05cf46de717bc8a787e92a5e8cdbc808b40fe43512c60b0150208e0a" Mar 17 02:45:56 crc kubenswrapper[4755]: I0317 02:45:56.245898 4755 scope.go:117] "RemoveContainer" containerID="c0cb4ee60b9efd18cef61ddf1f444070136a34710d7cfe62a1777e160c45830a" Mar 17 02:45:56 crc kubenswrapper[4755]: E0317 02:45:56.246504 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0cb4ee60b9efd18cef61ddf1f444070136a34710d7cfe62a1777e160c45830a\": container with ID starting with c0cb4ee60b9efd18cef61ddf1f444070136a34710d7cfe62a1777e160c45830a not found: ID does not exist" containerID="c0cb4ee60b9efd18cef61ddf1f444070136a34710d7cfe62a1777e160c45830a" Mar 17 02:45:56 crc kubenswrapper[4755]: I0317 02:45:56.246566 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0cb4ee60b9efd18cef61ddf1f444070136a34710d7cfe62a1777e160c45830a"} err="failed to get container status \"c0cb4ee60b9efd18cef61ddf1f444070136a34710d7cfe62a1777e160c45830a\": rpc error: code = NotFound desc = could not find container \"c0cb4ee60b9efd18cef61ddf1f444070136a34710d7cfe62a1777e160c45830a\": container with ID starting with c0cb4ee60b9efd18cef61ddf1f444070136a34710d7cfe62a1777e160c45830a not found: ID does not exist" Mar 17 02:45:56 crc kubenswrapper[4755]: I0317 02:45:56.246611 4755 scope.go:117] "RemoveContainer" 
containerID="865d3c49863b29a1ce8f887370011970eb8e64cddd0d391093e9205a50701d05" Mar 17 02:45:56 crc kubenswrapper[4755]: E0317 02:45:56.246947 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865d3c49863b29a1ce8f887370011970eb8e64cddd0d391093e9205a50701d05\": container with ID starting with 865d3c49863b29a1ce8f887370011970eb8e64cddd0d391093e9205a50701d05 not found: ID does not exist" containerID="865d3c49863b29a1ce8f887370011970eb8e64cddd0d391093e9205a50701d05" Mar 17 02:45:56 crc kubenswrapper[4755]: I0317 02:45:56.246998 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865d3c49863b29a1ce8f887370011970eb8e64cddd0d391093e9205a50701d05"} err="failed to get container status \"865d3c49863b29a1ce8f887370011970eb8e64cddd0d391093e9205a50701d05\": rpc error: code = NotFound desc = could not find container \"865d3c49863b29a1ce8f887370011970eb8e64cddd0d391093e9205a50701d05\": container with ID starting with 865d3c49863b29a1ce8f887370011970eb8e64cddd0d391093e9205a50701d05 not found: ID does not exist" Mar 17 02:45:56 crc kubenswrapper[4755]: I0317 02:45:56.247027 4755 scope.go:117] "RemoveContainer" containerID="4f43b2cf05cf46de717bc8a787e92a5e8cdbc808b40fe43512c60b0150208e0a" Mar 17 02:45:56 crc kubenswrapper[4755]: E0317 02:45:56.247511 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f43b2cf05cf46de717bc8a787e92a5e8cdbc808b40fe43512c60b0150208e0a\": container with ID starting with 4f43b2cf05cf46de717bc8a787e92a5e8cdbc808b40fe43512c60b0150208e0a not found: ID does not exist" containerID="4f43b2cf05cf46de717bc8a787e92a5e8cdbc808b40fe43512c60b0150208e0a" Mar 17 02:45:56 crc kubenswrapper[4755]: I0317 02:45:56.247700 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4f43b2cf05cf46de717bc8a787e92a5e8cdbc808b40fe43512c60b0150208e0a"} err="failed to get container status \"4f43b2cf05cf46de717bc8a787e92a5e8cdbc808b40fe43512c60b0150208e0a\": rpc error: code = NotFound desc = could not find container \"4f43b2cf05cf46de717bc8a787e92a5e8cdbc808b40fe43512c60b0150208e0a\": container with ID starting with 4f43b2cf05cf46de717bc8a787e92a5e8cdbc808b40fe43512c60b0150208e0a not found: ID does not exist" Mar 17 02:45:56 crc kubenswrapper[4755]: I0317 02:45:56.268073 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65284db5-5e0a-49f3-b8b3-4f138b95633c" path="/var/lib/kubelet/pods/65284db5-5e0a-49f3-b8b3-4f138b95633c/volumes" Mar 17 02:45:58 crc kubenswrapper[4755]: I0317 02:45:58.665231 4755 patch_prober.go:28] interesting pod/machine-config-daemon-bhh2x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 02:45:58 crc kubenswrapper[4755]: I0317 02:45:58.665640 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 02:45:58 crc kubenswrapper[4755]: I0317 02:45:58.665703 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" Mar 17 02:45:58 crc kubenswrapper[4755]: I0317 02:45:58.666878 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"03cd9e2c8bd05e274a5a543edf9c7fe26cd5d64050000f43dd1cfb6968e42b17"} 
pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 02:45:58 crc kubenswrapper[4755]: I0317 02:45:58.666957 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" podUID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerName="machine-config-daemon" containerID="cri-o://03cd9e2c8bd05e274a5a543edf9c7fe26cd5d64050000f43dd1cfb6968e42b17" gracePeriod=600 Mar 17 02:45:59 crc kubenswrapper[4755]: I0317 02:45:59.146851 4755 generic.go:334] "Generic (PLEG): container finished" podID="2de863ac-0be1-45c8-9e03-56aa0fe9a23d" containerID="03cd9e2c8bd05e274a5a543edf9c7fe26cd5d64050000f43dd1cfb6968e42b17" exitCode=0 Mar 17 02:45:59 crc kubenswrapper[4755]: I0317 02:45:59.146916 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerDied","Data":"03cd9e2c8bd05e274a5a543edf9c7fe26cd5d64050000f43dd1cfb6968e42b17"} Mar 17 02:45:59 crc kubenswrapper[4755]: I0317 02:45:59.147334 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bhh2x" event={"ID":"2de863ac-0be1-45c8-9e03-56aa0fe9a23d","Type":"ContainerStarted","Data":"36ff1c3dcde9aac4da2d97e10ac77aa4da16da3d229cb073f488d6aa6e09258a"} Mar 17 02:45:59 crc kubenswrapper[4755]: I0317 02:45:59.147369 4755 scope.go:117] "RemoveContainer" containerID="de99a264a68cff37473fdd6d417c1c5398c5eb233ef05a57fd073cdd3c55df7d" Mar 17 02:46:00 crc kubenswrapper[4755]: I0317 02:46:00.156809 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29561926-ttwt7"] Mar 17 02:46:00 crc kubenswrapper[4755]: E0317 02:46:00.158194 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="65284db5-5e0a-49f3-b8b3-4f138b95633c" containerName="extract-content" Mar 17 02:46:00 crc kubenswrapper[4755]: I0317 02:46:00.158220 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="65284db5-5e0a-49f3-b8b3-4f138b95633c" containerName="extract-content" Mar 17 02:46:00 crc kubenswrapper[4755]: E0317 02:46:00.158280 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65284db5-5e0a-49f3-b8b3-4f138b95633c" containerName="registry-server" Mar 17 02:46:00 crc kubenswrapper[4755]: I0317 02:46:00.158293 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="65284db5-5e0a-49f3-b8b3-4f138b95633c" containerName="registry-server" Mar 17 02:46:00 crc kubenswrapper[4755]: E0317 02:46:00.158317 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65284db5-5e0a-49f3-b8b3-4f138b95633c" containerName="extract-utilities" Mar 17 02:46:00 crc kubenswrapper[4755]: I0317 02:46:00.158330 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="65284db5-5e0a-49f3-b8b3-4f138b95633c" containerName="extract-utilities" Mar 17 02:46:00 crc kubenswrapper[4755]: I0317 02:46:00.158757 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="65284db5-5e0a-49f3-b8b3-4f138b95633c" containerName="registry-server" Mar 17 02:46:00 crc kubenswrapper[4755]: I0317 02:46:00.159819 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561926-ttwt7" Mar 17 02:46:00 crc kubenswrapper[4755]: I0317 02:46:00.164483 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 02:46:00 crc kubenswrapper[4755]: I0317 02:46:00.165157 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 02:46:00 crc kubenswrapper[4755]: I0317 02:46:00.170050 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pp9vb" Mar 17 02:46:00 crc kubenswrapper[4755]: I0317 02:46:00.181225 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561926-ttwt7"] Mar 17 02:46:00 crc kubenswrapper[4755]: I0317 02:46:00.258708 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmn4q\" (UniqueName: \"kubernetes.io/projected/bf33e85c-0b73-443f-a5b7-36ce4922d62a-kube-api-access-mmn4q\") pod \"auto-csr-approver-29561926-ttwt7\" (UID: \"bf33e85c-0b73-443f-a5b7-36ce4922d62a\") " pod="openshift-infra/auto-csr-approver-29561926-ttwt7" Mar 17 02:46:00 crc kubenswrapper[4755]: I0317 02:46:00.361065 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmn4q\" (UniqueName: \"kubernetes.io/projected/bf33e85c-0b73-443f-a5b7-36ce4922d62a-kube-api-access-mmn4q\") pod \"auto-csr-approver-29561926-ttwt7\" (UID: \"bf33e85c-0b73-443f-a5b7-36ce4922d62a\") " pod="openshift-infra/auto-csr-approver-29561926-ttwt7" Mar 17 02:46:00 crc kubenswrapper[4755]: I0317 02:46:00.387355 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmn4q\" (UniqueName: \"kubernetes.io/projected/bf33e85c-0b73-443f-a5b7-36ce4922d62a-kube-api-access-mmn4q\") pod \"auto-csr-approver-29561926-ttwt7\" (UID: \"bf33e85c-0b73-443f-a5b7-36ce4922d62a\") " 
pod="openshift-infra/auto-csr-approver-29561926-ttwt7" Mar 17 02:46:00 crc kubenswrapper[4755]: I0317 02:46:00.482636 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29561926-ttwt7" Mar 17 02:46:01 crc kubenswrapper[4755]: I0317 02:46:01.047133 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29561926-ttwt7"] Mar 17 02:46:01 crc kubenswrapper[4755]: W0317 02:46:01.061178 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf33e85c_0b73_443f_a5b7_36ce4922d62a.slice/crio-a219a542e90ced1a142da86b6d4ad31cf09dd76302de14af7062e29081f3dab8 WatchSource:0}: Error finding container a219a542e90ced1a142da86b6d4ad31cf09dd76302de14af7062e29081f3dab8: Status 404 returned error can't find the container with id a219a542e90ced1a142da86b6d4ad31cf09dd76302de14af7062e29081f3dab8 Mar 17 02:46:01 crc kubenswrapper[4755]: I0317 02:46:01.178622 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561926-ttwt7" event={"ID":"bf33e85c-0b73-443f-a5b7-36ce4922d62a","Type":"ContainerStarted","Data":"a219a542e90ced1a142da86b6d4ad31cf09dd76302de14af7062e29081f3dab8"} Mar 17 02:46:03 crc kubenswrapper[4755]: I0317 02:46:03.223067 4755 generic.go:334] "Generic (PLEG): container finished" podID="bf33e85c-0b73-443f-a5b7-36ce4922d62a" containerID="8e50752ad887d2e464391760a718e56f2beaf2599303c6e0cafc99ec6effaf6a" exitCode=0 Mar 17 02:46:03 crc kubenswrapper[4755]: I0317 02:46:03.223208 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561926-ttwt7" event={"ID":"bf33e85c-0b73-443f-a5b7-36ce4922d62a","Type":"ContainerDied","Data":"8e50752ad887d2e464391760a718e56f2beaf2599303c6e0cafc99ec6effaf6a"} Mar 17 02:46:04 crc kubenswrapper[4755]: I0317 02:46:04.692895 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561926-ttwt7" Mar 17 02:46:04 crc kubenswrapper[4755]: I0317 02:46:04.773744 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmn4q\" (UniqueName: \"kubernetes.io/projected/bf33e85c-0b73-443f-a5b7-36ce4922d62a-kube-api-access-mmn4q\") pod \"bf33e85c-0b73-443f-a5b7-36ce4922d62a\" (UID: \"bf33e85c-0b73-443f-a5b7-36ce4922d62a\") " Mar 17 02:46:04 crc kubenswrapper[4755]: I0317 02:46:04.784108 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf33e85c-0b73-443f-a5b7-36ce4922d62a-kube-api-access-mmn4q" (OuterVolumeSpecName: "kube-api-access-mmn4q") pod "bf33e85c-0b73-443f-a5b7-36ce4922d62a" (UID: "bf33e85c-0b73-443f-a5b7-36ce4922d62a"). InnerVolumeSpecName "kube-api-access-mmn4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 02:46:04 crc kubenswrapper[4755]: I0317 02:46:04.877405 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmn4q\" (UniqueName: \"kubernetes.io/projected/bf33e85c-0b73-443f-a5b7-36ce4922d62a-kube-api-access-mmn4q\") on node \"crc\" DevicePath \"\"" Mar 17 02:46:05 crc kubenswrapper[4755]: I0317 02:46:05.250112 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29561926-ttwt7" event={"ID":"bf33e85c-0b73-443f-a5b7-36ce4922d62a","Type":"ContainerDied","Data":"a219a542e90ced1a142da86b6d4ad31cf09dd76302de14af7062e29081f3dab8"} Mar 17 02:46:05 crc kubenswrapper[4755]: I0317 02:46:05.250454 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a219a542e90ced1a142da86b6d4ad31cf09dd76302de14af7062e29081f3dab8" Mar 17 02:46:05 crc kubenswrapper[4755]: I0317 02:46:05.250202 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29561926-ttwt7" Mar 17 02:46:05 crc kubenswrapper[4755]: I0317 02:46:05.798244 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29561920-r2bcw"] Mar 17 02:46:05 crc kubenswrapper[4755]: I0317 02:46:05.815084 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29561920-r2bcw"] Mar 17 02:46:06 crc kubenswrapper[4755]: I0317 02:46:06.271949 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c74e61c4-d0a5-4394-a8c7-c2f68174056f" path="/var/lib/kubelet/pods/c74e61c4-d0a5-4394-a8c7-c2f68174056f/volumes" Mar 17 02:46:23 crc kubenswrapper[4755]: I0317 02:46:23.525857 4755 scope.go:117] "RemoveContainer" containerID="7f47ea6e82d9e727aafd24f95611fb487215f3bcdfef25135b8cee21509b55aa"